Oct 02 10:51:23 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 02 10:51:23 crc restorecon[4734]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:23 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 
10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 02 10:51:24 crc 
restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 
10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 02 10:51:24 crc restorecon[4734]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 
10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc 
restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:51:24 crc restorecon[4734]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 02 10:51:24 crc restorecon[4734]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 02 10:51:25 crc kubenswrapper[4766]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 10:51:25 crc kubenswrapper[4766]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 02 10:51:25 crc kubenswrapper[4766]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 10:51:25 crc kubenswrapper[4766]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
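
The long restorecon run above ends here. Each of those records reports a file that restorecon skipped ("not reset") because its current label counts as an admin customization: container_file_t is, as I understand it, a customizable SELinux type, so restorecon leaves existing labels in place, including the per-pod MCS category pairs such as s0:c7,c13, unless forced with restorecon -F. A minimal Python sketch for condensing this flood, assuming only the journal layout visible above (the regex is my guess at that layout, not part of any tool), groups the skipped paths by pod UID and target context:

import re
import sys
from collections import defaultdict

# Matches one restorecon "skipped" record as it appears in this dump.
RECORD_RE = re.compile(
    r"restorecon\[\d+\]: (/var/lib/kubelet/pods/([0-9a-f-]+)/\S+) "
    r"not reset as customized by admin to (\S+)"
)

def summarize(journal_text: str):
    """Count skipped paths per (pod UID, target SELinux context) pair."""
    counts = defaultdict(int)
    # Records wrap across lines in this dump, so normalize whitespace first.
    normalized = " ".join(journal_text.split())
    for _path, pod_uid, context in RECORD_RE.findall(normalized):
        counts[(pod_uid, context)] += 1
    return counts

if __name__ == "__main__":
    for (pod_uid, context), n in sorted(summarize(sys.stdin.read()).items()):
        print(f"{n:5d}  pod {pod_uid}  ->  {context}")

Fed this journal text on stdin, it prints one line per (pod, context) pair instead of thousands of near-identical records.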
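The kubenswrapper (kubelet) warnings around this point flag deprecated command-line flags and point at the file passed via --config instead. A hedged sketch of that migration follows, assuming the kubelet.config.k8s.io/v1beta1 field names as I recall them (containerRuntimeEndpoint, volumePluginDir, systemReserved, registerWithTaints; verify against your kubelet version) and using purely illustrative values, not ones taken from this node:

import json

# Assumed mapping from the deprecated flags warned about above to their
# KubeletConfiguration fields; the example values are hypothetical.
FLAG_TO_FIELD = {
    "--container-runtime-endpoint": (
        "containerRuntimeEndpoint", "unix:///var/run/crio/crio.sock"),
    "--volume-plugin-dir": (
        "volumePluginDir", "/etc/kubernetes/kubelet-plugins/volume/exec"),
    "--system-reserved": (
        "systemReserved", {"cpu": "500m", "memory": "1Gi"}),
    "--register-with-taints": (
        "registerWithTaints",
        [{"key": "node-role.kubernetes.io/master", "effect": "NoSchedule"}]),
}

def config_snippet(flags):
    """Render a KubeletConfiguration fragment covering the given flags.

    Emits JSON because every JSON document is also valid YAML, which
    keeps this sketch dependency-free.
    """
    cfg = {"apiVersion": "kubelet.config.k8s.io/v1beta1",
           "kind": "KubeletConfiguration"}
    for flag in flags:
        if flag in FLAG_TO_FIELD:
            field, example = FLAG_TO_FIELD[flag]
            cfg[field] = example
    return json.dumps(cfg, indent=2)

print(config_snippet(["--container-runtime-endpoint", "--system-reserved"]))

Two of the warned-about flags have no entry in the sketch on purpose: per the log's own wording, --pod-infra-container-image is going away because the sandbox image now comes from CRI, and --minimum-container-ttl-duration is superseded by the eviction settings (evictionHard/evictionSoft).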
Oct 02 10:51:25 crc kubenswrapper[4766]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 02 10:51:25 crc kubenswrapper[4766]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.555345 4766 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561844 4766 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561879 4766 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561884 4766 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561890 4766 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561893 4766 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561898 4766 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561904 4766 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561908 4766 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561912 4766 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561916 4766 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561921 4766 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561927 4766 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561933 4766 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561939 4766 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561945 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561950 4766 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561957 4766 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561970 4766 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561979 4766 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561983 4766 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561988 4766 feature_gate.go:330] unrecognized 
feature gate: MachineAPIMigration Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561992 4766 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.561997 4766 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562001 4766 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562005 4766 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562009 4766 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562013 4766 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562018 4766 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562024 4766 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562033 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562037 4766 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562041 4766 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562045 4766 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562049 4766 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562054 4766 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562058 4766 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562062 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562066 4766 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562071 4766 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562075 4766 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562080 4766 feature_gate.go:330] unrecognized feature gate: Example Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562084 4766 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.562089 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563058 4766 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563067 4766 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563072 4766 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 10:51:25 crc 
kubenswrapper[4766]: W1002 10:51:25.563078 4766 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563083 4766 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563088 4766 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563092 4766 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563097 4766 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563101 4766 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563107 4766 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563114 4766 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563119 4766 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563124 4766 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563130 4766 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563136 4766 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563142 4766 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563147 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563152 4766 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563157 4766 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563162 4766 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563167 4766 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563172 4766 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
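The long runs of "unrecognized feature gate" warnings here and below are gate names from OpenShift's cluster-level FeatureGate API (AdminNetworkPolicy, GatewayAPI, InsightsConfig, and so on) that this kubelet build does not register; only the upstream Kubernetes gates survive into the resolved "feature gates: {map[...]}" lines printed later. For gates the kubelet does recognize, the config-file form is a featureGates map; a sketch using two entries that appear in this log's resolved map:

```yaml
# Sketch: kubelet feature gates set via KubeletConfiguration rather than
# --feature-gates. KMSv1 and ValidatingAdmissionPolicy are taken from the
# resolved gate map logged below; gates unknown to this kubelet build are
# what trigger the feature_gate.go:330 warnings above.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
featureGates:
  KMSv1: true                      # logged as deprecated, still honored
  ValidatingAdmissionPolicy: true  # logged as GA, flag to be removed
```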
Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563177 4766 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563182 4766 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563187 4766 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563192 4766 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563196 4766 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.563201 4766 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565111 4766 flags.go:64] FLAG: --address="0.0.0.0" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565132 4766 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565142 4766 flags.go:64] FLAG: --anonymous-auth="true" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565148 4766 flags.go:64] FLAG: --application-metrics-count-limit="100" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565154 4766 flags.go:64] FLAG: --authentication-token-webhook="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565160 4766 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565168 4766 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565175 4766 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565180 4766 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565185 4766 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565190 4766 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565196 4766 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565201 4766 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565207 4766 flags.go:64] FLAG: --cgroup-root="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565211 4766 flags.go:64] FLAG: --cgroups-per-qos="true" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565216 4766 flags.go:64] FLAG: --client-ca-file="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565220 4766 flags.go:64] FLAG: --cloud-config="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565224 4766 flags.go:64] FLAG: --cloud-provider="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565228 4766 flags.go:64] FLAG: --cluster-dns="[]" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565233 4766 flags.go:64] FLAG: --cluster-domain="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565237 4766 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565242 4766 flags.go:64] FLAG: --config-dir="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565247 4766 flags.go:64] FLAG: 
--container-hints="/etc/cadvisor/container_hints.json" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565252 4766 flags.go:64] FLAG: --container-log-max-files="5" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565258 4766 flags.go:64] FLAG: --container-log-max-size="10Mi" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565264 4766 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565269 4766 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565274 4766 flags.go:64] FLAG: --containerd-namespace="k8s.io" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565278 4766 flags.go:64] FLAG: --contention-profiling="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565283 4766 flags.go:64] FLAG: --cpu-cfs-quota="true" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565288 4766 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565293 4766 flags.go:64] FLAG: --cpu-manager-policy="none" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565297 4766 flags.go:64] FLAG: --cpu-manager-policy-options="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565308 4766 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565313 4766 flags.go:64] FLAG: --enable-controller-attach-detach="true" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565317 4766 flags.go:64] FLAG: --enable-debugging-handlers="true" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565322 4766 flags.go:64] FLAG: --enable-load-reader="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565327 4766 flags.go:64] FLAG: --enable-server="true" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565331 4766 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565339 4766 flags.go:64] FLAG: --event-burst="100" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565344 4766 flags.go:64] FLAG: --event-qps="50" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565348 4766 flags.go:64] FLAG: --event-storage-age-limit="default=0" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565353 4766 flags.go:64] FLAG: --event-storage-event-limit="default=0" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565358 4766 flags.go:64] FLAG: --eviction-hard="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565363 4766 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565368 4766 flags.go:64] FLAG: --eviction-minimum-reclaim="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565372 4766 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565377 4766 flags.go:64] FLAG: --eviction-soft="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565381 4766 flags.go:64] FLAG: --eviction-soft-grace-period="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565386 4766 flags.go:64] FLAG: --exit-on-lock-contention="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565390 4766 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565394 4766 flags.go:64] FLAG: --experimental-mounter-path="" Oct 02 10:51:25 crc 
kubenswrapper[4766]: I1002 10:51:25.565398 4766 flags.go:64] FLAG: --fail-cgroupv1="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565404 4766 flags.go:64] FLAG: --fail-swap-on="true" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565409 4766 flags.go:64] FLAG: --feature-gates="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565415 4766 flags.go:64] FLAG: --file-check-frequency="20s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565420 4766 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565426 4766 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565432 4766 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565438 4766 flags.go:64] FLAG: --healthz-port="10248" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565443 4766 flags.go:64] FLAG: --help="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565449 4766 flags.go:64] FLAG: --hostname-override="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565454 4766 flags.go:64] FLAG: --housekeeping-interval="10s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565459 4766 flags.go:64] FLAG: --http-check-frequency="20s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565464 4766 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565469 4766 flags.go:64] FLAG: --image-credential-provider-config="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565473 4766 flags.go:64] FLAG: --image-gc-high-threshold="85" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565478 4766 flags.go:64] FLAG: --image-gc-low-threshold="80" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565482 4766 flags.go:64] FLAG: --image-service-endpoint="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565486 4766 flags.go:64] FLAG: --kernel-memcg-notification="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565490 4766 flags.go:64] FLAG: --kube-api-burst="100" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565495 4766 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565503 4766 flags.go:64] FLAG: --kube-api-qps="50" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565521 4766 flags.go:64] FLAG: --kube-reserved="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565526 4766 flags.go:64] FLAG: --kube-reserved-cgroup="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565531 4766 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565535 4766 flags.go:64] FLAG: --kubelet-cgroups="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565539 4766 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565543 4766 flags.go:64] FLAG: --lock-file="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565548 4766 flags.go:64] FLAG: --log-cadvisor-usage="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565553 4766 flags.go:64] FLAG: --log-flush-frequency="5s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565558 4766 flags.go:64] FLAG: --log-json-info-buffer-size="0" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565564 4766 flags.go:64] 
FLAG: --log-json-split-stream="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565570 4766 flags.go:64] FLAG: --log-text-info-buffer-size="0" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565574 4766 flags.go:64] FLAG: --log-text-split-stream="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565578 4766 flags.go:64] FLAG: --logging-format="text" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565582 4766 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565587 4766 flags.go:64] FLAG: --make-iptables-util-chains="true" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565591 4766 flags.go:64] FLAG: --manifest-url="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565595 4766 flags.go:64] FLAG: --manifest-url-header="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565601 4766 flags.go:64] FLAG: --max-housekeeping-interval="15s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565606 4766 flags.go:64] FLAG: --max-open-files="1000000" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565611 4766 flags.go:64] FLAG: --max-pods="110" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565615 4766 flags.go:64] FLAG: --maximum-dead-containers="-1" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565620 4766 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565625 4766 flags.go:64] FLAG: --memory-manager-policy="None" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565629 4766 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565633 4766 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565637 4766 flags.go:64] FLAG: --node-ip="192.168.126.11" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565642 4766 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565653 4766 flags.go:64] FLAG: --node-status-max-images="50" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565658 4766 flags.go:64] FLAG: --node-status-update-frequency="10s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565662 4766 flags.go:64] FLAG: --oom-score-adj="-999" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565668 4766 flags.go:64] FLAG: --pod-cidr="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565673 4766 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565681 4766 flags.go:64] FLAG: --pod-manifest-path="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565686 4766 flags.go:64] FLAG: --pod-max-pids="-1" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565690 4766 flags.go:64] FLAG: --pods-per-core="0" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565695 4766 flags.go:64] FLAG: --port="10250" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565700 4766 flags.go:64] FLAG: --protect-kernel-defaults="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565705 4766 flags.go:64] FLAG: --provider-id="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565709 4766 
flags.go:64] FLAG: --qos-reserved="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565714 4766 flags.go:64] FLAG: --read-only-port="10255" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565718 4766 flags.go:64] FLAG: --register-node="true" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565722 4766 flags.go:64] FLAG: --register-schedulable="true" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565727 4766 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565735 4766 flags.go:64] FLAG: --registry-burst="10" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565739 4766 flags.go:64] FLAG: --registry-qps="5" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565744 4766 flags.go:64] FLAG: --reserved-cpus="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565749 4766 flags.go:64] FLAG: --reserved-memory="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565755 4766 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565759 4766 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565765 4766 flags.go:64] FLAG: --rotate-certificates="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565770 4766 flags.go:64] FLAG: --rotate-server-certificates="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565774 4766 flags.go:64] FLAG: --runonce="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565778 4766 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565783 4766 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565787 4766 flags.go:64] FLAG: --seccomp-default="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565791 4766 flags.go:64] FLAG: --serialize-image-pulls="true" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565795 4766 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565799 4766 flags.go:64] FLAG: --storage-driver-db="cadvisor" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565804 4766 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565808 4766 flags.go:64] FLAG: --storage-driver-password="root" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565812 4766 flags.go:64] FLAG: --storage-driver-secure="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565816 4766 flags.go:64] FLAG: --storage-driver-table="stats" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565820 4766 flags.go:64] FLAG: --storage-driver-user="root" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565825 4766 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565830 4766 flags.go:64] FLAG: --sync-frequency="1m0s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565834 4766 flags.go:64] FLAG: --system-cgroups="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565839 4766 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565847 4766 flags.go:64] FLAG: --system-reserved-cgroup="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565851 
4766 flags.go:64] FLAG: --tls-cert-file="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565856 4766 flags.go:64] FLAG: --tls-cipher-suites="[]" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565861 4766 flags.go:64] FLAG: --tls-min-version="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565865 4766 flags.go:64] FLAG: --tls-private-key-file="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565870 4766 flags.go:64] FLAG: --topology-manager-policy="none" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565874 4766 flags.go:64] FLAG: --topology-manager-policy-options="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565878 4766 flags.go:64] FLAG: --topology-manager-scope="container" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565883 4766 flags.go:64] FLAG: --v="2" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565889 4766 flags.go:64] FLAG: --version="false" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565895 4766 flags.go:64] FLAG: --vmodule="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565907 4766 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.565912 4766 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566022 4766 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566027 4766 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566037 4766 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566041 4766 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566046 4766 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566050 4766 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566053 4766 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566057 4766 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566061 4766 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566064 4766 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566068 4766 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566072 4766 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566075 4766 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566079 4766 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566082 4766 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566087 4766 feature_gate.go:330] unrecognized feature gate: 
VSphereMultiNetworks Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566091 4766 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566095 4766 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566098 4766 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566102 4766 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566105 4766 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566109 4766 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566112 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566117 4766 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566121 4766 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566125 4766 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566129 4766 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566133 4766 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566137 4766 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566140 4766 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566144 4766 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566148 4766 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566151 4766 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566155 4766 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566158 4766 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566162 4766 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566165 4766 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566169 4766 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566174 4766 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566177 4766 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566182 4766 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566186 4766 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566191 4766 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566195 4766 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566199 4766 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566203 4766 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566206 4766 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566211 4766 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566215 4766 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566219 4766 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566223 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566226 4766 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566230 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566234 4766 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566237 4766 feature_gate.go:330] unrecognized feature gate: Example Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566241 4766 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566246 4766 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566250 4766 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566253 4766 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566257 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566260 4766 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566264 4766 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566267 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566271 4766 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566274 4766 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566281 4766 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566285 4766 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566288 4766 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566291 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566296 4766 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.566301 4766 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.567513 4766 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.577311 4766 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.577422 4766 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577538 4766 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577554 4766 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577561 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577567 4766 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577572 4766 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577576 4766 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577581 4766 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577585 4766 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577590 4766 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577595 4766 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577600 4766 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577605 4766 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577609 4766 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577613 4766 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577618 4766 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577622 4766 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577627 4766 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577632 4766 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577637 4766 feature_gate.go:330] unrecognized feature gate: 
PinnedImages Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577641 4766 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577645 4766 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577650 4766 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577654 4766 feature_gate.go:330] unrecognized feature gate: Example Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577659 4766 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577664 4766 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577674 4766 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577680 4766 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577686 4766 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577691 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577697 4766 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577703 4766 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577712 4766 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577718 4766 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577725 4766 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577734 4766 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577741 4766 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577746 4766 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577753 4766 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577758 4766 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577764 4766 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577769 4766 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577774 4766 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577779 4766 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577784 4766 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577789 
4766 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577794 4766 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577799 4766 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577804 4766 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577808 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577814 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577818 4766 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577824 4766 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577830 4766 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577836 4766 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577842 4766 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577847 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577851 4766 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577856 4766 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577860 4766 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577864 4766 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577869 4766 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577873 4766 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577877 4766 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577881 4766 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577886 4766 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577891 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577896 4766 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577902 4766 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577907 4766 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577914 4766 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.577920 4766 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.577928 4766 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578111 4766 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578120 4766 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578125 4766 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578130 4766 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578134 4766 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578138 4766 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578143 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578147 4766 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578151 4766 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578157 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578161 4766 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578165 4766 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578176 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578181 4766 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578186 4766 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578191 4766 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578195 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578199 4766 feature_gate.go:330] 
unrecognized feature gate: ImageStreamImportMode Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578203 4766 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578208 4766 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578213 4766 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578217 4766 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578222 4766 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578227 4766 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578231 4766 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578237 4766 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578243 4766 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578248 4766 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578253 4766 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578258 4766 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578263 4766 feature_gate.go:330] unrecognized feature gate: Example Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578267 4766 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578272 4766 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578278 4766 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578284 4766 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578288 4766 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578293 4766 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578297 4766 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578302 4766 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578306 4766 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578311 4766 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578316 4766 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578321 4766 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578326 4766 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578333 4766 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578338 4766 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578342 4766 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578348 4766 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578354 4766 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578359 4766 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578366 4766 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578371 4766 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578375 4766 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578380 4766 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578385 4766 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578390 4766 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578394 4766 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578399 4766 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578405 4766 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578411 4766 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578415 4766 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578420 4766 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578425 4766 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578430 4766 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578435 4766 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578440 4766 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578445 4766 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578450 4766 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578454 4766 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578459 4766 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.578465 4766 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.578472 4766 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.578795 4766 server.go:940] "Client rotation is on, will bootstrap in background" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.584165 4766 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.584286 4766 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
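"Client rotation is on, will bootstrap in background" together with the earlier --rotate-certificates="false" flag default suggests rotation is enabled by the config file rather than the command line. A hedged sketch of the related KubeletConfiguration knobs; serverTLSBootstrap is illustrative only (this log does not show it), and cgroupDriver mirrors the CRI-reported value in the lines that follow:

```yaml
# Sketch: certificate-rotation and cgroup settings implied by the log.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
rotateCertificates: true   # matches "Starting client certificate rotation"
                           # despite the --rotate-certificates=false default
serverTLSBootstrap: false  # hypothetical: serving-cert bootstrap not shown here
cgroupDriver: systemd      # matches "Using cgroup driver setting received
                           # from the CRI runtime" below
```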
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.585662 4766 server.go:997] "Starting client certificate rotation"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.585696 4766 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.585913 4766 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-26 08:54:07.875987242 +0000 UTC
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.586062 4766 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1318h2m42.289930212s for next certificate rotation
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.615016 4766 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.620556 4766 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.669069 4766 log.go:25] "Validated CRI v1 runtime API"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.730628 4766 log.go:25] "Validated CRI v1 image API"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.733404 4766 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.754432 4766 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-02-10-46-19-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.754479 4766 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.771016 4766 manager.go:217] Machine: {Timestamp:2025-10-02 10:51:25.767879352 +0000 UTC m=+0.710750326 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:d3914833-6e1d-48ec-a496-ffff0864ff9c BootID:0c3177b4-52e1-4f6e-a9c9-0faf43eec636 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:e9:c0:b7 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:e9:c0:b7 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:51:e7:d4 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:8a:7a:30 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:29:13:b0 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:53:94:51 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:f5:67:9d Speed:-1 Mtu:1496} {Name:eth10 MacAddress:66:e9:69:ae:2f:43 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fe:01:a5:1e:ee:47 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.771345 4766 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.771546 4766 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.771899 4766 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.772112 4766 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.772162 4766 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.772423 4766 topology_manager.go:138] "Creating topology manager with none policy"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.772435 4766 container_manager_linux.go:303] "Creating device plugin manager"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.772981 4766 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.773008 4766 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.773610 4766 state_mem.go:36] "Initialized new in-memory state store"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.773788 4766 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.804382 4766 kubelet.go:418] "Attempting to sync node with API server"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.804459 4766 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.804526 4766 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.804548 4766 kubelet.go:324] "Adding apiserver pod source"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.804572 4766 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.812433 4766 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.812978 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused
Oct 02 10:51:25 crc kubenswrapper[4766]: E1002 10:51:25.813111 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError"
Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.812978 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused
Oct 02 10:51:25 crc kubenswrapper[4766]: E1002 10:51:25.813184 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.813288 4766 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.821298 4766 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.823116 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.823148 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.823161 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.823191 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.823206 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.823216 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.823224 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.823238 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.823249 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.823257 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.823276 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.823286 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.824190 4766 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.824719 4766 server.go:1280] "Started kubelet"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.824850 4766 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.825012 4766 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.826067 4766 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.826573 4766 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 02 10:51:25 crc systemd[1]: Started Kubernetes Kubelet.
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.828758 4766 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.828792 4766 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.828838 4766 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 03:10:08.914267789 +0000 UTC
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.828894 4766 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 880h18m43.085377225s for next certificate rotation
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.829042 4766 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.829067 4766 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.829166 4766 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 02 10:51:25 crc kubenswrapper[4766]: E1002 10:51:25.829355 4766 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.829898 4766 factory.go:55] Registering systemd factory
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.829937 4766 factory.go:221] Registration of the systemd container factory successfully
Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.829970 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused
Oct 02 10:51:25 crc kubenswrapper[4766]: E1002 10:51:25.830030 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.830288 4766 factory.go:153] Registering CRI-O factory
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.830364 4766 factory.go:221] Registration of the crio container factory successfully
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.830491 4766 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.830632 4766 factory.go:103] Registering Raw factory
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.830733 4766 manager.go:1196] Started watching for new ooms in manager
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.831789 4766 manager.go:319] Starting recovery of all containers
Oct 02 10:51:25 crc kubenswrapper[4766]: E1002 10:51:25.832987 4766 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="200ms"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.837265 4766 server.go:460] "Adding debug handlers to kubelet server"
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.841575 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842344 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842371 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842383 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842393 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842403 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842416 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842426 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842438 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842450 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842460 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842470 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842480 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842491 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842504 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842530 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842540 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842575 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842609 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842624 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842636 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842803 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842823 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842886 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842900 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842911 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842926 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842940 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842951 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842960 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842968 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842978 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842988 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.842996 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843006 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843015 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843025 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843033 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843042 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843055 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843066 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843075 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843086 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843095 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843105 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843116 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843126 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843140 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843153 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843164 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843175 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843266 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843280 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843291 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843306 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843316 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843327 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843338 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843348 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843358 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843368 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843377 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843386 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843396 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843405 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843430 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843439 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843448 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843458 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843468 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843478 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843487 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843496 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843507 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843532 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843541 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843554 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843566 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843580 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843592 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843603 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843611 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843621 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843629 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843638 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843647 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843657 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843666 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843676 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843686 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843695 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843707 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843718 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843728 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843738 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843749 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843758 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843769 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843779 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843788 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843802 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843813 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843823 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843834 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843848 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843860 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843870 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843882 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843904 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843916 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843927 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843941 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843953 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843977 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.843994 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.844010 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.844021 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.844036 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.844049 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.844060 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.844070 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.844108 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.844119 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.844135 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.844236 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.844249 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.844258 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847615 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847666 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847684 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847701 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847716 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847732 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847751 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847765 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847804 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847819 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847832 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847845 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847858 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847869 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847883 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847899 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847915 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847927 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847940 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847953 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847966 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847980 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: E1002 10:51:25.841282 4766 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.200:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186aa70ea79742aa default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-02 10:51:25.82468881 +0000 UTC m=+0.767559754,LastTimestamp:2025-10-02 10:51:25.82468881 +0000 UTC m=+0.767559754,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.847994 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848007 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848021 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848035 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848053 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848065 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848078 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848091 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848103 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848117 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848135 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848150 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848163 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848176 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848188 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848200 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848217 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848231 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848243 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848257 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848269 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848281 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848293 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848305 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848324 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848338 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848353 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848366 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848378 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848391 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848403 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848417 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848430 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848444 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.848459 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.851635 4766 manager.go:324] Recovery completed Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852182 4766 
reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852223 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852240 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852256 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852271 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852285 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852297 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852315 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852328 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852341 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852354 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852366 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852380 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852396 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852411 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852424 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852437 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852449 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852462 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852477 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852526 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852541 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852554 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852572 4766 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852610 4766 reconstruct.go:97] "Volume reconstruction finished" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.852620 4766 reconciler.go:26] "Reconciler: start to sync state" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.864576 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.866267 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.866321 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.866336 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.867572 4766 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.867598 4766 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.867625 4766 state_mem.go:36] "Initialized new in-memory state store" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.878255 4766 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.879922 4766 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.879963 4766 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.880016 4766 kubelet.go:2335] "Starting kubelet main sync loop" Oct 02 10:51:25 crc kubenswrapper[4766]: E1002 10:51:25.880192 4766 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 02 10:51:25 crc kubenswrapper[4766]: W1002 10:51:25.880915 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Oct 02 10:51:25 crc kubenswrapper[4766]: E1002 10:51:25.881012 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.895656 4766 policy_none.go:49] "None policy: Start" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.896876 4766 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.896913 4766 state_mem.go:35] "Initializing new in-memory state store" Oct 02 10:51:25 crc kubenswrapper[4766]: E1002 10:51:25.929873 4766 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.962543 4766 manager.go:334] "Starting Device Plugin manager" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.962604 4766 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.962619 4766 server.go:79] "Starting device plugin registration server" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.963039 4766 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.963060 4766 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.964072 4766 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.964238 4766 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.964249 4766 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 02 10:51:25 crc kubenswrapper[4766]: E1002 10:51:25.970482 4766 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.980779 4766 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 02 10:51:25 crc kubenswrapper[4766]: 
I1002 10:51:25.980952 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.982083 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.982121 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.982135 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.982297 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.982542 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.982574 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.983213 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.983241 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.983251 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.983351 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.983376 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.983387 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.983536 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.983608 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.983650 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.984196 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.984249 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.984261 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.984335 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.984352 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.984357 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.984362 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.984478 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.984528 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.985347 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.985370 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.985371 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.985379 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.985390 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.985402 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.985500 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.985647 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.985673 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.986491 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.986526 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.986536 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.986534 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.986558 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.986567 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.986677 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.986698 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.987337 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.987366 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:25 crc kubenswrapper[4766]: I1002 10:51:25.987377 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:26 crc kubenswrapper[4766]: E1002 10:51:26.033604 4766 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="400ms" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.054949 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.055125 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.055214 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.055306 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.055401 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.055473 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.055587 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.055696 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.055805 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.055877 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.055944 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.056017 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:51:26 crc 
kubenswrapper[4766]: I1002 10:51:26.056084 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.056161 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.056226 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.063856 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.065078 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.065203 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.065270 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.065380 4766 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 10:51:26 crc kubenswrapper[4766]: E1002 10:51:26.065914 4766 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.200:6443: connect: connection refused" node="crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.157329 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.157562 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.157465 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.157672 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.157680 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.157698 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.157735 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.157777 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.157864 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.157802 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.157910 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.157904 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.157931 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.157989 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.158004 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.158022 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.158036 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.158002 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.158112 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.158140 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.158162 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.158172 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.158204 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.158232 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.158283 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.158316 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.158390 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.158406 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.158346 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.158573 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.266295 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.268425 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.268548 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.268568 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.268616 4766 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 10:51:26 crc kubenswrapper[4766]: E1002 10:51:26.269373 4766 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.200:6443: connect: connection 
refused" node="crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.315285 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.329867 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.347201 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.375843 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.383851 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:51:26 crc kubenswrapper[4766]: W1002 10:51:26.404168 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-88e8cb6619e9e1e5ef0258e62f1b0b396bbeffd40ca52efb2b43cb73b2a8d5d0 WatchSource:0}: Error finding container 88e8cb6619e9e1e5ef0258e62f1b0b396bbeffd40ca52efb2b43cb73b2a8d5d0: Status 404 returned error can't find the container with id 88e8cb6619e9e1e5ef0258e62f1b0b396bbeffd40ca52efb2b43cb73b2a8d5d0 Oct 02 10:51:26 crc kubenswrapper[4766]: W1002 10:51:26.405084 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-dd2a28a88ddb44c27f56b0f66831bb4c34fc16bb341cc8a4a85e30f411d8af11 WatchSource:0}: Error finding container dd2a28a88ddb44c27f56b0f66831bb4c34fc16bb341cc8a4a85e30f411d8af11: Status 404 returned error can't find the container with id dd2a28a88ddb44c27f56b0f66831bb4c34fc16bb341cc8a4a85e30f411d8af11 Oct 02 10:51:26 crc kubenswrapper[4766]: W1002 10:51:26.415752 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-36b3274dfc713b1e713bbc7a703506c18a72ceb95f5f7a2174e6d08d1d85b7ed WatchSource:0}: Error finding container 36b3274dfc713b1e713bbc7a703506c18a72ceb95f5f7a2174e6d08d1d85b7ed: Status 404 returned error can't find the container with id 36b3274dfc713b1e713bbc7a703506c18a72ceb95f5f7a2174e6d08d1d85b7ed Oct 02 10:51:26 crc kubenswrapper[4766]: E1002 10:51:26.435685 4766 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="800ms" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.670384 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.671859 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.671915 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.671924 4766 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.671951 4766 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 10:51:26 crc kubenswrapper[4766]: E1002 10:51:26.672330 4766 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.200:6443: connect: connection refused" node="crc" Oct 02 10:51:26 crc kubenswrapper[4766]: W1002 10:51:26.694018 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Oct 02 10:51:26 crc kubenswrapper[4766]: E1002 10:51:26.694103 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:51:26 crc kubenswrapper[4766]: W1002 10:51:26.801933 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Oct 02 10:51:26 crc kubenswrapper[4766]: E1002 10:51:26.802052 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.826369 4766 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Oct 02 10:51:26 crc kubenswrapper[4766]: W1002 10:51:26.829832 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Oct 02 10:51:26 crc kubenswrapper[4766]: E1002 10:51:26.829912 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:51:26 crc kubenswrapper[4766]: W1002 10:51:26.884202 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Oct 02 10:51:26 crc kubenswrapper[4766]: E1002 10:51:26.884302 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.887741 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dd2a28a88ddb44c27f56b0f66831bb4c34fc16bb341cc8a4a85e30f411d8af11"} Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.888830 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"88e8cb6619e9e1e5ef0258e62f1b0b396bbeffd40ca52efb2b43cb73b2a8d5d0"} Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.889858 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"36b3274dfc713b1e713bbc7a703506c18a72ceb95f5f7a2174e6d08d1d85b7ed"} Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.891212 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6ef620e7551052fbc286e67003288858dbc4e64db86207b79e340e023ab9bd71"} Oct 02 10:51:26 crc kubenswrapper[4766]: I1002 10:51:26.892412 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f66c4e4408ccb6693dc38bd3ef7ba592649a00e9b7a29d8d5f7a3e8f0bb78e8a"} Oct 02 10:51:27 crc kubenswrapper[4766]: E1002 10:51:27.236715 4766 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="1.6s" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.473373 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.475109 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.475141 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.475151 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.475172 4766 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 10:51:27 crc kubenswrapper[4766]: E1002 10:51:27.475632 4766 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.200:6443: connect: connection refused" node="crc" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.826058 4766 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 
10:51:27.939562 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a"} Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.939597 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.939615 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a"} Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.939627 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab"} Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.939636 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58"} Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.940520 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.940578 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.940597 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.941275 4766 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97" exitCode=0 Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.941348 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97"} Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.941375 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.942318 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.942350 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.942361 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.942872 4766 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="87cf73671615c0ee67ed0a4e457f8c298516c3abead405cbef99b7925552b974" exitCode=0 Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.942904 4766 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"87cf73671615c0ee67ed0a4e457f8c298516c3abead405cbef99b7925552b974"} Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.942978 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.944580 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.944844 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.944873 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.944886 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.945736 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.945752 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.945760 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.945911 4766 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="cb21316d8de77cacc4f5ec3d1ea00e5d572853ccd2796b7deaed6cb4e8a450f3" exitCode=0 Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.945975 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"cb21316d8de77cacc4f5ec3d1ea00e5d572853ccd2796b7deaed6cb4e8a450f3"} Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.946090 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.947363 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.947392 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.947401 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.947998 4766 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840" exitCode=0 Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.948026 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840"} Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.948111 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.949235 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.949262 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:27 crc kubenswrapper[4766]: I1002 10:51:27.949273 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:28 crc kubenswrapper[4766]: W1002 10:51:28.474479 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Oct 02 10:51:28 crc kubenswrapper[4766]: E1002 10:51:28.474628 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:51:28 crc kubenswrapper[4766]: W1002 10:51:28.573908 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Oct 02 10:51:28 crc kubenswrapper[4766]: E1002 10:51:28.574006 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.826119 4766 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Oct 02 10:51:28 crc kubenswrapper[4766]: E1002 10:51:28.838658 4766 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.200:6443: connect: connection refused" interval="3.2s" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.956032 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f754efe5b2b5464759b533ec933dc5834a1be242592ca9375184f5fc24a72f29"} Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.956140 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ae830f784230e44d237e8f6a6606c9969e54046b8010f6dcd0ca446ea264676f"} Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.956155 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"79274b877cf8ff3109da1dc078ee5f0b19a541fb7026b110c1969e0c2d341cb6"} Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.956062 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.957022 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.957101 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.957126 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.959044 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d"} Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.959156 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd"} Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.959184 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2"} Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.961415 4766 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="371309b1c4a030c7bd0c872fa5b6afe2aac6f7a2a1bf379c5b11533e142aa72a" exitCode=0 Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.961524 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"371309b1c4a030c7bd0c872fa5b6afe2aac6f7a2a1bf379c5b11533e142aa72a"} Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.961576 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.962667 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.962721 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.962813 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.963600 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2863522cb0c674d69bc44013b1f0ee9f9adb918a9119fc4d05d3b19cf4ed73e4"} Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.963655 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 
10:51:28.963679 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.964901 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.964944 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.964959 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.965839 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.965875 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:28 crc kubenswrapper[4766]: I1002 10:51:28.965888 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.076630 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.078662 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.078719 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.078730 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.078758 4766 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 10:51:29 crc kubenswrapper[4766]: E1002 10:51:29.079326 4766 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.200:6443: connect: connection refused" node="crc" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.636338 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:51:29 crc kubenswrapper[4766]: W1002 10:51:29.678899 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Oct 02 10:51:29 crc kubenswrapper[4766]: E1002 10:51:29.678985 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.826653 4766 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 
10:51:29.837175 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:51:29 crc kubenswrapper[4766]: W1002 10:51:29.841126 4766 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Oct 02 10:51:29 crc kubenswrapper[4766]: E1002 10:51:29.841285 4766 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.200:6443: connect: connection refused" logger="UnhandledError" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.974826 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1c1e2fa8026fcad3886fdfffb123c6fd95722ec22115ddcbcfc478326b301094"} Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.974885 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df"} Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.974920 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.976124 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.976151 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.976161 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.977664 4766 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="be763462d66bc78259e0217785ee9280b921fb1da9d576731b0a68d82a0fe3e6" exitCode=0 Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.977784 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.977838 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.977847 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.978449 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"be763462d66bc78259e0217785ee9280b921fb1da9d576731b0a68d82a0fe3e6"} Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.978540 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.979105 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.979170 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.979179 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.979213 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.979227 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.979253 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.979266 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.979274 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.979184 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.979986 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.980048 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:29 crc kubenswrapper[4766]: I1002 10:51:29.980059 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:30 crc kubenswrapper[4766]: I1002 10:51:30.826522 4766 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.200:6443: connect: connection refused Oct 02 10:51:30 crc kubenswrapper[4766]: I1002 10:51:30.983939 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e35460bafc4d9ad68045d3879f7dba6690b2dca730672f61bafa0e3a8d82c02c"} Oct 02 10:51:30 crc kubenswrapper[4766]: I1002 10:51:30.984007 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:30 crc kubenswrapper[4766]: I1002 10:51:30.984059 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:30 crc kubenswrapper[4766]: I1002 10:51:30.984126 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:30 crc kubenswrapper[4766]: I1002 10:51:30.988588 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:30 crc kubenswrapper[4766]: I1002 10:51:30.988719 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:30 crc kubenswrapper[4766]: I1002 10:51:30.988832 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:51:30 crc kubenswrapper[4766]: I1002 10:51:30.988896 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:30 crc kubenswrapper[4766]: I1002 10:51:30.988859 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:30 crc kubenswrapper[4766]: I1002 10:51:30.989059 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.630530 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.631354 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.633623 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.633664 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.633676 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.637351 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.988366 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.990280 4766 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1c1e2fa8026fcad3886fdfffb123c6fd95722ec22115ddcbcfc478326b301094" exitCode=255 Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.990375 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1c1e2fa8026fcad3886fdfffb123c6fd95722ec22115ddcbcfc478326b301094"} Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.990416 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.991463 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.991541 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.991556 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.992068 4766 scope.go:117] "RemoveContainer" containerID="1c1e2fa8026fcad3886fdfffb123c6fd95722ec22115ddcbcfc478326b301094" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.994269 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"81594c415f91899483dd0aaa9b4c7579954111d17a09fe4cd1772a1b7e30e828"} Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.994305 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.994316 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"51b2f1d11586bb63a7a938d67ffe2d79e0737bd5cfd5843516729a47bb8e10dc"} Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.994378 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.995255 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.995297 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:31 crc kubenswrapper[4766]: I1002 10:51:31.995313 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:32 crc kubenswrapper[4766]: I1002 10:51:32.280376 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:32 crc kubenswrapper[4766]: I1002 10:51:32.281700 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:32 crc kubenswrapper[4766]: I1002 10:51:32.281739 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:32 crc kubenswrapper[4766]: I1002 10:51:32.281751 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:32 crc kubenswrapper[4766]: I1002 10:51:32.281772 4766 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 10:51:32 crc kubenswrapper[4766]: I1002 10:51:32.637338 4766 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 02 10:51:32 crc kubenswrapper[4766]: I1002 10:51:32.637458 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 02 10:51:32 crc kubenswrapper[4766]: I1002 10:51:32.998804 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 02 10:51:33 crc kubenswrapper[4766]: I1002 10:51:33.000775 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a"} Oct 02 10:51:33 crc kubenswrapper[4766]: I1002 10:51:33.000869 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:33 crc kubenswrapper[4766]: I1002 10:51:33.001645 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:33 crc kubenswrapper[4766]: I1002 10:51:33.001683 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:33 crc kubenswrapper[4766]: I1002 10:51:33.001692 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:33 crc kubenswrapper[4766]: I1002 10:51:33.004852 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3e713fc811a7629ea42b01bdb34bdfbe8a623903dc45e3c25ace5c4b3f1ab478"} Oct 02 10:51:33 crc kubenswrapper[4766]: I1002 10:51:33.004890 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9fec99f47c0eecc37f54abcdeb52db64e8c976a72f51715f62886146e0828a41"} Oct 02 10:51:33 crc kubenswrapper[4766]: I1002 10:51:33.004907 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:33 crc kubenswrapper[4766]: I1002 10:51:33.004978 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:33 crc kubenswrapper[4766]: I1002 10:51:33.005593 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:33 crc kubenswrapper[4766]: I1002 10:51:33.005629 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:33 crc kubenswrapper[4766]: I1002 10:51:33.005640 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:33 crc kubenswrapper[4766]: I1002 10:51:33.006556 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:33 crc kubenswrapper[4766]: I1002 10:51:33.006580 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:33 crc kubenswrapper[4766]: I1002 10:51:33.006588 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:33 crc kubenswrapper[4766]: I1002 10:51:33.948997 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:34 crc kubenswrapper[4766]: I1002 10:51:34.008309 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:34 crc kubenswrapper[4766]: I1002 10:51:34.008342 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:34 crc kubenswrapper[4766]: I1002 10:51:34.008309 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:34 crc kubenswrapper[4766]: I1002 10:51:34.009428 4766 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:34 crc kubenswrapper[4766]: I1002 10:51:34.009465 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:34 crc kubenswrapper[4766]: I1002 10:51:34.009479 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:34 crc kubenswrapper[4766]: I1002 10:51:34.009871 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:34 crc kubenswrapper[4766]: I1002 10:51:34.009915 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:34 crc kubenswrapper[4766]: I1002 10:51:34.009925 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:34 crc kubenswrapper[4766]: I1002 10:51:34.018528 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:35 crc kubenswrapper[4766]: I1002 10:51:35.011553 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:35 crc kubenswrapper[4766]: I1002 10:51:35.012595 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:35 crc kubenswrapper[4766]: I1002 10:51:35.012635 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:35 crc kubenswrapper[4766]: I1002 10:51:35.012650 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:35 crc kubenswrapper[4766]: I1002 10:51:35.627713 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 02 10:51:35 crc kubenswrapper[4766]: I1002 10:51:35.628060 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:35 crc kubenswrapper[4766]: I1002 10:51:35.630301 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:35 crc kubenswrapper[4766]: I1002 10:51:35.630367 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:35 crc kubenswrapper[4766]: I1002 10:51:35.630379 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:35 crc kubenswrapper[4766]: E1002 10:51:35.970604 4766 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 02 10:51:36 crc kubenswrapper[4766]: I1002 10:51:36.014056 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:36 crc kubenswrapper[4766]: I1002 10:51:36.015706 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:36 crc kubenswrapper[4766]: I1002 10:51:36.015763 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:36 crc kubenswrapper[4766]: I1002 10:51:36.015778 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 02 10:51:36 crc kubenswrapper[4766]: I1002 10:51:36.539541 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:51:36 crc kubenswrapper[4766]: I1002 10:51:36.539724 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:36 crc kubenswrapper[4766]: I1002 10:51:36.540909 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:36 crc kubenswrapper[4766]: I1002 10:51:36.540980 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:36 crc kubenswrapper[4766]: I1002 10:51:36.541016 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:37 crc kubenswrapper[4766]: I1002 10:51:37.337979 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 02 10:51:37 crc kubenswrapper[4766]: I1002 10:51:37.338183 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:37 crc kubenswrapper[4766]: I1002 10:51:37.339470 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:37 crc kubenswrapper[4766]: I1002 10:51:37.339565 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:37 crc kubenswrapper[4766]: I1002 10:51:37.339580 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:41 crc kubenswrapper[4766]: I1002 10:51:41.827074 4766 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 02 10:51:42 crc kubenswrapper[4766]: E1002 10:51:42.040846 4766 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Oct 02 10:51:42 crc kubenswrapper[4766]: I1002 10:51:42.162313 4766 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 02 10:51:42 crc kubenswrapper[4766]: I1002 10:51:42.162975 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 02 10:51:42 crc kubenswrapper[4766]: I1002 10:51:42.169562 4766 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: 
User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Oct 02 10:51:42 crc kubenswrapper[4766]: I1002 10:51:42.169630 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 02 10:51:42 crc kubenswrapper[4766]: I1002 10:51:42.637722 4766 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 02 10:51:42 crc kubenswrapper[4766]: I1002 10:51:42.637821 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 02 10:51:43 crc kubenswrapper[4766]: I1002 10:51:43.539628 4766 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 02 10:51:43 crc kubenswrapper[4766]: I1002 10:51:43.539705 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 02 10:51:44 crc kubenswrapper[4766]: I1002 10:51:44.023138 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:44 crc kubenswrapper[4766]: I1002 10:51:44.023301 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:44 crc kubenswrapper[4766]: I1002 10:51:44.023959 4766 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 02 10:51:44 crc kubenswrapper[4766]: I1002 10:51:44.024028 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 02 10:51:44 crc kubenswrapper[4766]: I1002 10:51:44.024583 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:44 crc kubenswrapper[4766]: I1002 10:51:44.024621 
4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:44 crc kubenswrapper[4766]: I1002 10:51:44.024630 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:44 crc kubenswrapper[4766]: I1002 10:51:44.027189 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:44 crc kubenswrapper[4766]: I1002 10:51:44.039933 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:44 crc kubenswrapper[4766]: I1002 10:51:44.040584 4766 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 02 10:51:44 crc kubenswrapper[4766]: I1002 10:51:44.040667 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 02 10:51:44 crc kubenswrapper[4766]: I1002 10:51:44.041162 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:44 crc kubenswrapper[4766]: I1002 10:51:44.041204 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:44 crc kubenswrapper[4766]: I1002 10:51:44.041214 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:45 crc kubenswrapper[4766]: I1002 10:51:45.653943 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 02 10:51:45 crc kubenswrapper[4766]: I1002 10:51:45.654134 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:45 crc kubenswrapper[4766]: I1002 10:51:45.655078 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:45 crc kubenswrapper[4766]: I1002 10:51:45.655102 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:45 crc kubenswrapper[4766]: I1002 10:51:45.655111 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:45 crc kubenswrapper[4766]: I1002 10:51:45.664637 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 02 10:51:45 crc kubenswrapper[4766]: E1002 10:51:45.970848 4766 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 02 10:51:46 crc kubenswrapper[4766]: I1002 10:51:46.045247 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:46 crc kubenswrapper[4766]: I1002 10:51:46.048871 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:46 crc kubenswrapper[4766]: I1002 10:51:46.048978 4766 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:46 crc kubenswrapper[4766]: I1002 10:51:46.048998 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:46 crc kubenswrapper[4766]: I1002 10:51:46.421938 4766 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 02 10:51:46 crc kubenswrapper[4766]: I1002 10:51:46.422028 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 02 10:51:46 crc kubenswrapper[4766]: I1002 10:51:46.488988 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:51:46 crc kubenswrapper[4766]: I1002 10:51:46.489143 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:46 crc kubenswrapper[4766]: I1002 10:51:46.490478 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:46 crc kubenswrapper[4766]: I1002 10:51:46.490585 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:46 crc kubenswrapper[4766]: I1002 10:51:46.490603 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.133515 4766 trace.go:236] Trace[830170665]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 10:51:35.462) (total time: 11671ms): Oct 02 10:51:47 crc kubenswrapper[4766]: Trace[830170665]: ---"Objects listed" error: 11671ms (10:51:47.133) Oct 02 10:51:47 crc kubenswrapper[4766]: Trace[830170665]: [11.671410321s] [11.671410321s] END Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.133549 4766 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.133742 4766 trace.go:236] Trace[191986894]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 10:51:34.394) (total time: 12738ms): Oct 02 10:51:47 crc kubenswrapper[4766]: Trace[191986894]: ---"Objects listed" error: 12738ms (10:51:47.133) Oct 02 10:51:47 crc kubenswrapper[4766]: Trace[191986894]: [12.738956719s] [12.738956719s] END Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.133774 4766 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 02 10:51:47 crc kubenswrapper[4766]: E1002 10:51:47.135287 4766 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.135967 4766 trace.go:236] Trace[1761801713]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 10:51:33.947) (total time: 
13188ms): Oct 02 10:51:47 crc kubenswrapper[4766]: Trace[1761801713]: ---"Objects listed" error: 13188ms (10:51:47.135) Oct 02 10:51:47 crc kubenswrapper[4766]: Trace[1761801713]: [13.188708973s] [13.188708973s] END Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.135998 4766 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.135972 4766 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.136303 4766 trace.go:236] Trace[1310044974]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Oct-2025 10:51:34.845) (total time: 12290ms): Oct 02 10:51:47 crc kubenswrapper[4766]: Trace[1310044974]: ---"Objects listed" error: 12290ms (10:51:47.136) Oct 02 10:51:47 crc kubenswrapper[4766]: Trace[1310044974]: [12.29088689s] [12.29088689s] END Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.136321 4766 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.817410 4766 apiserver.go:52] "Watching apiserver" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.828740 4766 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.829656 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5vgtz","openshift-machine-config-operator/machine-config-daemon-l99lx","openshift-multus/multus-2jxdg","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-node-27vgl","openshift-multus/multus-additional-cni-plugins-4wx78","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.830194 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.830572 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 10:51:47 crc kubenswrapper[4766]: E1002 10:51:47.831358 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.831681 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:51:47 crc kubenswrapper[4766]: E1002 10:51:47.831770 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.831856 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.831958 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.832353 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:47 crc kubenswrapper[4766]: E1002 10:51:47.832417 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.832722 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5vgtz" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.835386 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.835734 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.835895 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.836180 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.836280 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.836706 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.836718 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.837120 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.837264 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.837391 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.837946 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.838068 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.838209 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.839141 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.839368 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.840017 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.840143 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.840233 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.840468 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.840668 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.840693 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.841254 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.841323 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.841272 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.842013 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.842037 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.842165 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.842168 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.842201 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 02 10:51:47 crc 
kubenswrapper[4766]: I1002 10:51:47.842248 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.842307 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.842349 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.842443 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.842570 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.843304 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.854414 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.865260 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.875769 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.885660 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.919031 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.930903 4766 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.931092 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.939868 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.939935 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.939914 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.939964 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940265 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940304 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940329 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940355 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940382 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940407 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940426 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940450 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940472 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940493 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940614 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940652 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940676 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940699 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940722 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940740 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940761 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940780 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940815 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940841 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940862 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940880 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940900 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940923 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940944 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940962 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.940983 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941004 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941027 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941024 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941048 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941046 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941086 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941119 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941150 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941192 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941232 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941266 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941296 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941298 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941320 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941344 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941420 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941443 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941472 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941496 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941556 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941589 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941608 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941626 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941657 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941675 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941693 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941728 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941746 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941796 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941812 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941828 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941843 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941859 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941875 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941892 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941908 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941928 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941946 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941964 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941981 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942000 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942020 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942036 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942062 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942103 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942123 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942148 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942163 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942178 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942195 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942212 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942229 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942250 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942268 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942285 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942304 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942323 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942341 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942362 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942382 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942400 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942419 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942437 4766 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942455 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942473 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942490 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942524 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942542 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942562 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942580 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942601 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942619 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 10:51:47 crc 
kubenswrapper[4766]: I1002 10:51:47.942638 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942655 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942671 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942700 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942720 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942739 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942756 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942776 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942796 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942815 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942840 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942890 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942916 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942939 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942959 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942975 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.942995 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943015 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943043 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943069 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943100 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943130 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943153 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943173 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943194 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943218 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943240 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943262 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943597 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943624 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943645 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943670 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943705 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943760 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943779 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943799 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943818 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943835 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943854 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943875 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943906 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943927 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943947 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943965 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943985 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944002 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944022 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944039 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944059 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 02 10:51:47 crc 
kubenswrapper[4766]: I1002 10:51:47.944099 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944121 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944138 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944168 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944186 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944204 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944223 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944242 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944262 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944282 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 02 10:51:47 crc 
kubenswrapper[4766]: I1002 10:51:47.944302 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944326 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944345 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944364 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944385 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944403 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944422 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944442 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944462 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944480 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944518 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944626 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944661 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944690 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944709 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944726 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944745 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944767 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944788 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944805 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944824 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944843 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944878 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944896 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944914 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944935 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944953 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944971 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944991 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945013 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945035 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945052 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945083 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945113 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945143 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945162 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945179 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945246 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945291 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-systemd\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945314 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945334 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b897c4d4-6c9c-4d4b-a684-d66c59d00190-cnibin\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945353 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-cni-netd\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945371 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-run-multus-certs\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945388 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-node-log\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945406 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945426 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlxkt\" (UniqueName: \"kubernetes.io/projected/317f8a30-cdef-4f82-832c-4f3bc2674379-kube-api-access-jlxkt\") pod \"node-resolver-5vgtz\" (UID: \"317f8a30-cdef-4f82-832c-4f3bc2674379\") " pod="openshift-dns/node-resolver-5vgtz" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945444 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-var-lib-cni-bin\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945463 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-kubelet\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945478 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-systemd-units\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945495 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-run-netns\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945533 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx6lz\" (UniqueName: \"kubernetes.io/projected/b897c4d4-6c9c-4d4b-a684-d66c59d00190-kube-api-access-rx6lz\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945550 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b897c4d4-6c9c-4d4b-a684-d66c59d00190-os-release\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945566 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovnkube-config\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945581 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-multus-cni-dir\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945599 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-var-lib-kubelet\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945615 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-hostroot\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945634 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/317f8a30-cdef-4f82-832c-4f3bc2674379-hosts-file\") pod \"node-resolver-5vgtz\" (UID: \"317f8a30-cdef-4f82-832c-4f3bc2674379\") " pod="openshift-dns/node-resolver-5vgtz" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945652 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-cnibin\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945671 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-var-lib-openvswitch\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945693 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945716 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-ovn\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945748 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovnkube-script-lib\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945781 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945806 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b897c4d4-6c9c-4d4b-a684-d66c59d00190-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945830 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-etc-openvswitch\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945857 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-env-overrides\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945882 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945900 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-var-lib-cni-multus\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945922 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945940 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-run-netns\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945961 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cd484f43-26b6-4e55-b872-7502e8d6e8c7-rootfs\") pod \"machine-config-daemon-l99lx\" (UID: \"cd484f43-26b6-4e55-b872-7502e8d6e8c7\") " pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945981 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd484f43-26b6-4e55-b872-7502e8d6e8c7-proxy-tls\") pod \"machine-config-daemon-l99lx\" (UID: \"cd484f43-26b6-4e55-b872-7502e8d6e8c7\") " pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946001 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946021 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") 
pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946044 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946068 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-run-k8s-cni-cncf-io\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946097 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-slash\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946125 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946159 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-run-ovn-kubernetes\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946188 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946214 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946236 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovn-node-metrics-cert\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946257 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946276 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-cni-binary-copy\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946294 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fb7h\" (UniqueName: \"kubernetes.io/projected/11cc785e-5bdc-4827-913a-4d899eb5a83c-kube-api-access-9fb7h\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946312 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-multus-socket-dir-parent\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946330 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-multus-conf-dir\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946345 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-etc-kubernetes\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946369 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946385 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-openvswitch\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946403 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b897c4d4-6c9c-4d4b-a684-d66c59d00190-cni-binary-copy\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " 
pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946421 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-cni-bin\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946437 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-system-cni-dir\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946453 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-multus-daemon-config\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946471 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pn5s\" (UniqueName: \"kubernetes.io/projected/cd484f43-26b6-4e55-b872-7502e8d6e8c7-kube-api-access-5pn5s\") pod \"machine-config-daemon-l99lx\" (UID: \"cd484f43-26b6-4e55-b872-7502e8d6e8c7\") " pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946488 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-log-socket\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946526 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-os-release\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946547 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwgcg\" (UniqueName: \"kubernetes.io/projected/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-kube-api-access-fwgcg\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946568 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cd484f43-26b6-4e55-b872-7502e8d6e8c7-mcd-auth-proxy-config\") pod \"machine-config-daemon-l99lx\" (UID: \"cd484f43-26b6-4e55-b872-7502e8d6e8c7\") " pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946585 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/b897c4d4-6c9c-4d4b-a684-d66c59d00190-system-cni-dir\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946621 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b897c4d4-6c9c-4d4b-a684-d66c59d00190-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946699 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946711 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946723 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941713 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.941918 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943331 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.981034 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943426 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.943492 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944661 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.944826 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.945654 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.946066 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.947490 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.948960 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.950867 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.951284 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.951708 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.979700 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.979995 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.980226 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.980466 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.981193 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.980564 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.980784 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.980953 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.981111 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.981271 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.981322 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.981383 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.981380 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.981469 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.981477 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.981543 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.981589 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.981641 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.981706 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.981813 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.981893 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.981896 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.982002 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.982088 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.982171 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.982327 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.982905 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.983112 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.983304 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.983495 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.983757 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.996031 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.996646 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.996985 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.997183 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.997217 4766 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.997398 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.997694 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.998120 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.998462 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.998468 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.998489 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.998619 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.998759 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.998871 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: E1002 10:51:47.999070 4766 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:51:47 crc kubenswrapper[4766]: E1002 10:51:47.999141 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:51:48.499121063 +0000 UTC m=+23.441992077 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.999220 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.999269 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.999436 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.999845 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:47 crc kubenswrapper[4766]: I1002 10:51:47.999932 4766 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.000160 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.000317 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.000617 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.000746 4766 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.001365 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:51:48.501344223 +0000 UTC m=+23.444215247 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.001744 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.001805 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.003589 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.004065 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.004341 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.004672 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.007458 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.008876 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.010224 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.011570 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.012326 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.012823 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.013712 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.016218 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.017684 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.017725 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.017740 4766 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.017808 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 10:51:48.517787888 +0000 UTC m=+23.460658912 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.019084 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.019251 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.020118 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.020359 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.021237 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.021552 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.025762 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.025823 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.027209 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.027456 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.027705 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.027778 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.028055 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.028257 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.028649 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.029091 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.029487 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.032247 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.039701 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.040773 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.045539 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.045606 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.045847 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.046387 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.048253 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.048282 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.048299 4766 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.048342 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 10:51:48.548327683 +0000 UTC m=+23.491198627 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.060816 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.061243 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.061626 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.061885 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.062282 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.062637 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.062883 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.063249 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.063427 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.063605 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.063797 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.064878 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.065215 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.066064 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.068579 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovnkube-config\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.068621 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-multus-cni-dir\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.068648 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-var-lib-kubelet\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.070898 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:51:48.570870898 +0000 UTC m=+23.513741842 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.070974 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.071215 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.071828 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-hostroot\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072068 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072087 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b897c4d4-6c9c-4d4b-a684-d66c59d00190-os-release\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072178 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-cnibin\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072214 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-var-lib-openvswitch\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072243 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/317f8a30-cdef-4f82-832c-4f3bc2674379-hosts-file\") pod \"node-resolver-5vgtz\" (UID: \"317f8a30-cdef-4f82-832c-4f3bc2674379\") " pod="openshift-dns/node-resolver-5vgtz" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072271 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-ovn\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072291 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovnkube-script-lib\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072312 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-etc-openvswitch\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072330 
4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-env-overrides\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072362 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-var-lib-cni-multus\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072384 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b897c4d4-6c9c-4d4b-a684-d66c59d00190-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072407 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-run-netns\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072430 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cd484f43-26b6-4e55-b872-7502e8d6e8c7-rootfs\") pod \"machine-config-daemon-l99lx\" (UID: \"cd484f43-26b6-4e55-b872-7502e8d6e8c7\") " pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072455 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd484f43-26b6-4e55-b872-7502e8d6e8c7-proxy-tls\") pod \"machine-config-daemon-l99lx\" (UID: \"cd484f43-26b6-4e55-b872-7502e8d6e8c7\") " pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072524 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-run-k8s-cni-cncf-io\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072550 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-slash\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072578 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072601 4766 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-run-ovn-kubernetes\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072631 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovn-node-metrics-cert\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072654 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072674 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-cni-binary-copy\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072698 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-multus-socket-dir-parent\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072726 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-multus-conf-dir\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072760 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-etc-kubernetes\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072726 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-etc-openvswitch\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072804 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-openvswitch\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072829 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fb7h\" (UniqueName: 
\"kubernetes.io/projected/11cc785e-5bdc-4827-913a-4d899eb5a83c-kube-api-access-9fb7h\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072850 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b897c4d4-6c9c-4d4b-a684-d66c59d00190-cni-binary-copy\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072874 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-cni-bin\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072894 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-system-cni-dir\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072922 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-multus-daemon-config\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072946 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-log-socket\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072967 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-os-release\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072159 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b897c4d4-6c9c-4d4b-a684-d66c59d00190-os-release\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.072989 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwgcg\" (UniqueName: \"kubernetes.io/projected/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-kube-api-access-fwgcg\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073124 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cd484f43-26b6-4e55-b872-7502e8d6e8c7-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-l99lx\" (UID: \"cd484f43-26b6-4e55-b872-7502e8d6e8c7\") " pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073148 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pn5s\" (UniqueName: \"kubernetes.io/projected/cd484f43-26b6-4e55-b872-7502e8d6e8c7-kube-api-access-5pn5s\") pod \"machine-config-daemon-l99lx\" (UID: \"cd484f43-26b6-4e55-b872-7502e8d6e8c7\") " pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073161 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073169 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b897c4d4-6c9c-4d4b-a684-d66c59d00190-system-cni-dir\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073218 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b897c4d4-6c9c-4d4b-a684-d66c59d00190-system-cni-dir\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073225 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b897c4d4-6c9c-4d4b-a684-d66c59d00190-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073310 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-systemd\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073348 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b897c4d4-6c9c-4d4b-a684-d66c59d00190-cnibin\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073375 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-cni-netd\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073399 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-run-multus-certs\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073422 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073453 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlxkt\" (UniqueName: \"kubernetes.io/projected/317f8a30-cdef-4f82-832c-4f3bc2674379-kube-api-access-jlxkt\") pod \"node-resolver-5vgtz\" (UID: \"317f8a30-cdef-4f82-832c-4f3bc2674379\") " pod="openshift-dns/node-resolver-5vgtz" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073519 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-var-lib-cni-bin\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073548 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-kubelet\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073571 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-systemd-units\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073946 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-env-overrides\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.074025 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-multus-socket-dir-parent\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.074068 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-multus-conf-dir\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.074106 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-etc-kubernetes\") pod \"multus-2jxdg\" (UID: 
\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.074137 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-openvswitch\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.074200 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-cni-binary-copy\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.074271 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-var-lib-cni-multus\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075068 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-run-netns\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075136 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cd484f43-26b6-4e55-b872-7502e8d6e8c7-rootfs\") pod \"machine-config-daemon-l99lx\" (UID: \"cd484f43-26b6-4e55-b872-7502e8d6e8c7\") " pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075108 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-run-netns\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075172 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-node-log\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075223 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx6lz\" (UniqueName: \"kubernetes.io/projected/b897c4d4-6c9c-4d4b-a684-d66c59d00190-kube-api-access-rx6lz\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075451 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075486 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075535 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075551 4766 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075566 4766 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075579 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075594 4766 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075606 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075619 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075632 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075644 4766 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075657 4766 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075670 4766 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075682 4766 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075695 4766 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075709 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075723 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075739 4766 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075754 4766 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075767 4766 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075780 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075795 4766 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075808 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075822 4766 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075834 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075847 4766 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075860 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075872 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075916 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075929 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075941 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075956 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075968 4766 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075981 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075996 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076009 4766 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076021 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076036 4766 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076050 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076063 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076075 4766 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc 
kubenswrapper[4766]: I1002 10:51:48.076088 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076102 4766 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076113 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076127 4766 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076138 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076151 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076163 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076177 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076190 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076202 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076215 4766 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076227 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076240 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076254 
4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076267 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076280 4766 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076294 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076308 4766 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076322 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076336 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076349 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076363 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076377 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076390 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076402 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076414 4766 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076427 
4766 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076439 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076451 4766 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076463 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076476 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076489 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076523 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076537 4766 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076555 4766 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076568 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076580 4766 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076595 4766 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076611 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076625 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" 
(UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076637 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076650 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076662 4766 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076677 4766 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076690 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076702 4766 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076714 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076730 4766 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076744 4766 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076757 4766 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076770 4766 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076783 4766 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076797 4766 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076809 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076821 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076834 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076849 4766 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076862 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076874 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076887 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076899 4766 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076913 4766 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076928 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076941 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076955 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076969 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" 
(UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076982 4766 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.076996 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.077011 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.077025 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.077038 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.078741 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b897c4d4-6c9c-4d4b-a684-d66c59d00190-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.078810 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-systemd\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.079035 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b897c4d4-6c9c-4d4b-a684-d66c59d00190-cnibin\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.079068 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-cni-netd\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.079094 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-run-multus-certs\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.079118 4766 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073420 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.073443 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.074132 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.077896 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.079003 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.079410 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-var-lib-cni-bin\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.079450 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-kubelet\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.079480 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-systemd-units\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.080394 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-run-netns\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.075076 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b897c4d4-6c9c-4d4b-a684-d66c59d00190-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.080458 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-node-log\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.080784 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-hostroot\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.080862 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-multus-cni-dir\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.080897 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-var-lib-kubelet\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.081416 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-cnibin\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.081918 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.082363 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.085140 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.085819 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/317f8a30-cdef-4f82-832c-4f3bc2674379-hosts-file\") pod \"node-resolver-5vgtz\" (UID: \"317f8a30-cdef-4f82-832c-4f3bc2674379\") " pod="openshift-dns/node-resolver-5vgtz" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.085911 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-var-lib-openvswitch\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.086645 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-ovn\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.087882 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-log-socket\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.088651 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.088742 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-host-run-k8s-cni-cncf-io\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.089759 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-os-release\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.089866 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-run-ovn-kubernetes\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.088781 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-slash\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.090322 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.090482 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.090628 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-system-cni-dir\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.090667 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.090605 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-cni-bin\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.091535 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.093786 4766 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a" exitCode=255
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.093841 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a"}
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.093895 4766 scope.go:117] "RemoveContainer" containerID="1c1e2fa8026fcad3886fdfffb123c6fd95722ec22115ddcbcfc478326b301094"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.097193 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx6lz\" (UniqueName: \"kubernetes.io/projected/b897c4d4-6c9c-4d4b-a684-d66c59d00190-kube-api-access-rx6lz\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.101823 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlxkt\" (UniqueName: \"kubernetes.io/projected/317f8a30-cdef-4f82-832c-4f3bc2674379-kube-api-access-jlxkt\") pod \"node-resolver-5vgtz\" (UID: \"317f8a30-cdef-4f82-832c-4f3bc2674379\") " pod="openshift-dns/node-resolver-5vgtz"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.106890 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.110395 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.110749 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b897c4d4-6c9c-4d4b-a684-d66c59d00190-cni-binary-copy\") pod \"multus-additional-cni-plugins-4wx78\" (UID: \"b897c4d4-6c9c-4d4b-a684-d66c59d00190\") " pod="openshift-multus/multus-additional-cni-plugins-4wx78"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.110823 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.120284 4766 scope.go:117] "RemoveContainer" containerID="ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a"
Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.120546 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.129045 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.130546 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
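
Note: every status_manager.go:875 failure above shares one root cause: the API server must consult the pod.network-node-identity.openshift.io admission webhook before accepting the kubelet's status patch, and that webhook (served by the network-node-identity pod on 127.0.0.1:9743) is itself still being recreated after the restart. A minimal, illustrative Go sketch of the failing call (the file name and empty body are assumptions; only the URL comes from the log):

// webhookprobe.go - reproduces the connection the kubelet's patch path needs:
// POST https://127.0.0.1:9743/pod?timeout=10s. While the webhook pod is down
// this prints "connect: connection refused", matching the errors above.
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"strings"
	"time"
)

func main() {
	client := &http.Client{
		Timeout: 10 * time.Second,
		Transport: &http.Transport{
			// The webhook serves a cluster-internal certificate; skip
			// verification for this connectivity probe only.
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Post("https://127.0.0.1:9743/pod?timeout=10s",
		"application/json", strings.NewReader("{}"))
	if err != nil {
		fmt.Println("webhook unreachable:", err) // expected while the pod is down
		return
	}
	defer resp.Body.Close()
	fmt.Println("webhook answered:", resp.Status)
}

Once the network-node-identity pod is back, the same probe should return an HTTP status instead of a dial error, and the queued status patches go through.
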
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.141842 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.151793 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.152341 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.152338 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.152748 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.152892 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.153147 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.156813 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.157884 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.161279 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
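
Note: the pod_workers.go:1301 entry above ("back-off 10s restarting failed container=kube-apiserver-check-endpoints") is the first rung of the kubelet's CrashLoopBackOff ladder. In stock kubelets the delay starts at 10s and doubles on each failed restart, capped at 5m; a tiny sketch of that schedule (the constants reflect upstream defaults, not anything configured in this log):

// backoff.go - prints the CrashLoopBackOff delay ladder assumed by the
// "back-off 10s restarting failed" message above (10s base, x2 per
// restart, 5m cap per stock kubelet defaults).
package main

import (
	"fmt"
	"time"
)

func main() {
	const maxDelay = 5 * time.Minute
	delay := 10 * time.Second
	for i := 1; delay <= maxDelay; i++ {
		fmt.Printf("restart %d: wait %v\n", i, delay)
		delay *= 2
	}
	fmt.Println("all later restarts: wait", maxDelay)
}

A successful run of the container for long enough (10 minutes upstream) resets the ladder back to 10s.
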
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.169829 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.175826 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.179605 4766 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.179641 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.179654 4766 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.179666 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.179678 4766 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.179692 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.179705 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.179717 4766 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.179730 4766 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.179741 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.179753 4766 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.179924 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5vgtz"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.180092 4766 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.180387 4766 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.180408 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.180418 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.180430 4766 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.180441 4766 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.187331 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
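
Note: the burst of reconciler_common.go:293 "Volume detached" records above is the volume manager confirming teardown of mounts that belonged to pods deleted before the restart, interleaved with fresh MountVolume.SetUp work for the pods being recreated. A throwaway sketch (names are illustrative) for tallying these phases from a saved journal, e.g. `journalctl -u kubelet | go run tally.go`:

// tally.go - counts the main kubelet volume/status events per log site;
// feed it this log (or `journalctl -u kubelet`) on stdin.
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // status-patch lines are very long
	for sc.Scan() {
		line := sc.Text()
		switch {
		case strings.Contains(line, "Volume detached for volume"):
			counts["volume detached"]++
		case strings.Contains(line, "UnmountVolume.TearDown succeeded"):
			counts["teardown succeeded"]++
		case strings.Contains(line, "MountVolume.SetUp succeeded"):
			counts["setup succeeded"]++
		case strings.Contains(line, "Failed to update status for pod"):
			counts["status patch failed"]++
		}
	}
	for k, v := range counts {
		fmt.Printf("%-22s %d\n", k, v)
	}
}
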
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.187617 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.188738 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.188991 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.189035 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.189167 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.189732 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.189851 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovnkube-script-lib\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.199193 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.214530 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.218118 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.218276 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.218435 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.218894 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.219173 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwgcg\" (UniqueName: \"kubernetes.io/projected/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-kube-api-access-fwgcg\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg"
Oct 02 10:51:48 crc kubenswrapper[4766]: W1002 10:51:48.219326 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-6fdd35635ed2b811793119e17e30c6598ee2e4c6eeac370914426f60b6247d2c WatchSource:0}: Error finding container 6fdd35635ed2b811793119e17e30c6598ee2e4c6eeac370914426f60b6247d2c: Status 404 returned error can't find the container with id 6fdd35635ed2b811793119e17e30c6598ee2e4c6eeac370914426f60b6247d2c
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.219492 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4wx78"
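
Note: each "Failed to update status" record embeds the would-be strategic-merge patch as a doubly escaped JSON string (escaped once by the err="..." field, once more by klog inside the patch quote). A sketch for inspecting those payloads offline; the file name is illustrative and the excerpt below is a trimmed sample of one payload from this log:

// patchdump.go - unescapes an embedded status patch twice, then pretty-prints
// it, so the $setElementOrder/conditions structure is readable.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"log"
	"strconv"
)

func main() {
	// Excerpt of an escaped payload exactly as it appears in the journal.
	escaped := `{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"}}`
	once, err := strconv.Unquote(`"` + escaped + `"`) // first unescape pass
	if err != nil {
		log.Fatal(err)
	}
	twice, err := strconv.Unquote(`"` + once + `"`) // second unescape pass
	if err != nil {
		log.Fatal(err)
	}
	var pretty bytes.Buffer
	if err := json.Indent(&pretty, []byte(twice), "", "  "); err != nil {
		log.Fatal(err)
	}
	fmt.Println(pretty.String()) // {"metadata": {"uid": "..."}}
}
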
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.222089 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.225209 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.233119 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.249299 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.260161 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.260319 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.260574 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.260746 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.260986 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.261153 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.261315 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.261535 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.261554 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.262026 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.262224 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovnkube-config\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.265033 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.265172 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.265402 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.269229 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.281828 4766 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.281856 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.281865 4766 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.281875 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.281884 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.281894 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.281904 4766 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.281915 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.281923 4766 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.281933 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.281943 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.281952 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.282002 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.282014 4766 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.282051 4766 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.282327 4766 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.282337 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.282347 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.282355 4766 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.282364 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.282373 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.282381 4766 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.282389 4766 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.282398 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.284084 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.284135 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.284213 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.284265 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.284264 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.284340 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.284662 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.284821 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.285091 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.286885 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.287520 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a6aa81c2-8c87-43df-badb-7b9dbef84ccf-multus-daemon-config\") pod \"multus-2jxdg\" (UID: \"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\") " pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.292181 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.292285 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.292353 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.292574 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.292964 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cd484f43-26b6-4e55-b872-7502e8d6e8c7-proxy-tls\") pod \"machine-config-daemon-l99lx\" (UID: \"cd484f43-26b6-4e55-b872-7502e8d6e8c7\") " pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.293218 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pn5s\" (UniqueName: \"kubernetes.io/projected/cd484f43-26b6-4e55-b872-7502e8d6e8c7-kube-api-access-5pn5s\") pod \"machine-config-daemon-l99lx\" (UID: \"cd484f43-26b6-4e55-b872-7502e8d6e8c7\") " pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.293462 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cd484f43-26b6-4e55-b872-7502e8d6e8c7-mcd-auth-proxy-config\") pod \"machine-config-daemon-l99lx\" (UID: \"cd484f43-26b6-4e55-b872-7502e8d6e8c7\") " pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.293672 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.293700 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.293826 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.293866 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.293880 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.294188 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.299264 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.321666 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.321764 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.322325 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.322397 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.323570 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.323745 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.323858 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.322088 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.324195 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.324363 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.324467 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.324513 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.325169 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fb7h\" (UniqueName: \"kubernetes.io/projected/11cc785e-5bdc-4827-913a-4d899eb5a83c-kube-api-access-9fb7h\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.325805 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovn-node-metrics-cert\") pod \"ovnkube-node-27vgl\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.325951 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.383552 4766 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.383925 4766 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.383939 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.383951 4766 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.383966 4766 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.383976 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.383984 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.383995 4766 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384004 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384012 4766 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384024 4766 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384031 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384044 4766 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384053 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384064 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384073 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384081 4766 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384095 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384104 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384116 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384125 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384132 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384146 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384156 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384165 4766 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" 
DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384177 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384186 4766 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384198 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384209 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384222 4766 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384254 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384263 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.384272 4766 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.490066 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.501950 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-2jxdg" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.530667 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.585894 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.586098 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 10:51:49.586063966 +0000 UTC m=+24.528934910 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.586172 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.586253 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.586349 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.586408 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.586402 4766 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.586514 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:51:49.586490389 +0000 UTC m=+24.529361323 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.586637 4766 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.587355 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:51:49.587319395 +0000 UTC m=+24.530190539 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.586660 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.587433 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.587455 4766 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.587522 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 10:51:49.58748756 +0000 UTC m=+24.530358524 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.586709 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.587567 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.587593 4766 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.587639 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 10:51:49.587625565 +0000 UTC m=+24.530496729 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.796656 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.880292 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:51:48 crc kubenswrapper[4766]: E1002 10:51:48.880459 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:51:48 crc kubenswrapper[4766]: I1002 10:51:48.890089 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:48 crc kubenswrapper[4766]: W1002 10:51:48.926984 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6aa81c2_8c87_43df_badb_7b9dbef84ccf.slice/crio-57188a77a6616ac2cb59749dd53147c1a8e471079c11e790237387e578361177 WatchSource:0}: Error finding container 57188a77a6616ac2cb59749dd53147c1a8e471079c11e790237387e578361177: Status 404 returned error can't find the container with id 57188a77a6616ac2cb59749dd53147c1a8e471079c11e790237387e578361177 Oct 02 10:51:48 crc kubenswrapper[4766]: W1002 10:51:48.942357 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd484f43_26b6_4e55_b872_7502e8d6e8c7.slice/crio-7175ef11bfb346f5bd12003cc088ad8d192019471a7c53bfa5b36e97d84caa20 WatchSource:0}: Error finding container 7175ef11bfb346f5bd12003cc088ad8d192019471a7c53bfa5b36e97d84caa20: Status 404 returned error can't find the container with id 7175ef11bfb346f5bd12003cc088ad8d192019471a7c53bfa5b36e97d84caa20 Oct 02 10:51:48 crc kubenswrapper[4766]: W1002 10:51:48.994021 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11cc785e_5bdc_4827_913a_4d899eb5a83c.slice/crio-ef3a0757191602abc70c5d4b1c5a8b503649299161f2b42799c0b1d3c4d87647 WatchSource:0}: Error finding container ef3a0757191602abc70c5d4b1c5a8b503649299161f2b42799c0b1d3c4d87647: Status 404 returned error can't find the container with id ef3a0757191602abc70c5d4b1c5a8b503649299161f2b42799c0b1d3c4d87647 Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.083812 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.090760 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.092784 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.092813 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.097890 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4cbd58ef1b15024912949805e9fe5af43a1a809c7f6cd494a7dffb96c58c537e"} Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.100532 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5vgtz" event={"ID":"317f8a30-cdef-4f82-832c-4f3bc2674379","Type":"ContainerStarted","Data":"4d956c9f248f8d05dc8721f40db2412e7e19b3d4a1bbb3146d4a4edc306fb5c0"} Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.101537 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5fc9f4cee823320bb575219e992c8b44d42ce4c67a91bfdaafc2848e4c9762f5"} Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.102640 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerStarted","Data":"ef3a0757191602abc70c5d4b1c5a8b503649299161f2b42799c0b1d3c4d87647"} Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.103646 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"7175ef11bfb346f5bd12003cc088ad8d192019471a7c53bfa5b36e97d84caa20"} Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.105471 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2jxdg" event={"ID":"a6aa81c2-8c87-43df-badb-7b9dbef84ccf","Type":"ContainerStarted","Data":"57188a77a6616ac2cb59749dd53147c1a8e471079c11e790237387e578361177"} Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.107018 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" event={"ID":"b897c4d4-6c9c-4d4b-a684-d66c59d00190","Type":"ContainerStarted","Data":"30dfb756b4e5e08f9fed2cd11deb61860c641cdff5c1103dfca2d5c6b06d3669"} Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.108957 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6fdd35635ed2b811793119e17e30c6598ee2e4c6eeac370914426f60b6247d2c"} Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.109470 4766 scope.go:117] "RemoveContainer" containerID="ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a" Oct 02 10:51:49 crc kubenswrapper[4766]: E1002 10:51:49.109657 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.119964 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.129978 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.136781 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.144772 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.156104 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceac
count\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.166133 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.175791 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.188012 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.199451 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.209939 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.220066 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.220877 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.223551 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.235287 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32f
a41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"lo
g-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.299072 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.299116 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.588481 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-w4c82"] Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.588954 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-w4c82" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.591932 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.592214 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.593832 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.594037 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.601757 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.601943 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:51:49 crc kubenswrapper[4766]: E1002 10:51:49.601974 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:51:51.60193657 +0000 UTC m=+26.544807644 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.602033 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.602105 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:49 crc kubenswrapper[4766]: E1002 10:51:49.602119 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:51:49 crc kubenswrapper[4766]: E1002 10:51:49.602147 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:51:49 crc kubenswrapper[4766]: E1002 10:51:49.602162 4766 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.602185 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:49 crc kubenswrapper[4766]: E1002 10:51:49.602224 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 10:51:51.602200689 +0000 UTC m=+26.545071813 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:49 crc kubenswrapper[4766]: E1002 10:51:49.602260 4766 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:51:49 crc kubenswrapper[4766]: E1002 10:51:49.602325 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:51:49 crc kubenswrapper[4766]: E1002 10:51:49.602375 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:51:49 crc kubenswrapper[4766]: E1002 10:51:49.602390 4766 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:49 crc kubenswrapper[4766]: E1002 10:51:49.602414 4766 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:51:49 crc kubenswrapper[4766]: E1002 10:51:49.602351 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:51:51.602329503 +0000 UTC m=+26.545200447 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:51:49 crc kubenswrapper[4766]: E1002 10:51:49.602495 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 10:51:51.602455737 +0000 UTC m=+26.545326731 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:49 crc kubenswrapper[4766]: E1002 10:51:49.602554 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-02 10:51:51.602539509 +0000 UTC m=+26.545410483 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.605021 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.619464 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.631867 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.639980 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.641592 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.643438 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.649437 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.655394 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.672557 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732574532
65a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.685922 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qu
ay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.697109 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.703011 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhcvn\" (UniqueName: \"kubernetes.io/projected/343775d7-8fd1-4ce4-b05c-ab27e9406a9c-kube-api-access-bhcvn\") pod \"node-ca-w4c82\" (UID: \"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\") " pod="openshift-image-registry/node-ca-w4c82" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.703115 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/343775d7-8fd1-4ce4-b05c-ab27e9406a9c-host\") pod \"node-ca-w4c82\" (UID: \"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\") " pod="openshift-image-registry/node-ca-w4c82" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.703141 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/343775d7-8fd1-4ce4-b05c-ab27e9406a9c-serviceca\") pod \"node-ca-w4c82\" (UID: \"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\") " pod="openshift-image-registry/node-ca-w4c82" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.713726 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.734649 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.749322 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.760502 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.768770 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.784872 4766 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.799033 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.803775 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhcvn\" (UniqueName: \"kubernetes.io/projected/343775d7-8fd1-4ce4-b05c-ab27e9406a9c-kube-api-access-bhcvn\") pod \"node-ca-w4c82\" (UID: \"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\") " pod="openshift-image-registry/node-ca-w4c82" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.803879 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/343775d7-8fd1-4ce4-b05c-ab27e9406a9c-host\") pod \"node-ca-w4c82\" (UID: \"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\") " pod="openshift-image-registry/node-ca-w4c82" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.803981 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/343775d7-8fd1-4ce4-b05c-ab27e9406a9c-serviceca\") pod \"node-ca-w4c82\" (UID: \"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\") " pod="openshift-image-registry/node-ca-w4c82" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.804935 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/343775d7-8fd1-4ce4-b05c-ab27e9406a9c-host\") pod \"node-ca-w4c82\" (UID: \"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\") " pod="openshift-image-registry/node-ca-w4c82" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.805118 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/343775d7-8fd1-4ce4-b05c-ab27e9406a9c-serviceca\") pod \"node-ca-w4c82\" (UID: \"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\") " pod="openshift-image-registry/node-ca-w4c82" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.809623 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.824658 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhcvn\" (UniqueName: \"kubernetes.io/projected/343775d7-8fd1-4ce4-b05c-ab27e9406a9c-kube-api-access-bhcvn\") pod \"node-ca-w4c82\" (UID: \"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\") " pod="openshift-image-registry/node-ca-w4c82" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.824646 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.867245 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\
\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.881027 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.881041 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:51:49 crc kubenswrapper[4766]: E1002 10:51:49.881283 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:51:49 crc kubenswrapper[4766]: E1002 10:51:49.881350 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.882433 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.885759 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.886504 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.887184 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.887781 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.888363 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.888924 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.889605 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.890163 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.892153 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.892748 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.893693 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.894332 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.894835 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.895724 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.896246 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.897291 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.898067 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.898482 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.898486 4766 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.901432 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.902141 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.903572 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.904227 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.904328 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-w4c82" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.904789 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.905868 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.906274 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.907363 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.908042 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.908919 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.909480 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.910356 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.910862 4766 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.910963 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.913175 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.913579 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.914160 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.915144 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: W1002 10:51:49.918422 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod343775d7_8fd1_4ce4_b05c_ab27e9406a9c.slice/crio-e05d515d660aca544749275e348c3749f1b6717a0c171f48fd34bc31422358bf WatchSource:0}: Error finding container e05d515d660aca544749275e348c3749f1b6717a0c171f48fd34bc31422358bf: Status 404 returned error can't find 
the container with id e05d515d660aca544749275e348c3749f1b6717a0c171f48fd34bc31422358bf Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.920231 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.920925 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.921896 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.922561 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.923596 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.924045 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.924831 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.926064 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.927035 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.927495 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.928444 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.928580 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.929560 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.931202 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.931743 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.932255 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.933346 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.934128 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.935461 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.936018 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.943320 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.953272 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.961882 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.972887 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:49 crc kubenswrapper[4766]: I1002 10:51:49.982547 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.119958 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05"} Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.120014 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0"} Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.122442 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b"} Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.122489 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65"} Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.124676 4766 generic.go:334] "Generic (PLEG): container finished" podID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerID="897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d" exitCode=0 Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.124799 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerDied","Data":"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d"} Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.127328 4766 generic.go:334] "Generic (PLEG): container finished" podID="b897c4d4-6c9c-4d4b-a684-d66c59d00190" containerID="4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8" exitCode=0 Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.127604 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" event={"ID":"b897c4d4-6c9c-4d4b-a684-d66c59d00190","Type":"ContainerDied","Data":"4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8"} Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.131357 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7"} Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.132896 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.136251 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2jxdg" event={"ID":"a6aa81c2-8c87-43df-badb-7b9dbef84ccf","Type":"ContainerStarted","Data":"da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071"} Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.141082 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5vgtz" event={"ID":"317f8a30-cdef-4f82-832c-4f3bc2674379","Type":"ContainerStarted","Data":"374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620"} Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.144534 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.147435 4766 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.148543 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w4c82" event={"ID":"343775d7-8fd1-4ce4-b05c-ab27e9406a9c","Type":"ContainerStarted","Data":"e05d515d660aca544749275e348c3749f1b6717a0c171f48fd34bc31422358bf"} Oct 02 10:51:50 crc kubenswrapper[4766]: E1002 10:51:50.153595 4766 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.156658 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.167368 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.184159 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.199391 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qu
ay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.211257 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.222133 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.235771 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceac
count\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.247162 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.259868 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.268584 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.278391 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.289369 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.299826 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.309104 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.316177 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.324431 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.338930 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688d
f312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.352104 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.362582 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.374863 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.386831 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.400028 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.414894 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.426526 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.440041 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/v
ar/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.474963 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:50Z 
is after 2025-08-24T17:21:41Z" Oct 02 10:51:50 crc kubenswrapper[4766]: I1002 10:51:50.880289 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:51:50 crc kubenswrapper[4766]: E1002 10:51:50.880869 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.155698 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerStarted","Data":"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7"} Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.155744 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerStarted","Data":"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f"} Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.155755 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerStarted","Data":"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a"} Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.155765 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerStarted","Data":"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3"} Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.155777 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerStarted","Data":"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b"} Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.159841 4766 generic.go:334] "Generic (PLEG): container finished" podID="b897c4d4-6c9c-4d4b-a684-d66c59d00190" containerID="f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420" exitCode=0 Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.159937 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" event={"ID":"b897c4d4-6c9c-4d4b-a684-d66c59d00190","Type":"ContainerDied","Data":"f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420"} Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.163057 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w4c82" event={"ID":"343775d7-8fd1-4ce4-b05c-ab27e9406a9c","Type":"ContainerStarted","Data":"cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02"} Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.173655 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.183972 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.196131 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.219698 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z 
is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.243953 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.258879 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.273899 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.288890 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-
02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.309216 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.324001 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.340053 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.352246 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.366498 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39e
cfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.380042 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.394289 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.406122 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.422953 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.443130 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.458829 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.469171 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.480464 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.502255 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.517449 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.528892 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.540737 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.550346 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.561613 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.588360 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.623226 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.623353 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:51:51 crc kubenswrapper[4766]: E1002 10:51:51.623380 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:51:55.623355616 +0000 UTC m=+30.566226570 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.623414 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.623445 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.623481 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:51 crc kubenswrapper[4766]: E1002 10:51:51.623487 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:51:51 crc kubenswrapper[4766]: E1002 10:51:51.623506 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:51:51 crc kubenswrapper[4766]: E1002 10:51:51.623539 4766 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:51 crc kubenswrapper[4766]: E1002 10:51:51.623578 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 10:51:55.623568312 +0000 UTC m=+30.566439256 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:51 crc kubenswrapper[4766]: E1002 10:51:51.623625 4766 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:51:51 crc kubenswrapper[4766]: E1002 10:51:51.623659 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:51:55.623649815 +0000 UTC m=+30.566520759 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:51:51 crc kubenswrapper[4766]: E1002 10:51:51.623700 4766 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:51:51 crc kubenswrapper[4766]: E1002 10:51:51.623725 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:51:55.623717217 +0000 UTC m=+30.566588161 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:51:51 crc kubenswrapper[4766]: E1002 10:51:51.623731 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:51:51 crc kubenswrapper[4766]: E1002 10:51:51.623771 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:51:51 crc kubenswrapper[4766]: E1002 10:51:51.623787 4766 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:51 crc kubenswrapper[4766]: E1002 10:51:51.623864 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-02 10:51:55.623839301 +0000 UTC m=+30.566710345 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.880624 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:51 crc kubenswrapper[4766]: I1002 10:51:51.880624 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:51:51 crc kubenswrapper[4766]: E1002 10:51:51.880858 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:51:51 crc kubenswrapper[4766]: E1002 10:51:51.881184 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.169530 4766 generic.go:334] "Generic (PLEG): container finished" podID="b897c4d4-6c9c-4d4b-a684-d66c59d00190" containerID="1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f" exitCode=0 Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.169637 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" event={"ID":"b897c4d4-6c9c-4d4b-a684-d66c59d00190","Type":"ContainerDied","Data":"1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f"} Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.175563 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerStarted","Data":"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7"} Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.176774 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd"} Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.185939 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.198268 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.212242 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.227239 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.239146 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.251555 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/v
ar/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.270212 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z 
is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.284313 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.296376 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.308228 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.324242 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.337287 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.353030 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.362842 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.377257 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 
2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.387847 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.399256 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.415266 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.429092 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.443957 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba
8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.454387 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.468010 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.506210 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.547558 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.587055 4766 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.629450 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.675922 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.709896 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:52Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:52 crc kubenswrapper[4766]: I1002 10:51:52.880604 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:51:52 crc kubenswrapper[4766]: E1002 10:51:52.880746 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.182710 4766 generic.go:334] "Generic (PLEG): container finished" podID="b897c4d4-6c9c-4d4b-a684-d66c59d00190" containerID="bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875" exitCode=0 Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.182778 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" event={"ID":"b897c4d4-6c9c-4d4b-a684-d66c59d00190","Type":"ContainerDied","Data":"bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875"} Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.200333 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.213499 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.226681 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.238594 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.248659 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.260088 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.278402 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.289772 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.302072 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.311058 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.322575 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.338030 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba3
40183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.350626 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.360279 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.536254 4766 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.538298 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.538515 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.538547 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.538556 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.538655 4766 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.539012 4766 scope.go:117] "RemoveContainer" containerID="ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a" Oct 02 10:51:53 crc kubenswrapper[4766]: E1002 10:51:53.539158 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.544681 4766 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.544959 4766 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.545827 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.545861 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.545872 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.545887 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.545897 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:53Z","lastTransitionTime":"2025-10-02T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:53 crc kubenswrapper[4766]: E1002 10:51:53.562588 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 
2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.566028 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.566060 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.566071 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.566087 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.566098 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:53Z","lastTransitionTime":"2025-10-02T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:53 crc kubenswrapper[4766]: E1002 10:51:53.576722 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 
2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.580460 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.580526 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.580540 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.580564 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.580577 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:53Z","lastTransitionTime":"2025-10-02T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:53 crc kubenswrapper[4766]: E1002 10:51:53.595214 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...image list, nodeInfo, and runtimeHandlers identical to the payload shown in the first attempt above, elided... ]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 
2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.598744 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.598778 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.598787 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.598804 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.598814 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:53Z","lastTransitionTime":"2025-10-02T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:53 crc kubenswrapper[4766]: E1002 10:51:53.611774 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...image list, nodeInfo, and runtimeHandlers identical to the payload shown in the first attempt above, elided... ]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 
2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.618994 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.619295 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.619406 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.619489 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.619646 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:53Z","lastTransitionTime":"2025-10-02T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:53 crc kubenswrapper[4766]: E1002 10:51:53.631869 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...image list, nodeInfo, and runtimeHandlers identical to the payload shown in the first attempt above, elided... ]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:53Z is after 
2025-08-24T17:21:41Z" Oct 02 10:51:53 crc kubenswrapper[4766]: E1002 10:51:53.632020 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.633855 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.633910 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.633923 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.633946 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.633960 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:53Z","lastTransitionTime":"2025-10-02T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.736691 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.736738 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.736751 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.736769 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.736783 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:53Z","lastTransitionTime":"2025-10-02T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.839497 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.839561 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.839571 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.839586 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.839597 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:53Z","lastTransitionTime":"2025-10-02T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.881089 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.881089 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:53 crc kubenswrapper[4766]: E1002 10:51:53.881233 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:51:53 crc kubenswrapper[4766]: E1002 10:51:53.881282 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.942552 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.942604 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.942615 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.942632 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:53 crc kubenswrapper[4766]: I1002 10:51:53.942643 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:53Z","lastTransitionTime":"2025-10-02T10:51:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.044774 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.044812 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.044823 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.044841 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.044854 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:54Z","lastTransitionTime":"2025-10-02T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.147754 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.147787 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.147797 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.147813 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.147823 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:54Z","lastTransitionTime":"2025-10-02T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.188811 4766 generic.go:334] "Generic (PLEG): container finished" podID="b897c4d4-6c9c-4d4b-a684-d66c59d00190" containerID="8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7" exitCode=0 Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.188882 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" event={"ID":"b897c4d4-6c9c-4d4b-a684-d66c59d00190","Type":"ContainerDied","Data":"8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7"} Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.194192 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerStarted","Data":"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69"} Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.205997 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.218391 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.231188 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.245152 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.249890 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.249933 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.249948 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.249968 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.249980 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:54Z","lastTransitionTime":"2025-10-02T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.257152 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.277067 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:54Z 
is after 2025-08-24T17:21:41Z" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.290793 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.301981 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.310903 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.327643 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.341158 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.353183 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.357039 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.357079 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.357090 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.357106 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.357120 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:54Z","lastTransitionTime":"2025-10-02T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.364890 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.377255 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:54Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.459663 4766 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.459702 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.459712 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.459732 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.459743 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:54Z","lastTransitionTime":"2025-10-02T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.562355 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.562399 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.562412 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.562430 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.562449 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:54Z","lastTransitionTime":"2025-10-02T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.665237 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.665276 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.665287 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.665306 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.665317 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:54Z","lastTransitionTime":"2025-10-02T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.767544 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.767592 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.767609 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.767623 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.767632 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:54Z","lastTransitionTime":"2025-10-02T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.870960 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.871025 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.871038 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.871062 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.871076 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:54Z","lastTransitionTime":"2025-10-02T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.881233 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:51:54 crc kubenswrapper[4766]: E1002 10:51:54.881358 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.973836 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.973894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.973904 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.973923 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:54 crc kubenswrapper[4766]: I1002 10:51:54.973937 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:54Z","lastTransitionTime":"2025-10-02T10:51:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.077138 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.077194 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.077210 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.077230 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.077244 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:55Z","lastTransitionTime":"2025-10-02T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.179526 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.179571 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.179583 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.179600 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.179611 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:55Z","lastTransitionTime":"2025-10-02T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.202730 4766 generic.go:334] "Generic (PLEG): container finished" podID="b897c4d4-6c9c-4d4b-a684-d66c59d00190" containerID="fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6" exitCode=0 Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.202800 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" event={"ID":"b897c4d4-6c9c-4d4b-a684-d66c59d00190","Type":"ContainerDied","Data":"fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6"} Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.215177 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.228073 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.246143 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.263995 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.276735 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.281666 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.281715 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.281730 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.281748 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.281759 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:55Z","lastTransitionTime":"2025-10-02T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.291244 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.307077 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"
name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-1
0-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.319722 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.330852 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.340195 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.351534 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.362106 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.374226 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.384102 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.384154 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.384214 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.384241 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.384279 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:55Z","lastTransitionTime":"2025-10-02T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.389229 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.487147 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.487193 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.487203 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.487222 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.487233 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:55Z","lastTransitionTime":"2025-10-02T10:51:55Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.589261 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.589295 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.589304 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.589339 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.589350 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:55Z","lastTransitionTime":"2025-10-02T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.662852 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.662994 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.663029 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:51:55 crc kubenswrapper[4766]: E1002 10:51:55.663104 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:52:03.663073296 +0000 UTC m=+38.605944240 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:51:55 crc kubenswrapper[4766]: E1002 10:51:55.663158 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:51:55 crc kubenswrapper[4766]: E1002 10:51:55.663183 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:51:55 crc kubenswrapper[4766]: E1002 10:51:55.663198 4766 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:55 crc kubenswrapper[4766]: E1002 10:51:55.663243 4766 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:51:55 crc kubenswrapper[4766]: E1002 10:51:55.663250 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 10:52:03.663233141 +0000 UTC m=+38.606104085 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.663180 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:55 crc kubenswrapper[4766]: E1002 10:51:55.663311 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.663330 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:55 crc kubenswrapper[4766]: E1002 10:51:55.663351 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:51:55 crc kubenswrapper[4766]: E1002 10:51:55.663365 4766 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:55 crc kubenswrapper[4766]: E1002 10:51:55.663318 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:52:03.663306813 +0000 UTC m=+38.606177757 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:51:55 crc kubenswrapper[4766]: E1002 10:51:55.663445 4766 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:51:55 crc kubenswrapper[4766]: E1002 10:51:55.663448 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 10:52:03.663430627 +0000 UTC m=+38.606301571 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:51:55 crc kubenswrapper[4766]: E1002 10:51:55.663473 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:52:03.663466388 +0000 UTC m=+38.606337322 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.692097 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.692145 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.692157 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.692175 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.692187 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:55Z","lastTransitionTime":"2025-10-02T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.795596 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.795629 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.795638 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.795664 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.795674 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:55Z","lastTransitionTime":"2025-10-02T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.880745 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.880832 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:51:55 crc kubenswrapper[4766]: E1002 10:51:55.880924 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:51:55 crc kubenswrapper[4766]: E1002 10:51:55.880984 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.893821 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd
791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.898485 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.898552 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.898564 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 
10:51:55.898580 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.898593 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:55Z","lastTransitionTime":"2025-10-02T10:51:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.905798 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.919132 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.929773 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.942817 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.956112 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.974227 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:55 crc kubenswrapper[4766]: I1002 10:51:55.998556 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.000527 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.000602 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.000615 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.000632 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.000642 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:56Z","lastTransitionTime":"2025-10-02T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.014562 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.027847 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.039191 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.052496 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.068538 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a168
8df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"
/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.082129 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.103728 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.103778 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.103791 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.103822 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.103836 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:56Z","lastTransitionTime":"2025-10-02T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.205938 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.205973 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.205987 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.206006 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.206019 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:56Z","lastTransitionTime":"2025-10-02T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.309352 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.309406 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.309416 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.309433 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.309443 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:56Z","lastTransitionTime":"2025-10-02T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.412297 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.412724 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.412736 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.412752 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.412763 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:56Z","lastTransitionTime":"2025-10-02T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.515106 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.515152 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.515162 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.515177 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.515191 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:56Z","lastTransitionTime":"2025-10-02T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.617381 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.617427 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.617436 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.617451 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.617464 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:56Z","lastTransitionTime":"2025-10-02T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.720817 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.720879 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.720892 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.720915 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.720931 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:56Z","lastTransitionTime":"2025-10-02T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.823342 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.823389 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.823402 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.823424 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.823438 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:56Z","lastTransitionTime":"2025-10-02T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.880740 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:51:56 crc kubenswrapper[4766]: E1002 10:51:56.880889 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.926624 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.926779 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.926842 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.926901 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:56 crc kubenswrapper[4766]: I1002 10:51:56.926961 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:56Z","lastTransitionTime":"2025-10-02T10:51:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.030265 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.030306 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.030318 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.030336 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.030353 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:57Z","lastTransitionTime":"2025-10-02T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.133460 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.133494 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.133519 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.133537 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.133548 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:57Z","lastTransitionTime":"2025-10-02T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.212925 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerStarted","Data":"5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1"} Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.213534 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.213555 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.213564 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.217310 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" event={"ID":"b897c4d4-6c9c-4d4b-a684-d66c59d00190","Type":"ContainerStarted","Data":"13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495"} Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.225696 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 
2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.235919 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.235960 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.235973 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.235991 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.236005 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:57Z","lastTransitionTime":"2025-10-02T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.237157 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 
10:51:57.242752 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.242970 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.249642 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.265823 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e076972e7aeb724f9783ef9714a6b12bf4bfca6
c0d50ba852df12d362786df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.279142 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.288909 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.298515 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.312451 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a168
8df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"
/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.327276 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.338749 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.340815 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.340896 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.340912 4766 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.340938 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.340951 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:57Z","lastTransitionTime":"2025-10-02T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.352310 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.361391 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.374921 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.388884 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.403463 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.420457 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.434012 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.444091 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.444140 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.444152 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.444167 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.444181 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:57Z","lastTransitionTime":"2025-10-02T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.450172 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.472994 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\
\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\
\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is 
after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.487767 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.500811 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.511621 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.524725 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.540880 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.546279 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.546681 4766 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.546773 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.546863 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.546953 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:57Z","lastTransitionTime":"2025-10-02T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.555800 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.566969 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.581116 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.593357 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:51:57Z is after 2025-08-24T17:21:41Z" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.649667 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.649730 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.649741 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.649764 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.649780 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:57Z","lastTransitionTime":"2025-10-02T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.752811 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.752868 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.752885 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.752905 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.752918 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:57Z","lastTransitionTime":"2025-10-02T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.856567 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.856633 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.856649 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.856681 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.856696 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:57Z","lastTransitionTime":"2025-10-02T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.880440 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.880474 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:57 crc kubenswrapper[4766]: E1002 10:51:57.880634 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:51:57 crc kubenswrapper[4766]: E1002 10:51:57.880765 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.973330 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.973367 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.973378 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.973396 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:57 crc kubenswrapper[4766]: I1002 10:51:57.973409 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:57Z","lastTransitionTime":"2025-10-02T10:51:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.075725 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.075795 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.075807 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.075826 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.075841 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:58Z","lastTransitionTime":"2025-10-02T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.178890 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.178937 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.178946 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.178963 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.178973 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:58Z","lastTransitionTime":"2025-10-02T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.280983 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.281060 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.281074 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.281094 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.281106 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:58Z","lastTransitionTime":"2025-10-02T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.387664 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.387957 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.387973 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.387995 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.388007 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:58Z","lastTransitionTime":"2025-10-02T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.490188 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.490239 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.490251 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.490267 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.490279 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:58Z","lastTransitionTime":"2025-10-02T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.593568 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.593610 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.593618 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.593634 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.593645 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:58Z","lastTransitionTime":"2025-10-02T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.696214 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.696929 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.696963 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.696986 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.697001 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:58Z","lastTransitionTime":"2025-10-02T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.800060 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.800113 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.800126 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.800143 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.800155 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:58Z","lastTransitionTime":"2025-10-02T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.881300 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:51:58 crc kubenswrapper[4766]: E1002 10:51:58.881468 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.902849 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.902906 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.902918 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.902937 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:58 crc kubenswrapper[4766]: I1002 10:51:58.902950 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:58Z","lastTransitionTime":"2025-10-02T10:51:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.004711 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.004746 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.004754 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.004770 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.004780 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:59Z","lastTransitionTime":"2025-10-02T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.107792 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.107840 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.107852 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.107869 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.107880 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:59Z","lastTransitionTime":"2025-10-02T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.210530 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.210572 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.210584 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.210600 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.210610 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:59Z","lastTransitionTime":"2025-10-02T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.313070 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.313100 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.313109 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.313122 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.313130 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:59Z","lastTransitionTime":"2025-10-02T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.415230 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.415268 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.415288 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.415306 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.415318 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:59Z","lastTransitionTime":"2025-10-02T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.518110 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.518153 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.518163 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.518179 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.518189 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:59Z","lastTransitionTime":"2025-10-02T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.621230 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.621277 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.621290 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.621319 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.621338 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:59Z","lastTransitionTime":"2025-10-02T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.723794 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.723987 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.724039 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.724062 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.724075 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:59Z","lastTransitionTime":"2025-10-02T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.826860 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.826913 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.826925 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.826941 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.826951 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:59Z","lastTransitionTime":"2025-10-02T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.881284 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.881369 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:51:59 crc kubenswrapper[4766]: E1002 10:51:59.881592 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:51:59 crc kubenswrapper[4766]: E1002 10:51:59.881600 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.929198 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.929241 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.929251 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.929269 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:51:59 crc kubenswrapper[4766]: I1002 10:51:59.929279 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:51:59Z","lastTransitionTime":"2025-10-02T10:51:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.031636 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.031683 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.031695 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.031712 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.031724 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:00Z","lastTransitionTime":"2025-10-02T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.080113 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n"] Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.080742 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.082660 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.082984 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.093236 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.105484 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.118083 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.130776 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.133938 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.133969 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.133979 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.133993 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.134004 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:00Z","lastTransitionTime":"2025-10-02T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.142373 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.155224 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.171730 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39410
0a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.184431 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":
\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.198404 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.209288 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.209613 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c99ede6-74b7-406b-8195-c9364efc146f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cll7n\" (UID: \"8c99ede6-74b7-406b-8195-c9364efc146f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.209695 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c99ede6-74b7-406b-8195-c9364efc146f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cll7n\" (UID: \"8c99ede6-74b7-406b-8195-c9364efc146f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.209725 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-r4jpv\" (UniqueName: \"kubernetes.io/projected/8c99ede6-74b7-406b-8195-c9364efc146f-kube-api-access-r4jpv\") pod \"ovnkube-control-plane-749d76644c-cll7n\" (UID: \"8c99ede6-74b7-406b-8195-c9364efc146f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.209762 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c99ede6-74b7-406b-8195-c9364efc146f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cll7n\" (UID: \"8c99ede6-74b7-406b-8195-c9364efc146f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.219081 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"ro
otfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.235091 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cb
da612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1f
e9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceac
count\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.236480 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.236552 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.236565 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.236584 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.236597 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:00Z","lastTransitionTime":"2025-10-02T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.247157 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.260675 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.270276 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:00Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.311401 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c99ede6-74b7-406b-8195-c9364efc146f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cll7n\" (UID: \"8c99ede6-74b7-406b-8195-c9364efc146f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.311459 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4jpv\" (UniqueName: \"kubernetes.io/projected/8c99ede6-74b7-406b-8195-c9364efc146f-kube-api-access-r4jpv\") pod \"ovnkube-control-plane-749d76644c-cll7n\" (UID: \"8c99ede6-74b7-406b-8195-c9364efc146f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.311500 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c99ede6-74b7-406b-8195-c9364efc146f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cll7n\" (UID: \"8c99ede6-74b7-406b-8195-c9364efc146f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.311556 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c99ede6-74b7-406b-8195-c9364efc146f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cll7n\" (UID: \"8c99ede6-74b7-406b-8195-c9364efc146f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.312206 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8c99ede6-74b7-406b-8195-c9364efc146f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cll7n\" (UID: \"8c99ede6-74b7-406b-8195-c9364efc146f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.312401 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8c99ede6-74b7-406b-8195-c9364efc146f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cll7n\" (UID: \"8c99ede6-74b7-406b-8195-c9364efc146f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.317005 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8c99ede6-74b7-406b-8195-c9364efc146f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cll7n\" (UID: \"8c99ede6-74b7-406b-8195-c9364efc146f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.329318 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4jpv\" (UniqueName: \"kubernetes.io/projected/8c99ede6-74b7-406b-8195-c9364efc146f-kube-api-access-r4jpv\") pod \"ovnkube-control-plane-749d76644c-cll7n\" (UID: \"8c99ede6-74b7-406b-8195-c9364efc146f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.340361 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.340405 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.340416 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.340433 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.340444 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:00Z","lastTransitionTime":"2025-10-02T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.394133 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" Oct 02 10:52:00 crc kubenswrapper[4766]: W1002 10:52:00.410715 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c99ede6_74b7_406b_8195_c9364efc146f.slice/crio-b34debb48ce7fb785204795d934d752de46a954870646a66133a944e6a3bb683 WatchSource:0}: Error finding container b34debb48ce7fb785204795d934d752de46a954870646a66133a944e6a3bb683: Status 404 returned error can't find the container with id b34debb48ce7fb785204795d934d752de46a954870646a66133a944e6a3bb683 Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.443278 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.443366 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.443384 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.443432 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.443447 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:00Z","lastTransitionTime":"2025-10-02T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.545823 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.545862 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.545873 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.545896 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.545907 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:00Z","lastTransitionTime":"2025-10-02T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.648844 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.648879 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.648889 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.648906 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.648917 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:00Z","lastTransitionTime":"2025-10-02T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.750908 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.750949 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.750962 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.750981 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.750995 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:00Z","lastTransitionTime":"2025-10-02T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.854012 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.854056 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.854066 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.854082 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.854091 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:00Z","lastTransitionTime":"2025-10-02T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.884856 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:00 crc kubenswrapper[4766]: E1002 10:52:00.884993 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.956521 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.956565 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.956576 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.956594 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:00 crc kubenswrapper[4766]: I1002 10:52:00.956606 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:00Z","lastTransitionTime":"2025-10-02T10:52:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.059709 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.059758 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.059772 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.059794 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.059809 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:01Z","lastTransitionTime":"2025-10-02T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.162398 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.162444 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.162456 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.162470 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.162480 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:01Z","lastTransitionTime":"2025-10-02T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.168213 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-klg2z"] Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.168857 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:01 crc kubenswrapper[4766]: E1002 10:52:01.168947 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.187030 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257
453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\
\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.199875 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.211181 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.221366 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.231483 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" event={"ID":"8c99ede6-74b7-406b-8195-c9364efc146f","Type":"ContainerStarted","Data":"b34debb48ce7fb785204795d934d752de46a954870646a66133a944e6a3bb683"} Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.233202 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.242701 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:01 crc 
kubenswrapper[4766]: I1002 10:52:01.255752 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.264726 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.264772 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.264784 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.264800 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.264810 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:01Z","lastTransitionTime":"2025-10-02T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.270033 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.280340 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.291991 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.307318 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.321539 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs\") pod 
\"network-metrics-daemon-klg2z\" (UID: \"6d68573a-5250-4407-8631-2199a3de7e9e\") " pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.321652 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2xzh\" (UniqueName: \"kubernetes.io/projected/6d68573a-5250-4407-8631-2199a3de7e9e-kube-api-access-w2xzh\") pod \"network-metrics-daemon-klg2z\" (UID: \"6d68573a-5250-4407-8631-2199a3de7e9e\") " pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.321933 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\
\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.333173 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.345074 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.355657 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.366966 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.367003 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.367014 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.367029 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.367039 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:01Z","lastTransitionTime":"2025-10-02T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.369812 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-10-02T10:52:01Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.422280 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2xzh\" (UniqueName: \"kubernetes.io/projected/6d68573a-5250-4407-8631-2199a3de7e9e-kube-api-access-w2xzh\") pod \"network-metrics-daemon-klg2z\" (UID: \"6d68573a-5250-4407-8631-2199a3de7e9e\") " pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.422365 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs\") pod \"network-metrics-daemon-klg2z\" (UID: \"6d68573a-5250-4407-8631-2199a3de7e9e\") " pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:01 crc kubenswrapper[4766]: E1002 10:52:01.422490 4766 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:52:01 crc kubenswrapper[4766]: E1002 10:52:01.422567 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs podName:6d68573a-5250-4407-8631-2199a3de7e9e nodeName:}" failed. No retries permitted until 2025-10-02 10:52:01.922549964 +0000 UTC m=+36.865420908 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs") pod "network-metrics-daemon-klg2z" (UID: "6d68573a-5250-4407-8631-2199a3de7e9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.437496 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2xzh\" (UniqueName: \"kubernetes.io/projected/6d68573a-5250-4407-8631-2199a3de7e9e-kube-api-access-w2xzh\") pod \"network-metrics-daemon-klg2z\" (UID: \"6d68573a-5250-4407-8631-2199a3de7e9e\") " pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.468900 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.468955 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.468971 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.468991 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.469003 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:01Z","lastTransitionTime":"2025-10-02T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.571830 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.571873 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.571882 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.571925 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.571939 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:01Z","lastTransitionTime":"2025-10-02T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.675105 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.675145 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.675154 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.675169 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.675180 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:01Z","lastTransitionTime":"2025-10-02T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.777591 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.777627 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.777636 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.777651 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.777663 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:01Z","lastTransitionTime":"2025-10-02T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.879826 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.879874 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.879887 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.879905 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.879918 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:01Z","lastTransitionTime":"2025-10-02T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.880284 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.880282 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:01 crc kubenswrapper[4766]: E1002 10:52:01.880423 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:01 crc kubenswrapper[4766]: E1002 10:52:01.880540 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.927868 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs\") pod \"network-metrics-daemon-klg2z\" (UID: \"6d68573a-5250-4407-8631-2199a3de7e9e\") " pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:01 crc kubenswrapper[4766]: E1002 10:52:01.928155 4766 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:52:01 crc kubenswrapper[4766]: E1002 10:52:01.928261 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs podName:6d68573a-5250-4407-8631-2199a3de7e9e nodeName:}" failed. 
No retries permitted until 2025-10-02 10:52:02.928238022 +0000 UTC m=+37.871109036 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs") pod "network-metrics-daemon-klg2z" (UID: "6d68573a-5250-4407-8631-2199a3de7e9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.983163 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.983538 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.983562 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.983579 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:01 crc kubenswrapper[4766]: I1002 10:52:01.983589 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:01Z","lastTransitionTime":"2025-10-02T10:52:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.086726 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.086771 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.086787 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.086806 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.086818 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:02Z","lastTransitionTime":"2025-10-02T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
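The MountVolume.SetUp failures above report the secret as "not registered", which means the kubelet's local object cache has not synced it for this pod; that is distinct from the secret being absent from the API server. One way to tell the two apart is to query the API directly. A minimal client-go sketch, assuming a working kubeconfig at a hypothetical path:

    package main

    import (
        "context"
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Hypothetical kubeconfig location; on a CRC node the admin kubeconfig may live elsewhere.
        cfg, err := clientcmd.BuildConfigFromFlags("", "/tmp/kubeconfig")
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        // If this returns NotFound the secret is genuinely missing; if it succeeds, the
        // "not registered" errors above point at the kubelet's cache lagging, not the API.
        s, err := cs.CoreV1().Secrets("openshift-multus").Get(context.TODO(),
            "metrics-daemon-secret", metav1.GetOptions{})
        if err != nil {
            fmt.Println("lookup failed:", err)
            return
        }
        fmt.Printf("secret %s/%s exists with %d keys\n", s.Namespace, s.Name, len(s.Data))
    }
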
Has your network provider started?"} Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.189365 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.189414 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.189428 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.189446 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.189464 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:02Z","lastTransitionTime":"2025-10-02T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.235416 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" event={"ID":"8c99ede6-74b7-406b-8195-c9364efc146f","Type":"ContainerStarted","Data":"56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da"} Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.291815 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.291871 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.291884 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.291905 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.291918 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:02Z","lastTransitionTime":"2025-10-02T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.394417 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.394452 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.394463 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.394480 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.394491 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:02Z","lastTransitionTime":"2025-10-02T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.497036 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.497076 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.497084 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.497099 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.497107 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:02Z","lastTransitionTime":"2025-10-02T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.600176 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.600249 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.600284 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.600312 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.600327 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:02Z","lastTransitionTime":"2025-10-02T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
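Every NetworkPluginNotReady record in this stretch reduces to the same probe: the container runtime looks for a network configuration under /etc/kubernetes/cni/net.d/ and finds none, so NetworkReady stays false and the node stays NotReady until ovnkube writes its config there. A minimal Go sketch of that directory check (illustrative only, not the CRI-O implementation):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        // The directory named in the NetworkPluginNotReady message above.
        dir := "/etc/kubernetes/cni/net.d"
        entries, err := os.ReadDir(dir)
        if err != nil {
            fmt.Println("cannot read CNI conf dir:", err)
            return
        }
        found := false
        for _, e := range entries {
            // CNI loaders accept .conf, .conflist, and legacy .json files.
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                fmt.Println("CNI config present:", e.Name())
                found = true
            }
        }
        if !found {
            // This is the state the kubelet keeps reporting above.
            fmt.Println("no CNI configuration file in", dir)
        }
    }
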
Has your network provider started?"} Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.703607 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.703637 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.703645 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.703658 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.703667 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:02Z","lastTransitionTime":"2025-10-02T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.806720 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.806754 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.806766 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.806783 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.806795 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:02Z","lastTransitionTime":"2025-10-02T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.880818 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:02 crc kubenswrapper[4766]: E1002 10:52:02.880972 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.881379 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:02 crc kubenswrapper[4766]: E1002 10:52:02.881439 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.909416 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.909446 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.909457 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.909473 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.909482 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:02Z","lastTransitionTime":"2025-10-02T10:52:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:02 crc kubenswrapper[4766]: I1002 10:52:02.940292 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs\") pod \"network-metrics-daemon-klg2z\" (UID: \"6d68573a-5250-4407-8631-2199a3de7e9e\") " pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:02 crc kubenswrapper[4766]: E1002 10:52:02.940453 4766 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:52:02 crc kubenswrapper[4766]: E1002 10:52:02.940540 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs podName:6d68573a-5250-4407-8631-2199a3de7e9e nodeName:}" failed. No retries permitted until 2025-10-02 10:52:04.940519543 +0000 UTC m=+39.883390487 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs") pod "network-metrics-daemon-klg2z" (UID: "6d68573a-5250-4407-8631-2199a3de7e9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.011849 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.012224 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.012304 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.012450 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.012545 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:03Z","lastTransitionTime":"2025-10-02T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.115431 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.115472 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.115481 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.115511 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.115526 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:03Z","lastTransitionTime":"2025-10-02T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
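Note the retry cadence across the three mount failures for this volume: durationBeforeRetry grows 500ms, then 1s, then 2s, i.e. a doubling backoff. A small sketch of that pattern; the cap below is illustrative, as the kubelet's actual limit is not visible in this log:

    package main

    import (
        "fmt"
        "time"
    )

    // nextBackoff doubles the wait, as seen in the retries above (500ms -> 1s -> 2s),
    // up to an illustrative ceiling.
    func nextBackoff(current, max time.Duration) time.Duration {
        next := current * 2
        if next > max {
            return max
        }
        return next
    }

    func main() {
        d := 500 * time.Millisecond
        for i := 1; i <= 5; i++ {
            fmt.Printf("retry %d: waiting %s\n", i, d)
            d = nextBackoff(d, 2*time.Minute)
        }
    }
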
Has your network provider started?"} Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.221840 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.221904 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.221917 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.221938 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.221955 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:03Z","lastTransitionTime":"2025-10-02T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.240355 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovnkube-controller/0.log" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.243450 4766 generic.go:334] "Generic (PLEG): container finished" podID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerID="5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1" exitCode=1 Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.243543 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerDied","Data":"5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1"} Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.244353 4766 scope.go:117] "RemoveContainer" containerID="5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.245314 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" event={"ID":"8c99ede6-74b7-406b-8195-c9364efc146f","Type":"ContainerStarted","Data":"8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727"} Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.260390 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.274465 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.286006 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.299612 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
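The patch bodies in these "Failed to update status" records are JSON that has been quoted twice by the logger, which is why every key arrives wrapped in \\\" sequences. To read one, unquote it once and re-indent. A short sketch using a tiny excerpt of the escaped text (the real payloads run to several kilobytes):

    package main

    import (
        "bytes"
        "encoding/json"
        "fmt"
        "strconv"
    )

    func main() {
        // A small excerpt of the escaped patch as it appears in the log line above.
        escaped := `"{\"metadata\":{\"uid\":\"3b6479f0-333b-4a96-9adf-2099afdc2447\"}}"`
        unquoted, err := strconv.Unquote(escaped)
        if err != nil {
            panic(err)
        }
        var pretty bytes.Buffer
        if err := json.Indent(&pretty, []byte(unquoted), "", "  "); err != nil {
            panic(err)
        }
        fmt.Println(pretty.String())
    }
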
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.312667 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.324085 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.324142 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.324157 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.324182 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.324198 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:03Z","lastTransitionTime":"2025-10-02T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
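Since every one of these patch attempts dies at Post "https://127.0.0.1:9743/pod?timeout=10s", the certificate that endpoint actually serves can be confirmed from the node itself. A sketch that dials the port and prints the peer certificate's validity window; InsecureSkipVerify is acceptable here only because the connection is inspected, never trusted:

    package main

    import (
        "crypto/tls"
        "fmt"
    )

    func main() {
        // The webhook endpoint from the failed Post above.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            fmt.Println("dial failed:", err)
            return
        }
        defer conn.Close()
        // Expect NotAfter = 2025-08-24T17:21:41Z if the expired cert is still in place.
        for _, cert := range conn.ConnectionState().PeerCertificates {
            fmt.Printf("subject=%s notBefore=%s notAfter=%s\n",
                cert.Subject, cert.NotBefore.UTC(), cert.NotAfter.UTC())
        }
    }
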
Has your network provider started?"} Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.332453 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:02Z\\\",\\\"message\\\":\\\"ending *v1.Pod event handler 6 for removal\\\\nI1002 10:52:01.937601 6107 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 10:52:01.937591 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:52:01.937621 6107 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 10:52:01.937634 6107 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:52:01.937638 6107 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 10:52:01.937659 6107 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:52:01.937664 6107 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 10:52:01.937697 6107 factory.go:656] Stopping watch factory\\\\nI1002 10:52:01.937701 6107 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:52:01.937709 6107 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 10:52:01.937634 6107 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 10:52:01.937729 6107 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:52:01.937737 6107 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.348237 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.362140 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.374850 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.388239 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.401625 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc 
kubenswrapper[4766]: I1002 10:52:03.414674 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.428590 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.428626 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.428639 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.428656 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.428668 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:03Z","lastTransitionTime":"2025-10-02T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.429045 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.440617 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.455339 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.475120 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.489602 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.500983 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.512457 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.523409 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.531179 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.531253 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.531265 4766 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.531293 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.531306 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:03Z","lastTransitionTime":"2025-10-02T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.535561 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.546867 4766 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 
10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.558428 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.568522 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.581554 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.592922 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.602724 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.618485 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\
\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.633392 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.633694 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.633784 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.633861 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.633941 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:03Z","lastTransitionTime":"2025-10-02T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.637600 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:02Z\\\",\\\"message\\\":\\\"ending *v1.Pod event handler 6 for removal\\\\nI1002 10:52:01.937601 6107 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 10:52:01.937591 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:52:01.937621 6107 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 10:52:01.937634 6107 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:52:01.937638 6107 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 10:52:01.937659 6107 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:52:01.937664 6107 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 10:52:01.937697 6107 factory.go:656] Stopping watch factory\\\\nI1002 10:52:01.937701 6107 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:52:01.937709 6107 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 10:52:01.937634 6107 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 10:52:01.937729 6107 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:52:01.937737 6107 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.654013 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.667195 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.670777 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.670818 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.670832 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.670849 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.670860 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:03Z","lastTransitionTime":"2025-10-02T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.679529 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.682336 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d
3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.686451 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.686513 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.686525 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.686541 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.686553 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:03Z","lastTransitionTime":"2025-10-02T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.698234 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.701924 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.701969 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.701981 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.702002 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.702013 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:03Z","lastTransitionTime":"2025-10-02T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.713987 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.719043 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.719082 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.719091 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.719107 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.719117 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:03Z","lastTransitionTime":"2025-10-02T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.731087 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.735044 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.735080 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.735092 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.735111 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.735123 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:03Z","lastTransitionTime":"2025-10-02T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.747046 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.747174 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.747210 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.747251 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.747288 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.747357 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.747390 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.747404 4766 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.747412 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.747431 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.747433 4766 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.747443 4766 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.747456 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 10:52:19.747438253 +0000 UTC m=+54.690309197 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.747607 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:52:19.747584888 +0000 UTC m=+54.690455882 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.747624 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-10-02 10:52:19.747616169 +0000 UTC m=+54.690487213 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.747838 4766 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.747874 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:52:19.747853946 +0000 UTC m=+54.690724890 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.747905 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:52:19.747897898 +0000 UTC m=+54.690768842 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.747991 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:03Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.748150 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.752416 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.752453 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.752465 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.752481 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.752491 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:03Z","lastTransitionTime":"2025-10-02T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.854640 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.854672 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.854681 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.854696 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.854705 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:03Z","lastTransitionTime":"2025-10-02T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.880595 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.880760 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.880900 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:03 crc kubenswrapper[4766]: E1002 10:52:03.881067 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.957453 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.957481 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.957493 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.957523 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:03 crc kubenswrapper[4766]: I1002 10:52:03.957536 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:03Z","lastTransitionTime":"2025-10-02T10:52:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.060163 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.060224 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.060238 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.060281 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.060295 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:04Z","lastTransitionTime":"2025-10-02T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.162557 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.162608 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.162621 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.162637 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.162648 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:04Z","lastTransitionTime":"2025-10-02T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.250903 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovnkube-controller/0.log" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.253813 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerStarted","Data":"1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033"} Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.264548 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.264592 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.264603 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.264618 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.264630 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:04Z","lastTransitionTime":"2025-10-02T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.266921 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.279454 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.290253 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.300725 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.312388 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.326095 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.338964 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.348686 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.361327 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.366604 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.366644 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.366657 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.366677 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.366688 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:04Z","lastTransitionTime":"2025-10-02T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.407015 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.419097 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:04Z is after 2025-08-24T17:21:41Z" Oct 02 
10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.437489 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535
e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:02Z\\\",\\\"message\\\":\\\"ending *v1.Pod event handler 6 for removal\\\\nI1002 10:52:01.937601 6107 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 10:52:01.937591 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:52:01.937621 6107 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 10:52:01.937634 6107 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:52:01.937638 6107 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 10:52:01.937659 6107 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:52:01.937664 6107 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 10:52:01.937697 6107 factory.go:656] Stopping watch factory\\\\nI1002 10:52:01.937701 6107 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:52:01.937709 6107 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 10:52:01.937634 6107 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 10:52:01.937729 6107 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:52:01.937737 6107 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.450540 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.464340 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.468721 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.468761 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.468775 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.468815 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.468827 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:04Z","lastTransitionTime":"2025-10-02T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.476313 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.500751 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:04Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.570806 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.570833 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.570841 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.570854 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.570863 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:04Z","lastTransitionTime":"2025-10-02T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.673063 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.673097 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.673110 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.673127 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.673138 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:04Z","lastTransitionTime":"2025-10-02T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.775877 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.775921 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.775930 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.775945 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.775956 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:04Z","lastTransitionTime":"2025-10-02T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.878938 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.878970 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.878981 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.878997 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.879008 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:04Z","lastTransitionTime":"2025-10-02T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.880767 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:04 crc kubenswrapper[4766]: E1002 10:52:04.880866 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.880935 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:04 crc kubenswrapper[4766]: E1002 10:52:04.881025 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.958310 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs\") pod \"network-metrics-daemon-klg2z\" (UID: \"6d68573a-5250-4407-8631-2199a3de7e9e\") " pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:04 crc kubenswrapper[4766]: E1002 10:52:04.958570 4766 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:52:04 crc kubenswrapper[4766]: E1002 10:52:04.958664 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs podName:6d68573a-5250-4407-8631-2199a3de7e9e nodeName:}" failed. No retries permitted until 2025-10-02 10:52:08.958644619 +0000 UTC m=+43.901515553 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs") pod "network-metrics-daemon-klg2z" (UID: "6d68573a-5250-4407-8631-2199a3de7e9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.981478 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.981542 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.981555 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.981574 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:04 crc kubenswrapper[4766]: I1002 10:52:04.981588 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:04Z","lastTransitionTime":"2025-10-02T10:52:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.087851 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.087896 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.087911 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.087928 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.087937 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:05Z","lastTransitionTime":"2025-10-02T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.190357 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.190681 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.190692 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.190707 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.190717 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:05Z","lastTransitionTime":"2025-10-02T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.258678 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovnkube-controller/1.log" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.259208 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovnkube-controller/0.log" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.262284 4766 generic.go:334] "Generic (PLEG): container finished" podID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerID="1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033" exitCode=1 Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.262330 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerDied","Data":"1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033"} Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.262365 4766 scope.go:117] "RemoveContainer" containerID="5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.263266 4766 scope.go:117] "RemoveContainer" containerID="1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033" Oct 02 10:52:05 crc kubenswrapper[4766]: E1002 10:52:05.263441 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.279731 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z"
Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.293271 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.293312 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.293322 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.293338 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.293311 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.293368 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:05Z","lastTransitionTime":"2025-10-02T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.304430 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.316266 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.333707 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39410
0a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:02Z\\\",\\\"message\\\":\\\"ending *v1.Pod event handler 6 for removal\\\\nI1002 10:52:01.937601 6107 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 10:52:01.937591 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:52:01.937621 6107 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 10:52:01.937634 6107 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:52:01.937638 6107 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 10:52:01.937659 6107 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:52:01.937664 6107 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 10:52:01.937697 6107 factory.go:656] Stopping watch factory\\\\nI1002 10:52:01.937701 6107 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:52:01.937709 6107 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 10:52:01.937634 6107 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 10:52:01.937729 6107 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:52:01.937737 6107 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:04Z\\\",\\\"message\\\":\\\"e-controller-manager/kube-controller-manager-crc\\\\nI1002 10:52:04.374806 6315 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1002 10:52:04.374852 6315 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374859 6315 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374865 6315 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 10:52:04.374871 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 10:52:04.374877 6315 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374888 6315 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1002 10:52:04.374912 6315 kube.go:317] Updating pod openshift-multus/network-metrics-daemon-klg2z\\\\nF1002 10:52:04.374964 6315 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z"
Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.345107 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z"
Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.358287 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.369229 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.380400 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.393053 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.395612 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.395668 4766 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.395685 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.395705 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.395722 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:05Z","lastTransitionTime":"2025-10-02T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.406908 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.421803 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.432299 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.443236 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.454546 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.465053 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.498030 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.498074 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.498086 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.498103 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.498141 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:05Z","lastTransitionTime":"2025-10-02T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.600768 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.600804 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.600817 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.600834 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.600846 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:05Z","lastTransitionTime":"2025-10-02T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.702832 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.702880 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.702894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.702915 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.702928 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:05Z","lastTransitionTime":"2025-10-02T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.806663 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.806710 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.806722 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.806739 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.806750 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:05Z","lastTransitionTime":"2025-10-02T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.880307 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:05 crc kubenswrapper[4766]: E1002 10:52:05.880485 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.880589 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:05 crc kubenswrapper[4766]: E1002 10:52:05.881037 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.881298 4766 scope.go:117] "RemoveContainer" containerID="ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.896656 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.907894 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.909682 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.909722 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.909735 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.909751 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.909765 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:05Z","lastTransitionTime":"2025-10-02T10:52:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.922638 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.935743 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.949036 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.960307 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.972842 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"pod
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:05 crc kubenswrapper[4766]: I1002 10:52:05.995009 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:02Z\\\",\\\"message\\\":\\\"ending *v1.Pod event handler 6 for removal\\\\nI1002 10:52:01.937601 6107 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 10:52:01.937591 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:52:01.937621 6107 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 10:52:01.937634 6107 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:52:01.937638 6107 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 10:52:01.937659 6107 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:52:01.937664 6107 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 10:52:01.937697 6107 factory.go:656] Stopping watch factory\\\\nI1002 10:52:01.937701 6107 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:52:01.937709 6107 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 10:52:01.937634 6107 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 10:52:01.937729 6107 handler.go:208] Removed *v1.Node event 
handler 2\\\\nI1002 10:52:01.937737 6107 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:04Z\\\",\\\"message\\\":\\\"e-controller-manager/kube-controller-manager-crc\\\\nI1002 10:52:04.374806 6315 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1002 10:52:04.374852 6315 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374859 6315 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374865 6315 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 10:52:04.374871 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 10:52:04.374877 6315 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374888 6315 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1002 10:52:04.374912 6315 kube.go:317] Updating pod openshift-multus/network-metrics-daemon-klg2z\\\\nF1002 10:52:04.374964 6315 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.008311 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.014919 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.014948 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.014956 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.014972 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.014982 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:06Z","lastTransitionTime":"2025-10-02T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.022675 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.036433 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.049665 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.060332 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.071421 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.083809 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.095752 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.117153 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.117179 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.117188 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.117201 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.117212 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:06Z","lastTransitionTime":"2025-10-02T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.219458 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.219492 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.219517 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.219529 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.219538 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:06Z","lastTransitionTime":"2025-10-02T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.267701 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovnkube-controller/1.log" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.272008 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.273784 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8"} Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.274612 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.287458 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.304475 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39410
0a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:02Z\\\",\\\"message\\\":\\\"ending *v1.Pod event handler 6 for removal\\\\nI1002 10:52:01.937601 6107 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 10:52:01.937591 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:52:01.937621 6107 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 10:52:01.937634 6107 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:52:01.937638 6107 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 10:52:01.937659 6107 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:52:01.937664 6107 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 10:52:01.937697 6107 factory.go:656] Stopping watch factory\\\\nI1002 10:52:01.937701 6107 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:52:01.937709 6107 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 10:52:01.937634 6107 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 10:52:01.937729 6107 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:52:01.937737 6107 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:04Z\\\",\\\"message\\\":\\\"e-controller-manager/kube-controller-manager-crc\\\\nI1002 10:52:04.374806 6315 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1002 10:52:04.374852 6315 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374859 6315 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374865 6315 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 10:52:04.374871 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 10:52:04.374877 6315 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374888 6315 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1002 10:52:04.374912 6315 kube.go:317] Updating pod openshift-multus/network-metrics-daemon-klg2z\\\\nF1002 10:52:04.374964 6315 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.317128 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.322031 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.322061 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.322071 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.322089 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.322101 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:06Z","lastTransitionTime":"2025-10-02T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.332757 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.345145 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.359116 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.369791 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.382431 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.401288 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.412760 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.424253 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.424298 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.424311 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.424328 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.424337 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:06Z","lastTransitionTime":"2025-10-02T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.427600 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.441012 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.452712 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.464131 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.476108 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.488061 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.526937 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.526972 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.526983 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.527000 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.527011 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:06Z","lastTransitionTime":"2025-10-02T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.630987 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.631033 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.631048 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.631067 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.631080 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:06Z","lastTransitionTime":"2025-10-02T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.733096 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.733148 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.733160 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.733178 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.733193 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:06Z","lastTransitionTime":"2025-10-02T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.835789 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.835886 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.835898 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.835918 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.835930 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:06Z","lastTransitionTime":"2025-10-02T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.881108 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.881186 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:06 crc kubenswrapper[4766]: E1002 10:52:06.881244 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:06 crc kubenswrapper[4766]: E1002 10:52:06.881346 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.937687 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.937735 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.937746 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.937764 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:06 crc kubenswrapper[4766]: I1002 10:52:06.937777 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:06Z","lastTransitionTime":"2025-10-02T10:52:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.041198 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.041229 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.041238 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.041254 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.041266 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:07Z","lastTransitionTime":"2025-10-02T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.143483 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.143538 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.143551 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.143571 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.143580 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:07Z","lastTransitionTime":"2025-10-02T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.245642 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.245677 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.245686 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.245701 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.245715 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:07Z","lastTransitionTime":"2025-10-02T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.348356 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.348417 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.348432 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.348459 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.348478 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:07Z","lastTransitionTime":"2025-10-02T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.451050 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.451081 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.451089 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.451107 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.451121 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:07Z","lastTransitionTime":"2025-10-02T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.553818 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.553864 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.553872 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.553888 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.553898 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:07Z","lastTransitionTime":"2025-10-02T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.656517 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.656588 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.656599 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.656622 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.656631 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:07Z","lastTransitionTime":"2025-10-02T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.759409 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.759449 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.759459 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.759481 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.759520 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:07Z","lastTransitionTime":"2025-10-02T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.861532 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.861576 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.861585 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.861600 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.861609 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:07Z","lastTransitionTime":"2025-10-02T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.880773 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.880879 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:07 crc kubenswrapper[4766]: E1002 10:52:07.880991 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:07 crc kubenswrapper[4766]: E1002 10:52:07.881070 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.964199 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.964242 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.964255 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.964276 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:07 crc kubenswrapper[4766]: I1002 10:52:07.964285 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:07Z","lastTransitionTime":"2025-10-02T10:52:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.067126 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.067205 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.067232 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.067262 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.067273 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:08Z","lastTransitionTime":"2025-10-02T10:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.169680 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.169723 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.169732 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.169749 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.169760 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:08Z","lastTransitionTime":"2025-10-02T10:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.272005 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.272043 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.272052 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.272065 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.272073 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:08Z","lastTransitionTime":"2025-10-02T10:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.379076 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.379152 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.379167 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.379188 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.379201 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:08Z","lastTransitionTime":"2025-10-02T10:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.481899 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.481937 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.481947 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.481966 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.481987 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:08Z","lastTransitionTime":"2025-10-02T10:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.584868 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.584923 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.584937 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.584961 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.584976 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:08Z","lastTransitionTime":"2025-10-02T10:52:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.880591 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 10:52:08 crc kubenswrapper[4766]: E1002 10:52:08.880761 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.881010 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:52:08 crc kubenswrapper[4766]: E1002 10:52:08.881204 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e"
Oct 02 10:52:08 crc kubenswrapper[4766]: I1002 10:52:08.995541 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs\") pod \"network-metrics-daemon-klg2z\" (UID: \"6d68573a-5250-4407-8631-2199a3de7e9e\") " pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:52:08 crc kubenswrapper[4766]: E1002 10:52:08.995695 4766 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 02 10:52:08 crc kubenswrapper[4766]: E1002 10:52:08.995788 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs podName:6d68573a-5250-4407-8631-2199a3de7e9e nodeName:}" failed. No retries permitted until 2025-10-02 10:52:16.995771977 +0000 UTC m=+51.938642931 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs") pod "network-metrics-daemon-klg2z" (UID: "6d68573a-5250-4407-8631-2199a3de7e9e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 02 10:52:09 crc kubenswrapper[4766]: I1002 10:52:09.880750 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 10:52:09 crc kubenswrapper[4766]: I1002 10:52:09.880756 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 10:52:09 crc kubenswrapper[4766]: E1002 10:52:09.881196 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 10:52:09 crc kubenswrapper[4766]: E1002 10:52:09.881419 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 10:52:10 crc kubenswrapper[4766]: I1002 10:52:10.880343 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 10:52:10 crc kubenswrapper[4766]: I1002 10:52:10.880344 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:52:10 crc kubenswrapper[4766]: E1002 10:52:10.880531 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 10:52:10 crc kubenswrapper[4766]: E1002 10:52:10.880585 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e"
Oct 02 10:52:11 crc kubenswrapper[4766]: I1002 10:52:11.880413 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 10:52:11 crc kubenswrapper[4766]: E1002 10:52:11.880582 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 10:52:11 crc kubenswrapper[4766]: I1002 10:52:11.880619 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 10:52:11 crc kubenswrapper[4766]: E1002 10:52:11.880697 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Has your network provider started?"} Oct 02 10:52:12 crc kubenswrapper[4766]: I1002 10:52:12.806553 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:12 crc kubenswrapper[4766]: I1002 10:52:12.806597 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:12 crc kubenswrapper[4766]: I1002 10:52:12.806606 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:12 crc kubenswrapper[4766]: I1002 10:52:12.806622 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:12 crc kubenswrapper[4766]: I1002 10:52:12.806634 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:12Z","lastTransitionTime":"2025-10-02T10:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:12 crc kubenswrapper[4766]: I1002 10:52:12.880813 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:12 crc kubenswrapper[4766]: I1002 10:52:12.880828 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:12 crc kubenswrapper[4766]: E1002 10:52:12.880966 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:12 crc kubenswrapper[4766]: E1002 10:52:12.881060 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:12 crc kubenswrapper[4766]: I1002 10:52:12.909490 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:12 crc kubenswrapper[4766]: I1002 10:52:12.909564 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:12 crc kubenswrapper[4766]: I1002 10:52:12.909576 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:12 crc kubenswrapper[4766]: I1002 10:52:12.909618 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:12 crc kubenswrapper[4766]: I1002 10:52:12.909633 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:12Z","lastTransitionTime":"2025-10-02T10:52:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:13 crc kubenswrapper[4766]: I1002 10:52:13.012183 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:13 crc kubenswrapper[4766]: I1002 10:52:13.012224 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:13 crc kubenswrapper[4766]: I1002 10:52:13.012234 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:13 crc kubenswrapper[4766]: I1002 10:52:13.012251 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:13 crc kubenswrapper[4766]: I1002 10:52:13.012262 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:13Z","lastTransitionTime":"2025-10-02T10:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
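Each "Error syncing pod" above has the same root cause the Ready condition reports: kubelet finds no CNI config, which on OpenShift the network provider (Multus/OVN-Kubernetes) writes once it starts. A minimal sketch, run on the node, that checks the directory named in the error (the accepted extensions are an assumption about the CNI runtime):

#!/usr/bin/env python3
# Minimal sketch: check whether kubelet's CNI config directory (path taken
# from the NetworkPluginNotReady error above) contains a network config yet.
import os
import sys

CNI_DIR = "/etc/kubernetes/cni/net.d"
EXTS = (".conf", ".conflist", ".json")  # assumed set of accepted config types

try:
    confs = sorted(f for f in os.listdir(CNI_DIR) if f.endswith(EXTS))
except FileNotFoundError:
    sys.exit(f"{CNI_DIR} does not exist")

if confs:
    print("CNI configuration present:", ", ".join(confs))
else:
    # Matches the kubelet error: no sandboxes can be created and the
    # node stays NotReady until the network provider writes its config.
    print(f"no CNI configuration file in {CNI_DIR}")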
Oct 02 10:52:13 crc kubenswrapper[4766]: I1002 10:52:13.880688 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:13 crc kubenswrapper[4766]: I1002 10:52:13.880775 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:13 crc kubenswrapper[4766]: E1002 10:52:13.880830 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:13 crc kubenswrapper[4766]: E1002 10:52:13.880915 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:13 crc kubenswrapper[4766]: I1002 10:52:13.936697 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:13 crc kubenswrapper[4766]: I1002 10:52:13.936737 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:13 crc kubenswrapper[4766]: I1002 10:52:13.936746 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:13 crc kubenswrapper[4766]: I1002 10:52:13.936762 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:13 crc kubenswrapper[4766]: I1002 10:52:13.936774 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:13Z","lastTransitionTime":"2025-10-02T10:52:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.039194 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.039240 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.039251 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.039269 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.039284 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:14Z","lastTransitionTime":"2025-10-02T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.073740 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.073788 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.073797 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.073810 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.073821 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:14Z","lastTransitionTime":"2025-10-02T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:14 crc kubenswrapper[4766]: E1002 10:52:14.087967 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.091367 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.091426 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.091440 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.091458 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.091469 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:14Z","lastTransitionTime":"2025-10-02T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:14 crc kubenswrapper[4766]: E1002 10:52:14.106380 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.109444 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.109464 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
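The status patches themselves are rejected a layer further up: the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 is serving a certificate that expired on 2025-08-24, so every patch attempt fails TLS verification the same way. A minimal sketch, run on the node, that reproduces the handshake failure against the URL from the error (the commented CA-bundle path is an assumption; with only system CAs the failure may report an unknown issuer rather than expiry):

#!/usr/bin/env python3
# Minimal sketch: attempt a verified TLS handshake against the webhook
# endpoint named in the kubelet error to confirm the certificate problem.
import socket
import ssl

HOST, PORT = "127.0.0.1", 9743  # from the failing Post URL above
ctx = ssl.create_default_context()
ctx.check_hostname = False  # the cert names the node identity, not the IP
# Assumed location of the cluster CA bundle; uncomment to verify against it:
# ctx.load_verify_locations("/etc/kubernetes/kubelet-ca.crt")

try:
    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock) as tls:
            print("handshake succeeded:", tls.version())
except ssl.SSLCertVerificationError as err:
    # Expect X509_V_ERR_CERT_HAS_EXPIRED, matching kubelet's "x509:
    # certificate has expired or is not yet valid".
    print(f"verification failed (code {err.verify_code}): {err.verify_message}")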
event="NodeHasNoDiskPressure" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.109474 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.109487 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.109496 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:14Z","lastTransitionTime":"2025-10-02T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:14 crc kubenswrapper[4766]: E1002 10:52:14.122364 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.125954 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.125996 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.126006 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.126021 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.126030 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:14Z","lastTransitionTime":"2025-10-02T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:14 crc kubenswrapper[4766]: E1002 10:52:14.137478 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.141517 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.141567 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.141581 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.141602 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.141620 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:14Z","lastTransitionTime":"2025-10-02T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:14 crc kubenswrapper[4766]: E1002 10:52:14.155037 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:14Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:14 crc kubenswrapper[4766]: E1002 10:52:14.155175 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.157200 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.157244 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.157257 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.157273 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.157286 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:14Z","lastTransitionTime":"2025-10-02T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.260462 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.260523 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.260533 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.260555 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.260564 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:14Z","lastTransitionTime":"2025-10-02T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.362668 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.362724 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.362738 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.362756 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.362769 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:14Z","lastTransitionTime":"2025-10-02T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.465561 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.465643 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.465660 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.465680 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.465695 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:14Z","lastTransitionTime":"2025-10-02T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.567624 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.567663 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.567673 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.567689 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.567698 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:14Z","lastTransitionTime":"2025-10-02T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.669530 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.669650 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.669663 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.669682 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.669694 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:14Z","lastTransitionTime":"2025-10-02T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.771870 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.771921 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.771934 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.771953 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.771967 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:14Z","lastTransitionTime":"2025-10-02T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.873813 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.873869 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.873897 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.873922 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.873938 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:14Z","lastTransitionTime":"2025-10-02T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.881138 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.881224 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:14 crc kubenswrapper[4766]: E1002 10:52:14.881265 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:14 crc kubenswrapper[4766]: E1002 10:52:14.881405 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.976130 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.976184 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.976195 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.976212 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:14 crc kubenswrapper[4766]: I1002 10:52:14.976223 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:14Z","lastTransitionTime":"2025-10-02T10:52:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.079077 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.079125 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.079136 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.079151 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.079162 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:15Z","lastTransitionTime":"2025-10-02T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.181752 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.181799 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.181811 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.181828 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.181841 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:15Z","lastTransitionTime":"2025-10-02T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.284625 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.284662 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.284671 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.284689 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.284700 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:15Z","lastTransitionTime":"2025-10-02T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.386909 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.386978 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.386996 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.387016 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.387029 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:15Z","lastTransitionTime":"2025-10-02T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.489379 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.489428 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.489440 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.489455 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.489468 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:15Z","lastTransitionTime":"2025-10-02T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.592285 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.592338 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.592349 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.592364 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.592374 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:15Z","lastTransitionTime":"2025-10-02T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.695127 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.695171 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.695183 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.695199 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.695213 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:15Z","lastTransitionTime":"2025-10-02T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.798177 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.798221 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.798233 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.798252 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.798263 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:15Z","lastTransitionTime":"2025-10-02T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.881160 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.881160 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:15 crc kubenswrapper[4766]: E1002 10:52:15.881298 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:15 crc kubenswrapper[4766]: E1002 10:52:15.881533 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.896737 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.900384 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.900424 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.900471 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.900489 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.900514 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:15Z","lastTransitionTime":"2025-10-02T10:52:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.911942 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:
51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.923039 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.938249 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.949896 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.959930 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.971579 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:15 crc kubenswrapper[4766]: I1002 10:52:15.990074 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.002786 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.002832 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.002841 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.002856 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.002869 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:16Z","lastTransitionTime":"2025-10-02T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.004582 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.015955 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.026342 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.036791 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.047005 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"pod
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.063164 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:02Z\\\",\\\"message\\\":\\\"ending *v1.Pod event handler 6 for removal\\\\nI1002 10:52:01.937601 6107 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 10:52:01.937591 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:52:01.937621 6107 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 10:52:01.937634 6107 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:52:01.937638 6107 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 10:52:01.937659 6107 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:52:01.937664 6107 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 10:52:01.937697 6107 factory.go:656] Stopping watch factory\\\\nI1002 10:52:01.937701 6107 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:52:01.937709 6107 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 10:52:01.937634 6107 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 10:52:01.937729 6107 handler.go:208] Removed *v1.Node event 
handler 2\\\\nI1002 10:52:01.937737 6107 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:04Z\\\",\\\"message\\\":\\\"e-controller-manager/kube-controller-manager-crc\\\\nI1002 10:52:04.374806 6315 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1002 10:52:04.374852 6315 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374859 6315 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374865 6315 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 10:52:04.374871 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 10:52:04.374877 6315 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374888 6315 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1002 10:52:04.374912 6315 kube.go:317] Updating pod openshift-multus/network-metrics-daemon-klg2z\\\\nF1002 10:52:04.374964 6315 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.075609 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.088071 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.105723 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.105771 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.105781 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.105892 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.105922 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:16Z","lastTransitionTime":"2025-10-02T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.208863 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.208915 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.208929 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.208950 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.208964 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:16Z","lastTransitionTime":"2025-10-02T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.310978 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.311263 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.311329 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.311397 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.311464 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:16Z","lastTransitionTime":"2025-10-02T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.414593 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.414840 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.414971 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.415067 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.415161 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:16Z","lastTransitionTime":"2025-10-02T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.425316 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.438286 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.449907 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.461070 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 
10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.472649 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.484721 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.501484 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39410
0a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:02Z\\\",\\\"message\\\":\\\"ending *v1.Pod event handler 6 for removal\\\\nI1002 10:52:01.937601 6107 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 10:52:01.937591 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:52:01.937621 6107 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 10:52:01.937634 6107 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:52:01.937638 6107 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 10:52:01.937659 6107 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:52:01.937664 6107 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 10:52:01.937697 6107 factory.go:656] Stopping watch factory\\\\nI1002 10:52:01.937701 6107 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:52:01.937709 6107 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 10:52:01.937634 6107 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 10:52:01.937729 6107 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:52:01.937737 6107 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:04Z\\\",\\\"message\\\":\\\"e-controller-manager/kube-controller-manager-crc\\\\nI1002 10:52:04.374806 6315 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1002 10:52:04.374852 6315 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374859 6315 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374865 6315 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 10:52:04.374871 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 10:52:04.374877 6315 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374888 6315 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1002 10:52:04.374912 6315 kube.go:317] Updating pod openshift-multus/network-metrics-daemon-klg2z\\\\nF1002 10:52:04.374964 6315 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.513138 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-cr
c-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 
10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.517317 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.517358 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.517367 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.517383 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.517399 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:16Z","lastTransitionTime":"2025-10-02T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.525165 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.535146 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.549702 4766 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed616
3a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\
":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.561136 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.573428 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.584167 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.593589 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.606795 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.617979 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.619302 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.619356 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.619370 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.619387 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.619399 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:16Z","lastTransitionTime":"2025-10-02T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.722088 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.722139 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.722152 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.722172 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.722186 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:16Z","lastTransitionTime":"2025-10-02T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.824563 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.824612 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.824626 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.824643 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.824655 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:16Z","lastTransitionTime":"2025-10-02T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.880272 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.880272 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:16 crc kubenswrapper[4766]: E1002 10:52:16.880395 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:16 crc kubenswrapper[4766]: E1002 10:52:16.880455 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.927239 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.927283 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.927301 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.927317 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:16 crc kubenswrapper[4766]: I1002 10:52:16.927330 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:16Z","lastTransitionTime":"2025-10-02T10:52:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.029311 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.029364 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.029373 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.029389 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.029398 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:17Z","lastTransitionTime":"2025-10-02T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.091475 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs\") pod \"network-metrics-daemon-klg2z\" (UID: \"6d68573a-5250-4407-8631-2199a3de7e9e\") " pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:17 crc kubenswrapper[4766]: E1002 10:52:17.091653 4766 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:52:17 crc kubenswrapper[4766]: E1002 10:52:17.091717 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs podName:6d68573a-5250-4407-8631-2199a3de7e9e nodeName:}" failed. No retries permitted until 2025-10-02 10:52:33.09170283 +0000 UTC m=+68.034573774 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs") pod "network-metrics-daemon-klg2z" (UID: "6d68573a-5250-4407-8631-2199a3de7e9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.131763 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.131829 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.131842 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.131865 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.131884 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:17Z","lastTransitionTime":"2025-10-02T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.234226 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.234269 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.234278 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.234295 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.234308 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:17Z","lastTransitionTime":"2025-10-02T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.337315 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.337356 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.337367 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.337384 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.337397 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:17Z","lastTransitionTime":"2025-10-02T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.440166 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.440211 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.440228 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.440250 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.440262 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:17Z","lastTransitionTime":"2025-10-02T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.544303 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.544362 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.544376 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.544409 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.544432 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:17Z","lastTransitionTime":"2025-10-02T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.646910 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.646946 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.646960 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.646985 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.646997 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:17Z","lastTransitionTime":"2025-10-02T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.750032 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.750094 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.750107 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.750127 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.750141 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:17Z","lastTransitionTime":"2025-10-02T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.853585 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.853648 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.853661 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.853681 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.853693 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:17Z","lastTransitionTime":"2025-10-02T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.881131 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.881177 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:17 crc kubenswrapper[4766]: E1002 10:52:17.881287 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:17 crc kubenswrapper[4766]: E1002 10:52:17.881406 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.957078 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.957122 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.957134 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.957152 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:17 crc kubenswrapper[4766]: I1002 10:52:17.957163 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:17Z","lastTransitionTime":"2025-10-02T10:52:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.059947 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.059989 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.060000 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.060014 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.060024 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:18Z","lastTransitionTime":"2025-10-02T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.162061 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.162110 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.162121 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.162140 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.162155 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:18Z","lastTransitionTime":"2025-10-02T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.264523 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.264579 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.264590 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.264607 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.264622 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:18Z","lastTransitionTime":"2025-10-02T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.367903 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.367955 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.367965 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.367981 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.367996 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:18Z","lastTransitionTime":"2025-10-02T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.469905 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.469946 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.469955 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.469973 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.469987 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:18Z","lastTransitionTime":"2025-10-02T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.572887 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.572948 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.572963 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.572986 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.572998 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:18Z","lastTransitionTime":"2025-10-02T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.675990 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.676037 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.676050 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.676068 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.676080 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:18Z","lastTransitionTime":"2025-10-02T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.778773 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.778810 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.778822 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.778840 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.778852 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:18Z","lastTransitionTime":"2025-10-02T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.880276 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:18 crc kubenswrapper[4766]: E1002 10:52:18.880394 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.880612 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:18 crc kubenswrapper[4766]: E1002 10:52:18.880684 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.882086 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.882114 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.882125 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.882142 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.882154 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:18Z","lastTransitionTime":"2025-10-02T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.984275 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.984327 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.984338 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.984353 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:18 crc kubenswrapper[4766]: I1002 10:52:18.984363 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:18Z","lastTransitionTime":"2025-10-02T10:52:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.087109 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.087157 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.087178 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.087198 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.087212 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:19Z","lastTransitionTime":"2025-10-02T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.189427 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.189464 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.189474 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.189490 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.189520 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:19Z","lastTransitionTime":"2025-10-02T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.292061 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.292107 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.292117 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.292133 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.292143 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:19Z","lastTransitionTime":"2025-10-02T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.394357 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.394404 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.394413 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.394429 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.394439 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:19Z","lastTransitionTime":"2025-10-02T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.496833 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.496869 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.496880 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.496896 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.496908 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:19Z","lastTransitionTime":"2025-10-02T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.599236 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.599302 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.599318 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.599337 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.599346 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:19Z","lastTransitionTime":"2025-10-02T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.701988 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.702046 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.702056 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.702070 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.702080 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:19Z","lastTransitionTime":"2025-10-02T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.804385 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.804434 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.804449 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.804469 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.804480 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:19Z","lastTransitionTime":"2025-10-02T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.818990 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.819153 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:19 crc kubenswrapper[4766]: E1002 10:52:19.819199 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-02 10:52:51.819175373 +0000 UTC m=+86.762046317 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.819256 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:19 crc kubenswrapper[4766]: E1002 10:52:19.819288 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.819300 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:19 crc kubenswrapper[4766]: E1002 10:52:19.819310 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:52:19 crc kubenswrapper[4766]: E1002 10:52:19.819327 4766 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.819347 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:19 crc kubenswrapper[4766]: E1002 10:52:19.819380 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 10:52:51.819364949 +0000 UTC m=+86.762235893 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:52:19 crc kubenswrapper[4766]: E1002 10:52:19.819456 4766 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:52:19 crc kubenswrapper[4766]: E1002 10:52:19.819490 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:52:51.819483053 +0000 UTC m=+86.762353997 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:52:19 crc kubenswrapper[4766]: E1002 10:52:19.819560 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:52:19 crc kubenswrapper[4766]: E1002 10:52:19.819604 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:52:19 crc kubenswrapper[4766]: E1002 10:52:19.819620 4766 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:52:19 crc kubenswrapper[4766]: E1002 10:52:19.819618 4766 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:52:19 crc kubenswrapper[4766]: E1002 10:52:19.819678 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 10:52:51.819661378 +0000 UTC m=+86.762532322 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:52:19 crc kubenswrapper[4766]: E1002 10:52:19.819807 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:52:51.819743501 +0000 UTC m=+86.762614635 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.843884 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.854715 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.865972 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"m
ultus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.880291 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.880478 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:19 crc kubenswrapper[4766]: E1002 10:52:19.880834 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:19 crc kubenswrapper[4766]: E1002 10:52:19.881329 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.881615 4766 scope.go:117] "RemoveContainer" containerID="1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.886960 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e076972e7aeb724f9783ef9714a6b12bf4bfca6c0d50ba852df12d362786df1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:02Z\\\",\\\"message\\\":\\\"ending *v1.Pod event handler 6 for removal\\\\nI1002 10:52:01.937601 6107 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1002 10:52:01.937591 6107 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1002 10:52:01.937621 6107 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1002 10:52:01.937634 6107 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1002 10:52:01.937638 6107 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1002 10:52:01.937659 6107 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1002 10:52:01.937664 6107 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1002 10:52:01.937697 6107 factory.go:656] Stopping watch factory\\\\nI1002 10:52:01.937701 6107 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1002 10:52:01.937709 6107 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1002 10:52:01.937634 6107 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1002 10:52:01.937720 6107 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1002 10:52:01.937729 6107 handler.go:208] Removed *v1.Node event handler 2\\\\nI1002 10:52:01.937737 6107 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:04Z\\\",\\\"message\\\":\\\"e-controller-manager/kube-controller-manager-crc\\\\nI1002 10:52:04.374806 6315 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1002 10:52:04.374852 6315 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374859 6315 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374865 6315 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 10:52:04.374871 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 10:52:04.374877 6315 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374888 6315 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1002 10:52:04.374912 6315 kube.go:317] Updating pod openshift-multus/network-metrics-daemon-klg2z\\\\nF1002 10:52:04.374964 6315 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.909272 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-cr
c-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 
10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.912277 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.912320 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.912333 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.912353 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.912367 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:19Z","lastTransitionTime":"2025-10-02T10:52:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.933266 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.950939 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.967533 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.980947 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:19 crc kubenswrapper[4766]: I1002 10:52:19.995285 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:19Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.015289 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.015327 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.015336 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.015351 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.015361 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:20Z","lastTransitionTime":"2025-10-02T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.016190 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.030018 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.041188 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.055607 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39e
cfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.069154 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.086318 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.103901 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.119097 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.119145 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.119158 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.119181 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.119192 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:20Z","lastTransitionTime":"2025-10-02T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.119990 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.143885 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.155808 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.172427 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.190842 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.206696 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.222404 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.222462 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.222472 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.222493 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.222534 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:20Z","lastTransitionTime":"2025-10-02T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.226737 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.242194 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"070e0a3b-5963-4adc-a4f5-020999886339\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79274b877cf8ff3109da1dc078ee5f0b19a541fb7026b110c1969e0c2d341cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae830f784230e44d237e8f6a6606c9969e54046b8010f6dcd0ca446ea264676f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f754efe5b2b5464759b533ec933dc5834a1be242592ca9375184f5fc24a72f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.258263 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.273100 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.291008 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.316636 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39410
0a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:04Z\\\",\\\"message\\\":\\\"e-controller-manager/kube-controller-manager-crc\\\\nI1002 10:52:04.374806 6315 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1002 10:52:04.374852 6315 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374859 6315 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374865 6315 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 10:52:04.374871 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 10:52:04.374877 6315 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374888 6315 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1002 10:52:04.374912 6315 kube.go:317] Updating pod openshift-multus/network-metrics-daemon-klg2z\\\\nF1002 10:52:04.374964 6315 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.321649 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovnkube-controller/1.log" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.326436 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.326497 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.326537 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.326560 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.326732 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:20Z","lastTransitionTime":"2025-10-02T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.328068 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerStarted","Data":"adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6"} Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.329808 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.340735 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.358100 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.374492 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.398875 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.414276 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.425227 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.428593 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.428632 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.428645 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.428664 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.428675 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:20Z","lastTransitionTime":"2025-10-02T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.438186 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.450885 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.461838 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.471517 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.485421 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"pod
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.504921 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:04Z\\\",\\\"message\\\":\\\"e-controller-manager/kube-controller-manager-crc\\\\nI1002 10:52:04.374806 6315 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1002 10:52:04.374852 6315 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374859 6315 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374865 6315 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 10:52:04.374871 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 10:52:04.374877 6315 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374888 6315 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1002 10:52:04.374912 6315 kube.go:317] Updating pod openshift-multus/network-metrics-daemon-klg2z\\\\nF1002 10:52:04.374964 6315 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: 
could not add Event Handler for anpInformer during admin network policy controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\
":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.523002 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.531367 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.531408 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.531419 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.531433 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.531442 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:20Z","lastTransitionTime":"2025-10-02T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.542747 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"070e0a3b-5963-4adc-a4f5-020999886339\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79274b877cf8ff3109da1dc078ee5f0b19a541fb7026b110c1969e0c2d341cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae830f784230e44d237e8f6a6606c9969e54046b8010f6dcd0ca446ea264676f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f754efe5b2b5464759b533ec933dc5834a1be242592ca9375184f5fc24a72f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.561223 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.575696 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.589645 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.600467 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.612018 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.625391 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.633392 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.633528 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.633612 4766 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.633694 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.633758 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:20Z","lastTransitionTime":"2025-10-02T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.635329 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.647062 4766 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa1
82aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.656603 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:20Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.736884 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.736925 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.736934 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.736951 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.736963 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:20Z","lastTransitionTime":"2025-10-02T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.839633 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.839667 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.839680 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.839698 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.839710 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:20Z","lastTransitionTime":"2025-10-02T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.880679 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.880790 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:20 crc kubenswrapper[4766]: E1002 10:52:20.880825 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:20 crc kubenswrapper[4766]: E1002 10:52:20.880917 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.942688 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.942727 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.942737 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.942751 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:20 crc kubenswrapper[4766]: I1002 10:52:20.942763 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:20Z","lastTransitionTime":"2025-10-02T10:52:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.045792 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.045838 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.045849 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.045865 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.045876 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:21Z","lastTransitionTime":"2025-10-02T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.148840 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.148893 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.148906 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.148927 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.148937 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:21Z","lastTransitionTime":"2025-10-02T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.251463 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.251527 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.251537 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.251553 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.251565 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:21Z","lastTransitionTime":"2025-10-02T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.338632 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovnkube-controller/2.log" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.339618 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovnkube-controller/1.log" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.343138 4766 generic.go:334] "Generic (PLEG): container finished" podID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerID="adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6" exitCode=1 Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.343190 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerDied","Data":"adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6"} Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.343249 4766 scope.go:117] "RemoveContainer" containerID="1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.344804 4766 scope.go:117] "RemoveContainer" containerID="adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6" Oct 02 10:52:21 crc kubenswrapper[4766]: E1002 10:52:21.345277 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.354244 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.354300 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.354315 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.354338 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.354355 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:21Z","lastTransitionTime":"2025-10-02T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.362184 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.381191 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"070e0a3b-5963-4adc-a4f5-020999886339\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79274b877cf8ff3109da1dc078ee5f0b19a541fb7026b110c1969e0c2d341cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae830f784230e44d237e8f6a6606c9969e54046b8010f6dcd0ca446ea264676f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f754efe5b2b5464759b533ec933dc5834a1be242592ca9375184f5fc24a72f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.400690 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.415196 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.432266 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.454084 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39410
0a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1bffb8950fb365fa5314840eba5e338f090da28cf436805f0d0b5502b88d4033\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:04Z\\\",\\\"message\\\":\\\"e-controller-manager/kube-controller-manager-crc\\\\nI1002 10:52:04.374806 6315 base_network_controller_pods.go:916] Annotation values: ip=[10.217.0.3/23] ; mac=0a:58:0a:d9:00:03 ; gw=[10.217.0.1]\\\\nI1002 10:52:04.374852 6315 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374859 6315 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374865 6315 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/iptables-alerter-4ln5h in node crc\\\\nI1002 10:52:04.374871 6315 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/iptables-alerter-4ln5h after 0 failed attempt(s)\\\\nI1002 10:52:04.374877 6315 default_network_controller.go:776] Recording success event on pod openshift-network-operator/iptables-alerter-4ln5h\\\\nI1002 10:52:04.374888 6315 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1002 10:52:04.374912 6315 kube.go:317] Updating pod openshift-multus/network-metrics-daemon-klg2z\\\\nF1002 10:52:04.374964 6315 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy 
controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:21Z\\\",\\\"message\\\":\\\"Config(nil)\\\\nI1002 10:52:20.754739 6530 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:52:20.754765 6530 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754774 6530 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754769 6530 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 
request\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.458558 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.458618 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.458641 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.458670 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.458691 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:21Z","lastTransitionTime":"2025-10-02T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.469075 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.485995 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.499554 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.514363 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.531553 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.545340 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.560909 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.561890 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.561930 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.561944 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.561966 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.561979 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:21Z","lastTransitionTime":"2025-10-02T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.576599 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.594689 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.610390 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.624330 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:21Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.664788 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.664830 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.664842 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.664858 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.664869 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:21Z","lastTransitionTime":"2025-10-02T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.767213 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.767260 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.767270 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.767283 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.767294 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:21Z","lastTransitionTime":"2025-10-02T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.869554 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.869603 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.869617 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.869635 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.869648 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:21Z","lastTransitionTime":"2025-10-02T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.880808 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.880848 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:21 crc kubenswrapper[4766]: E1002 10:52:21.881001 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:21 crc kubenswrapper[4766]: E1002 10:52:21.881227 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.971647 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.972265 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.972352 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.972438 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:21 crc kubenswrapper[4766]: I1002 10:52:21.972537 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:21Z","lastTransitionTime":"2025-10-02T10:52:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.075649 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.075697 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.075709 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.075732 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.075745 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:22Z","lastTransitionTime":"2025-10-02T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.178786 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.178841 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.178854 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.178885 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.178898 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:22Z","lastTransitionTime":"2025-10-02T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.281693 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.281738 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.281752 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.281768 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.281780 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:22Z","lastTransitionTime":"2025-10-02T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.347465 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovnkube-controller/2.log" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.351603 4766 scope.go:117] "RemoveContainer" containerID="adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6" Oct 02 10:52:22 crc kubenswrapper[4766]: E1002 10:52:22.351769 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.363208 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.376376 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.384199 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.384246 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.384255 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.384274 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.384285 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:22Z","lastTransitionTime":"2025-10-02T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.388714 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.402482 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.420323 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb9
2ec091796623d3934ad4aaa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:21Z\\\",\\\"message\\\":\\\"Config(nil)\\\\nI1002 10:52:20.754739 6530 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:52:20.754765 6530 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754774 6530 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754769 6530 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 request\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.433817 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.446792 4766 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"070e0a3b-5963-4adc-a4f5-020999886339\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79274b877cf8ff3109da1dc078ee5f0b19a541fb7026b110c1969e0c2d341cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae830f784230e44d237e8f6a6606c9969e54046b8010f6dcd0ca446ea264676f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f754efe5b2b5464759b533ec933dc5834a1be242592ca9375184f5fc24a72f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containe
rID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.463820 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.476938 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.486751 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.486795 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.486804 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.486820 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.486829 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:22Z","lastTransitionTime":"2025-10-02T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.492123 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee
99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\
\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.502952 4766 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.516656 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.529227 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.539457 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.551386 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.562191 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39e
cfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.570853 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:22Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.589578 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.589614 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.589625 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.589642 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.589654 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:22Z","lastTransitionTime":"2025-10-02T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.692409 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.692445 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.692454 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.692468 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.692477 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:22Z","lastTransitionTime":"2025-10-02T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.794866 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.794909 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.794922 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.794939 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.794951 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:22Z","lastTransitionTime":"2025-10-02T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.880485 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:22 crc kubenswrapper[4766]: E1002 10:52:22.880741 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.880485 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:22 crc kubenswrapper[4766]: E1002 10:52:22.880892 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.897652 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.897704 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.897715 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.897734 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:22 crc kubenswrapper[4766]: I1002 10:52:22.897744 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:22Z","lastTransitionTime":"2025-10-02T10:52:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.000258 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.000324 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.000336 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.000355 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.000368 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:23Z","lastTransitionTime":"2025-10-02T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.102841 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.102878 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.102891 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.102911 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.102924 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:23Z","lastTransitionTime":"2025-10-02T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.205690 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.205735 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.205744 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.205761 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.205772 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:23Z","lastTransitionTime":"2025-10-02T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.308496 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.308556 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.308571 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.308586 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.308595 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:23Z","lastTransitionTime":"2025-10-02T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.411709 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.411755 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.411767 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.411783 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.411796 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:23Z","lastTransitionTime":"2025-10-02T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.514288 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.514319 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.514327 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.514340 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.514349 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:23Z","lastTransitionTime":"2025-10-02T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.617868 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.617939 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.617952 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.617976 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.617989 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:23Z","lastTransitionTime":"2025-10-02T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.720543 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.720592 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.720610 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.720632 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.720644 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:23Z","lastTransitionTime":"2025-10-02T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.824288 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.824338 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.824349 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.824366 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.824378 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:23Z","lastTransitionTime":"2025-10-02T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.881098 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:23 crc kubenswrapper[4766]: E1002 10:52:23.881364 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.881436 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:23 crc kubenswrapper[4766]: E1002 10:52:23.881669 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.927190 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.927255 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.927276 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.927307 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:23 crc kubenswrapper[4766]: I1002 10:52:23.927330 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:23Z","lastTransitionTime":"2025-10-02T10:52:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.030198 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.030246 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.030259 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.030279 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.030297 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:24Z","lastTransitionTime":"2025-10-02T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.134752 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.134819 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.134838 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.134867 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.134891 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:24Z","lastTransitionTime":"2025-10-02T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.237972 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.238040 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.238059 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.238085 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.238104 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:24Z","lastTransitionTime":"2025-10-02T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.278178 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.278221 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.278231 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.278248 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.278260 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:24Z","lastTransitionTime":"2025-10-02T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:24 crc kubenswrapper[4766]: E1002 10:52:24.300017 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:24Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.304763 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.304798 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.304808 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.304824 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.304834 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:24Z","lastTransitionTime":"2025-10-02T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.327462 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.327628 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.327653 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.327686 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.327709 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:24Z","lastTransitionTime":"2025-10-02T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.344812 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.344863 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.344879 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.344901 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.344919 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:24Z","lastTransitionTime":"2025-10-02T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:24Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.374663 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.374704 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.374714 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.374730 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.374742 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:24Z","lastTransitionTime":"2025-10-02T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:24 crc kubenswrapper[4766]: E1002 10:52:24.389390 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:24Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:24 crc kubenswrapper[4766]: E1002 10:52:24.389653 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.391719 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.391764 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.391776 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.391797 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.391808 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:24Z","lastTransitionTime":"2025-10-02T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.494618 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.494667 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.494685 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.494705 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.494717 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:24Z","lastTransitionTime":"2025-10-02T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.597223 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.597270 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.597281 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.597298 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.597309 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:24Z","lastTransitionTime":"2025-10-02T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.706086 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.706135 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.706145 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.706161 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.706174 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:24Z","lastTransitionTime":"2025-10-02T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.808549 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.808595 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.808604 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.808621 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.808632 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:24Z","lastTransitionTime":"2025-10-02T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.881155 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.881213 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:24 crc kubenswrapper[4766]: E1002 10:52:24.881317 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:24 crc kubenswrapper[4766]: E1002 10:52:24.881465 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.911261 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.911319 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.911331 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.911348 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:24 crc kubenswrapper[4766]: I1002 10:52:24.911360 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:24Z","lastTransitionTime":"2025-10-02T10:52:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.014850 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.014920 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.014939 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.014969 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.014989 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:25Z","lastTransitionTime":"2025-10-02T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.118834 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.119024 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.119045 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.119073 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.119093 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:25Z","lastTransitionTime":"2025-10-02T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.222107 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.222160 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.222170 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.222188 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.222198 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:25Z","lastTransitionTime":"2025-10-02T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.325036 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.325092 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.325105 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.325128 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.325141 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:25Z","lastTransitionTime":"2025-10-02T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.428345 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.428415 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.428438 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.428464 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.428480 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:25Z","lastTransitionTime":"2025-10-02T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.532331 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.532410 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.532429 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.532461 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.532479 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:25Z","lastTransitionTime":"2025-10-02T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.636085 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.636142 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.636155 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.636175 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.636188 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:25Z","lastTransitionTime":"2025-10-02T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.739482 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.739572 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.739593 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.739614 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.739630 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:25Z","lastTransitionTime":"2025-10-02T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.841293 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.841363 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.841374 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.841391 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.841400 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:25Z","lastTransitionTime":"2025-10-02T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.880362 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.880443 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:25 crc kubenswrapper[4766]: E1002 10:52:25.880485 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:25 crc kubenswrapper[4766]: E1002 10:52:25.881142 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.894969 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:25Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.910042 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:25Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.924656 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:25Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.939971 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:25Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.944342 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.944397 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.944412 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.944437 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.944449 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:25Z","lastTransitionTime":"2025-10-02T10:52:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.953439 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\
\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:25Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.976323 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb9
2ec091796623d3934ad4aaa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:21Z\\\",\\\"message\\\":\\\"Config(nil)\\\\nI1002 10:52:20.754739 6530 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:52:20.754765 6530 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754774 6530 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754769 6530 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 request\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:25Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:25 crc kubenswrapper[4766]: I1002 10:52:25.991571 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:25Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.005334 4766 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"070e0a3b-5963-4adc-a4f5-020999886339\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79274b877cf8ff3109da1dc078ee5f0b19a541fb7026b110c1969e0c2d341cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae830f784230e44d237e8f6a6606c9969e54046b8010f6dcd0ca446ea264676f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f754efe5b2b5464759b533ec933dc5834a1be242592ca9375184f5fc24a72f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containe
rID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:26Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.020162 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:26Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.035523 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:26Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.046489 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.046540 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.046549 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.046563 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.046572 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:26Z","lastTransitionTime":"2025-10-02T10:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.050117 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:
51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:26Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.060329 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:26Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.072681 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:26Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.086067 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:26Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.096402 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:26Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.109628 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:26Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.120734 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:26Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.152364 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.152410 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.152420 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.152433 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.152444 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:26Z","lastTransitionTime":"2025-10-02T10:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.255567 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.255618 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.255628 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.255647 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.255659 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:26Z","lastTransitionTime":"2025-10-02T10:52:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.880867 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 10:52:26 crc kubenswrapper[4766]: E1002 10:52:26.880959 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 10:52:26 crc kubenswrapper[4766]: I1002 10:52:26.880867 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:52:26 crc kubenswrapper[4766]: E1002 10:52:26.881086 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e"
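
Every "Failed to update status for pod" entry above fails for the same reason: the kubelet's status patch is routed through the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z, well before the current time of 2025-10-02T10:52:26Z. Below is a minimal diagnostic sketch, not part of the log, that reproduces the kubelet's validity check from outside; it assumes Python with the third-party cryptography package, and the host and port are taken from the log.

    # Hypothetical check (not from the log): fetch the webhook's serving
    # certificate and compare its notAfter date with the current time,
    # mirroring the "x509: certificate has expired" failure above.
    import datetime
    import ssl

    from cryptography import x509  # assumed third-party dependency

    HOST, PORT = "127.0.0.1", 9743  # webhook endpoint seen in the log

    # get_server_certificate() does not verify the chain, so it still
    # returns the PEM even when the certificate is already expired.
    pem = ssl.get_server_certificate((HOST, PORT))
    cert = x509.load_pem_x509_certificate(pem.encode("ascii"))

    now = datetime.datetime.now(datetime.timezone.utc)
    not_after = cert.not_valid_after.replace(tzinfo=datetime.timezone.utc)
    if now > not_after:
        print(f"expired: notAfter={not_after.isoformat()} now={now.isoformat()}")
    else:
        print(f"valid until {not_after.isoformat()}")

Run against the endpoint in the log, this would report the certificate as expired, which is why every webhook POST is rejected before the status patch is ever admitted.
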
Oct 02 10:52:27 crc kubenswrapper[4766]: I1002 10:52:27.881106 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 10:52:27 crc kubenswrapper[4766]: I1002 10:52:27.881158 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 10:52:27 crc kubenswrapper[4766]: E1002 10:52:27.881262 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 10:52:27 crc kubenswrapper[4766]: E1002 10:52:27.881333 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 10:52:28 crc kubenswrapper[4766]: I1002 10:52:28.881161 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:52:28 crc kubenswrapper[4766]: I1002 10:52:28.881230 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 10:52:28 crc kubenswrapper[4766]: E1002 10:52:28.881308 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e"
Oct 02 10:52:28 crc kubenswrapper[4766]: E1002 10:52:28.881411 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 10:52:29 crc kubenswrapper[4766]: I1002 10:52:29.880489 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 10:52:29 crc kubenswrapper[4766]: I1002 10:52:29.880643 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 10:52:29 crc kubenswrapper[4766]: E1002 10:52:29.880668 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 10:52:29 crc kubenswrapper[4766]: E1002 10:52:29.880776 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.257960 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.257998 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.258007 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.258024 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.258033 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:30Z","lastTransitionTime":"2025-10-02T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.463072 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.463113 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.463122 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.463144 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.463155 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:30Z","lastTransitionTime":"2025-10-02T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.565698 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.565742 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.565753 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.565772 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.565787 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:30Z","lastTransitionTime":"2025-10-02T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.668128 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.668183 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.668198 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.668217 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.668227 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:30Z","lastTransitionTime":"2025-10-02T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.770662 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.770701 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.770720 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.770737 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.770748 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:30Z","lastTransitionTime":"2025-10-02T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.873329 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.873364 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.873372 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.873388 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.873397 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:30Z","lastTransitionTime":"2025-10-02T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.880943 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:30 crc kubenswrapper[4766]: E1002 10:52:30.881078 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.880944 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:30 crc kubenswrapper[4766]: E1002 10:52:30.881299 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.975837 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.975872 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.975881 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.975896 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:30 crc kubenswrapper[4766]: I1002 10:52:30.975905 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:30Z","lastTransitionTime":"2025-10-02T10:52:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.078440 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.078486 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.078516 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.078534 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.078547 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:31Z","lastTransitionTime":"2025-10-02T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.181015 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.181075 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.181085 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.181106 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.181118 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:31Z","lastTransitionTime":"2025-10-02T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.283473 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.283546 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.283557 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.283571 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.283586 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:31Z","lastTransitionTime":"2025-10-02T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.386923 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.386969 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.386978 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.386992 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.387001 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:31Z","lastTransitionTime":"2025-10-02T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.489803 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.489855 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.489867 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.489884 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.489899 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:31Z","lastTransitionTime":"2025-10-02T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.591906 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.591959 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.591972 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.591990 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.592001 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:31Z","lastTransitionTime":"2025-10-02T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.694084 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.694124 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.694137 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.694152 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.694162 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:31Z","lastTransitionTime":"2025-10-02T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.796726 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.796765 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.796779 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.796795 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.796806 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:31Z","lastTransitionTime":"2025-10-02T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.881080 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.881136 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:31 crc kubenswrapper[4766]: E1002 10:52:31.881235 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:31 crc kubenswrapper[4766]: E1002 10:52:31.881341 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.899833 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.899914 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.899934 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.899951 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:31 crc kubenswrapper[4766]: I1002 10:52:31.899963 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:31Z","lastTransitionTime":"2025-10-02T10:52:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.002338 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.002375 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.002384 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.002401 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.002409 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:32Z","lastTransitionTime":"2025-10-02T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.104594 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.104645 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.104658 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.104676 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.104831 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:32Z","lastTransitionTime":"2025-10-02T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.207552 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.207588 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.207599 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.207614 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.207624 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:32Z","lastTransitionTime":"2025-10-02T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.310233 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.310276 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.310285 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.310300 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.310311 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:32Z","lastTransitionTime":"2025-10-02T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.413024 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.413101 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.413113 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.413129 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.413142 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:32Z","lastTransitionTime":"2025-10-02T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.515434 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.515521 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.515538 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.515559 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.515573 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:32Z","lastTransitionTime":"2025-10-02T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.617812 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.617869 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.617879 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.617895 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.617906 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:32Z","lastTransitionTime":"2025-10-02T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.720112 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.720187 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.720207 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.720224 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.720237 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:32Z","lastTransitionTime":"2025-10-02T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.822218 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.822262 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.822274 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.822290 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.822301 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:32Z","lastTransitionTime":"2025-10-02T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.880231 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:32 crc kubenswrapper[4766]: E1002 10:52:32.880358 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.880744 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:32 crc kubenswrapper[4766]: E1002 10:52:32.880803 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.924198 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.924238 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.924248 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.924264 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:32 crc kubenswrapper[4766]: I1002 10:52:32.924275 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:32Z","lastTransitionTime":"2025-10-02T10:52:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.026474 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.026562 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.026576 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.026591 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.026611 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:33Z","lastTransitionTime":"2025-10-02T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.129336 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.129369 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.129378 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.129392 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.129405 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:33Z","lastTransitionTime":"2025-10-02T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.167729 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs\") pod \"network-metrics-daemon-klg2z\" (UID: \"6d68573a-5250-4407-8631-2199a3de7e9e\") " pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:33 crc kubenswrapper[4766]: E1002 10:52:33.167897 4766 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:52:33 crc kubenswrapper[4766]: E1002 10:52:33.167967 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs podName:6d68573a-5250-4407-8631-2199a3de7e9e nodeName:}" failed. No retries permitted until 2025-10-02 10:53:05.167951462 +0000 UTC m=+100.110822406 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs") pod "network-metrics-daemon-klg2z" (UID: "6d68573a-5250-4407-8631-2199a3de7e9e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.232173 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.232223 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.232236 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.232253 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.232269 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:33Z","lastTransitionTime":"2025-10-02T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.334857 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.334905 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.334915 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.334932 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.334941 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:33Z","lastTransitionTime":"2025-10-02T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.437201 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.437241 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.437253 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.437267 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.437276 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:33Z","lastTransitionTime":"2025-10-02T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.539690 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.539727 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.539736 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.539752 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.539762 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:33Z","lastTransitionTime":"2025-10-02T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.641588 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.641623 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.641632 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.641649 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.641661 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:33Z","lastTransitionTime":"2025-10-02T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.743536 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.743581 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.743590 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.743608 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.743617 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:33Z","lastTransitionTime":"2025-10-02T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.845464 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.845540 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.845551 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.845570 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.845603 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:33Z","lastTransitionTime":"2025-10-02T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.881272 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.881272 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:33 crc kubenswrapper[4766]: E1002 10:52:33.881421 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:33 crc kubenswrapper[4766]: E1002 10:52:33.881528 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.948564 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.948611 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.948624 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.948642 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:33 crc kubenswrapper[4766]: I1002 10:52:33.948655 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:33Z","lastTransitionTime":"2025-10-02T10:52:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.051492 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.051546 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.051555 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.051571 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.051583 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:34Z","lastTransitionTime":"2025-10-02T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.154261 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.154291 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.154300 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.154314 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.154323 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:34Z","lastTransitionTime":"2025-10-02T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.256561 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.256605 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.256621 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.256641 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.256655 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:34Z","lastTransitionTime":"2025-10-02T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.358804 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.358852 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.358861 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.358883 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.358893 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:34Z","lastTransitionTime":"2025-10-02T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.461304 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.461336 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.461346 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.461360 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.461368 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:34Z","lastTransitionTime":"2025-10-02T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.539477 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.539556 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.539570 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.539590 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.539601 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:34Z","lastTransitionTime":"2025-10-02T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:34 crc kubenswrapper[4766]: E1002 10:52:34.550847 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.556012 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.556077 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.556095 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.556120 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.556136 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:34Z","lastTransitionTime":"2025-10-02T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:34 crc kubenswrapper[4766]: E1002 10:52:34.570162 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.575562 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.575616 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.575630 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.575647 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.575658 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:34Z","lastTransitionTime":"2025-10-02T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:34 crc kubenswrapper[4766]: E1002 10:52:34.590943 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.595359 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.595409 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.595419 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.595435 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.595458 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:34Z","lastTransitionTime":"2025-10-02T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:34 crc kubenswrapper[4766]: E1002 10:52:34.608487 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.612392 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.612444 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.612455 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.612474 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.612485 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:34Z","lastTransitionTime":"2025-10-02T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:34 crc kubenswrapper[4766]: E1002 10:52:34.624818 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:34Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:34 crc kubenswrapper[4766]: E1002 10:52:34.625307 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.627065 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.627175 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.627297 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.627401 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.627488 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:34Z","lastTransitionTime":"2025-10-02T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.752922 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.752962 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.752975 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.752992 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.753006 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:34Z","lastTransitionTime":"2025-10-02T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.855226 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.855265 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.855278 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.855296 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.855309 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:34Z","lastTransitionTime":"2025-10-02T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.880257 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.880793 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:34 crc kubenswrapper[4766]: E1002 10:52:34.880937 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:34 crc kubenswrapper[4766]: E1002 10:52:34.881314 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.881318 4766 scope.go:117] "RemoveContainer" containerID="adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6" Oct 02 10:52:34 crc kubenswrapper[4766]: E1002 10:52:34.881698 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.957759 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.957814 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.957827 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.957849 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:34 crc kubenswrapper[4766]: I1002 10:52:34.957864 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:34Z","lastTransitionTime":"2025-10-02T10:52:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.059903 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.059941 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.059953 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.059969 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.059981 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:35Z","lastTransitionTime":"2025-10-02T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.162550 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.162643 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.162657 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.162681 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.162700 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:35Z","lastTransitionTime":"2025-10-02T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.265477 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.265547 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.265566 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.265590 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.265604 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:35Z","lastTransitionTime":"2025-10-02T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.368655 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.368776 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.368790 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.368809 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.368822 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:35Z","lastTransitionTime":"2025-10-02T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.471884 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.471935 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.471951 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.471969 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.471982 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:35Z","lastTransitionTime":"2025-10-02T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.575568 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.575799 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.575821 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.575868 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.575882 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:35Z","lastTransitionTime":"2025-10-02T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.679095 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.679151 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.679173 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.679198 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.679214 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:35Z","lastTransitionTime":"2025-10-02T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.781792 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.781825 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.781836 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.781854 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.781868 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:35Z","lastTransitionTime":"2025-10-02T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.880927 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:35 crc kubenswrapper[4766]: E1002 10:52:35.881059 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.881335 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:35 crc kubenswrapper[4766]: E1002 10:52:35.881391 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.883887 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.883916 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.883926 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.883940 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.883951 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:35Z","lastTransitionTime":"2025-10-02T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.896315 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.910572 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.922408 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.936129 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.953535 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.963943 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.973771 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.985240 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.985283 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.985296 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.985314 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.985328 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:35Z","lastTransitionTime":"2025-10-02T10:52:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:35 crc kubenswrapper[4766]: I1002 10:52:35.991140 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:35Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.004920 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:36Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.017062 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:36Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.031113 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:36Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.044546 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:36Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.059421 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"070e0a3b-5963-4adc-a4f5-020999886339\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79274b877cf8ff3109da1dc078ee5f0b19a541fb7026b110c1969e0c2d341cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae830f784230e44d237e8f6a6606c9969e54046b8010f6dcd0ca446ea264676f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f754efe5b2b5464759b533ec933dc5834a1be242592ca9375184f5fc24a72f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:36Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.075513 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:36Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.088075 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.088112 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.088121 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.088137 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.088145 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:36Z","lastTransitionTime":"2025-10-02T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.089554 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:36Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 
10:52:36.104623 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2
025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:36Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.125434 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountP
ath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-
access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:21Z\\\",\\\"message\\\":\\\"Config(nil)\\\\nI1002 10:52:20.754739 6530 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:52:20.754765 6530 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754774 6530 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754769 6530 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port 
Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 request\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:36Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.190601 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.190629 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.190638 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.190652 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.190663 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:36Z","lastTransitionTime":"2025-10-02T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.190663 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:36Z","lastTransitionTime":"2025-10-02T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.293174 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.293210 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.293224 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.293241 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.293254 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:36Z","lastTransitionTime":"2025-10-02T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.396570 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.396635 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.396648 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.396669 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.396682 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:36Z","lastTransitionTime":"2025-10-02T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.500011 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.500051 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.500081 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.500101 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
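
The Ready=False condition keeps repeating because the kubelet polls the runtime's network status and finds no CNI configuration in /etc/kubernetes/cni/net.d/ (ovnkube-controller, which would write it, is in CrashLoopBackOff above). A rough Go sketch of such a directory probe; the accepted extensions follow common CNI conventions and are an assumption here, not something stated in these logs:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory named in the kubelet message above.
	confDir := "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Printf("network not ready: %v\n", err)
		return
	}
	// Extension list mirrors common CNI conventions (an assumption).
	var found []string
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			found = append(found, e.Name())
		}
	}
	if len(found) == 0 {
		fmt.Printf("network not ready: no CNI configuration file in %s\n", confDir)
		return
	}
	fmt.Println("CNI configuration present:", found)
}
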
Has your network provider started?"} Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.602884 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.602919 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.602928 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.602981 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.602992 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:36Z","lastTransitionTime":"2025-10-02T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.705453 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.705493 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.705526 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.705543 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.705553 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:36Z","lastTransitionTime":"2025-10-02T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.807454 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.807486 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.807495 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.807527 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.807539 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:36Z","lastTransitionTime":"2025-10-02T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.880329 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.880401 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:36 crc kubenswrapper[4766]: E1002 10:52:36.880479 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:36 crc kubenswrapper[4766]: E1002 10:52:36.880585 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.909923 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.909970 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.909981 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.910000 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:36 crc kubenswrapper[4766]: I1002 10:52:36.910012 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:36Z","lastTransitionTime":"2025-10-02T10:52:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.012068 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.012117 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.012132 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.012149 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.012162 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:37Z","lastTransitionTime":"2025-10-02T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.115027 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.115067 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.115078 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.115093 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.115103 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:37Z","lastTransitionTime":"2025-10-02T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.217562 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.217620 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.217632 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.217649 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.217663 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:37Z","lastTransitionTime":"2025-10-02T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.322359 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.322402 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.322417 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.322434 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.322447 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:37Z","lastTransitionTime":"2025-10-02T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.424412 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.424800 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.424816 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.424835 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.424848 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:37Z","lastTransitionTime":"2025-10-02T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.527587 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.527857 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.527936 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.528020 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.528101 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:37Z","lastTransitionTime":"2025-10-02T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.631065 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.631111 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.631122 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.631139 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.631152 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:37Z","lastTransitionTime":"2025-10-02T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.734008 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.734065 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.734079 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.734100 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.734116 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:37Z","lastTransitionTime":"2025-10-02T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.836747 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.836782 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.836792 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.836808 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.836819 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:37Z","lastTransitionTime":"2025-10-02T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.880525 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:37 crc kubenswrapper[4766]: E1002 10:52:37.880921 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.880680 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:37 crc kubenswrapper[4766]: E1002 10:52:37.881136 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.940240 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.940285 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.940306 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.940326 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:37 crc kubenswrapper[4766]: I1002 10:52:37.940342 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:37Z","lastTransitionTime":"2025-10-02T10:52:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.042759 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.043088 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.043164 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.043243 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.043322 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:38Z","lastTransitionTime":"2025-10-02T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.146101 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.146145 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.146157 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.146175 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.146185 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:38Z","lastTransitionTime":"2025-10-02T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.248236 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.248277 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.248286 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.248300 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.248310 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:38Z","lastTransitionTime":"2025-10-02T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.350664 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.350705 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.350717 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.350732 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.350742 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:38Z","lastTransitionTime":"2025-10-02T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.406225 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2jxdg_a6aa81c2-8c87-43df-badb-7b9dbef84ccf/kube-multus/0.log" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.406515 4766 generic.go:334] "Generic (PLEG): container finished" podID="a6aa81c2-8c87-43df-badb-7b9dbef84ccf" containerID="da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071" exitCode=1 Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.406620 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2jxdg" event={"ID":"a6aa81c2-8c87-43df-badb-7b9dbef84ccf","Type":"ContainerDied","Data":"da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071"} Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.407356 4766 scope.go:117] "RemoveContainer" containerID="da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.420453 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.431227 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.442565 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.455009 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.455103 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.455131 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.455152 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.455203 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.455216 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:38Z","lastTransitionTime":"2025-10-02T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.466848 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.479433 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"070e0a3b-5963-4adc-a4f5-020999886339\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79274b877cf8ff3109da1dc078ee5f0b19a541fb7026b110c1969e0c2d341cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae830f784230e44d237e8f6a6606c9969e54046b8010f6dcd0ca446ea264676f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f754efe5b2b5464759b533ec933dc5834a1be242592ca9375184f5fc24a72f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.492253 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.505103 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.517645 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:37Z\\\",\\\"message\\\":\\\"2025-10-02T10:51:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe\\\\n2025-10-02T10:51:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe to /host/opt/cni/bin/\\\\n2025-10-02T10:51:52Z [verbose] multus-daemon started\\\\n2025-10-02T10:51:52Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.535434 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cer
t\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:21Z\\\",\\\"message\\\":\\\"Config(nil)\\\\nI1002 10:52:20.754739 6530 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:52:20.754765 6530 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754774 6530 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754769 6530 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port 
Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 request\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.555006 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.558445 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.558526 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.558541 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.558560 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.558571 4766 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:38Z","lastTransitionTime":"2025-10-02T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.567380 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.579727 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.592491 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.607830 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.619862 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.634415 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:38Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.662040 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.662088 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.662100 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.662119 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.662132 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:38Z","lastTransitionTime":"2025-10-02T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.764728 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.764795 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.764809 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.764829 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.764844 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:38Z","lastTransitionTime":"2025-10-02T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.867235 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.867269 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.867278 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.867292 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.867305 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:38Z","lastTransitionTime":"2025-10-02T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.880809 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.880819 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:38 crc kubenswrapper[4766]: E1002 10:52:38.880951 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:38 crc kubenswrapper[4766]: E1002 10:52:38.881075 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.969268 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.969303 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.969315 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.969332 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:38 crc kubenswrapper[4766]: I1002 10:52:38.969343 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:38Z","lastTransitionTime":"2025-10-02T10:52:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.072155 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.072200 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.072209 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.072227 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.072237 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:39Z","lastTransitionTime":"2025-10-02T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.174743 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.174809 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.174821 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.174839 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.174849 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:39Z","lastTransitionTime":"2025-10-02T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.276928 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.276973 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.276986 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.277003 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.277016 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:39Z","lastTransitionTime":"2025-10-02T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.379521 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.379812 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.379926 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.379999 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.380061 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:39Z","lastTransitionTime":"2025-10-02T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.411758 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2jxdg_a6aa81c2-8c87-43df-badb-7b9dbef84ccf/kube-multus/0.log" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.411808 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2jxdg" event={"ID":"a6aa81c2-8c87-43df-badb-7b9dbef84ccf","Type":"ContainerStarted","Data":"ff77e4fb340919ea122bf7a3ecdab638bcb0d9dde19ec12b466f14a2cf2e753f"} Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.422616 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"
imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.430375 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.438690 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z" Oct 02 
10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.448316 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.459216 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.471090 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"070e0a3b-5963-4adc-a4f5-020999886339\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79274b877cf8ff3109da1dc078ee5f0b19a541fb7026b110c1969e0c2d341cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae830f784230e44d237e8f6a6606c9969e54046b8010f6dcd0ca446ea264676f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f754efe5b2b5464759b533ec933dc5834a1be242592ca9375184f5fc24a72f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.482608 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.482655 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.482665 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.482680 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.482692 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:39Z","lastTransitionTime":"2025-10-02T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.483910 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.495330 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.506634 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff77e4fb340919ea122bf7a3ecdab638bcb0d9dde19ec12b466f14a2cf2e753f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:37Z\\\",\\\"message\\\":\\\"2025-10-02T10:51:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe\\\\n2025-10-02T10:51:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe to /host/opt/cni/bin/\\\\n2025-10-02T10:51:52Z [verbose] multus-daemon started\\\\n2025-10-02T10:51:52Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.522625 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:21Z\\\",\\\"message\\\":\\\"Config(nil)\\\\nI1002 10:52:20.754739 6530 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:52:20.754765 6530 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754774 6530 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754769 6530 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 request\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.534904 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.546473 4766 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.555765 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.565732 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.579520 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z"
Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.584606 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.584654 4766 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.584669 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.584690 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.584705 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:39Z","lastTransitionTime":"2025-10-02T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.590273 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.601259 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:39Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.687693 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.687738 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.687752 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.687773 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.687785 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:39Z","lastTransitionTime":"2025-10-02T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.790039 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.790074 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.790086 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.790102 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.790113 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:39Z","lastTransitionTime":"2025-10-02T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.881127 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:39 crc kubenswrapper[4766]: E1002 10:52:39.881266 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.881301 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:39 crc kubenswrapper[4766]: E1002 10:52:39.881431 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.896173 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.896265 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.896286 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.896318 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.896343 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:39Z","lastTransitionTime":"2025-10-02T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.999148 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.999185 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.999196 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.999214 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:39 crc kubenswrapper[4766]: I1002 10:52:39.999226 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:39Z","lastTransitionTime":"2025-10-02T10:52:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.101377 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.101431 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.101441 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.101455 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.101464 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:40Z","lastTransitionTime":"2025-10-02T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.203494 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.203553 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.203565 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.203582 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.203594 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:40Z","lastTransitionTime":"2025-10-02T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.306381 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.306431 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.306440 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.306456 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.306469 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:40Z","lastTransitionTime":"2025-10-02T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.408808 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.408882 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.408894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.408914 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.408926 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:40Z","lastTransitionTime":"2025-10-02T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.511793 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.511828 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.511838 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.511852 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.511863 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:40Z","lastTransitionTime":"2025-10-02T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.614648 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.614700 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.614712 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.614759 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.614769 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:40Z","lastTransitionTime":"2025-10-02T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.717345 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.717389 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.717401 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.717417 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.717429 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:40Z","lastTransitionTime":"2025-10-02T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.819963 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.819999 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.820008 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.820022 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.820031 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:40Z","lastTransitionTime":"2025-10-02T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.880805 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.880880 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:40 crc kubenswrapper[4766]: E1002 10:52:40.880930 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:40 crc kubenswrapper[4766]: E1002 10:52:40.881042 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.922044 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.922083 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.922092 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.922106 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:40 crc kubenswrapper[4766]: I1002 10:52:40.922116 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:40Z","lastTransitionTime":"2025-10-02T10:52:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.024724 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.024783 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.024793 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.024807 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.024817 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:41Z","lastTransitionTime":"2025-10-02T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.127216 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.127263 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.127273 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.127290 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.127301 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:41Z","lastTransitionTime":"2025-10-02T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.229816 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.229849 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.229858 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.229871 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.229881 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:41Z","lastTransitionTime":"2025-10-02T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.331963 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.331997 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.332026 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.332044 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.332052 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:41Z","lastTransitionTime":"2025-10-02T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.434606 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.434645 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.434654 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.434670 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.434679 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:41Z","lastTransitionTime":"2025-10-02T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.537026 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.537081 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.537094 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.537110 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.537132 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:41Z","lastTransitionTime":"2025-10-02T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.639817 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.639862 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.639894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.639918 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.639934 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:41Z","lastTransitionTime":"2025-10-02T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.742141 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.742197 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.742211 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.742228 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.742240 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:41Z","lastTransitionTime":"2025-10-02T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.844830 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.844876 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.844888 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.844905 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.844918 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:41Z","lastTransitionTime":"2025-10-02T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.881002 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.881129 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:41 crc kubenswrapper[4766]: E1002 10:52:41.881158 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:41 crc kubenswrapper[4766]: E1002 10:52:41.881286 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.947547 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.947587 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.947597 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.947613 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:41 crc kubenswrapper[4766]: I1002 10:52:41.947625 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:41Z","lastTransitionTime":"2025-10-02T10:52:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.049992 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.050056 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.050081 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.050098 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.050111 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:42Z","lastTransitionTime":"2025-10-02T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.152342 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.152387 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.152401 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.152420 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.152433 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:42Z","lastTransitionTime":"2025-10-02T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.254709 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.254781 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.254796 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.254815 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.254827 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:42Z","lastTransitionTime":"2025-10-02T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.357404 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.357449 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.357461 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.357477 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.357490 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:42Z","lastTransitionTime":"2025-10-02T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.464150 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.464201 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.464214 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.464231 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.464249 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:42Z","lastTransitionTime":"2025-10-02T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.566630 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.566679 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.566692 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.566709 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.566747 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:42Z","lastTransitionTime":"2025-10-02T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.670359 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.670419 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.670430 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.670446 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.670455 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:42Z","lastTransitionTime":"2025-10-02T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.772838 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.772886 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.772896 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.772913 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.772925 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:42Z","lastTransitionTime":"2025-10-02T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.875878 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.875976 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.875994 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.876017 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.876027 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:42Z","lastTransitionTime":"2025-10-02T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.880834 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.880867 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:42 crc kubenswrapper[4766]: E1002 10:52:42.880959 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:42 crc kubenswrapper[4766]: E1002 10:52:42.881102 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.977926 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.977973 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.977984 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.978000 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:42 crc kubenswrapper[4766]: I1002 10:52:42.978012 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:42Z","lastTransitionTime":"2025-10-02T10:52:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.080536 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.080604 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.080619 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.080635 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.080646 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:43Z","lastTransitionTime":"2025-10-02T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.182733 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.182771 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.182783 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.182828 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.182841 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:43Z","lastTransitionTime":"2025-10-02T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.286067 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.286333 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.286441 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.286547 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.286641 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:43Z","lastTransitionTime":"2025-10-02T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.389324 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.389360 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.389371 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.389385 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.389395 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:43Z","lastTransitionTime":"2025-10-02T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.491419 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.491749 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.491862 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.491973 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.492060 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:43Z","lastTransitionTime":"2025-10-02T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.594181 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.594227 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.594239 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.594256 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.594267 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:43Z","lastTransitionTime":"2025-10-02T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.696570 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.696631 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.696641 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.696659 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.696670 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:43Z","lastTransitionTime":"2025-10-02T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.798850 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.799195 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.799277 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.799362 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.799431 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:43Z","lastTransitionTime":"2025-10-02T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.881006 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:43 crc kubenswrapper[4766]: E1002 10:52:43.881153 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.881355 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:43 crc kubenswrapper[4766]: E1002 10:52:43.881558 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.901437 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.901487 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.901528 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.901553 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:43 crc kubenswrapper[4766]: I1002 10:52:43.901570 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:43Z","lastTransitionTime":"2025-10-02T10:52:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.003894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.003939 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.003948 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.003963 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.003975 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:44Z","lastTransitionTime":"2025-10-02T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.105968 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.106262 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.106344 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.106425 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.106517 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:44Z","lastTransitionTime":"2025-10-02T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.210209 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.210446 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.210583 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.210656 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.210722 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:44Z","lastTransitionTime":"2025-10-02T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.313107 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.313142 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.313154 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.313170 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.313182 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:44Z","lastTransitionTime":"2025-10-02T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.415476 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.415536 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.415549 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.415568 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.415580 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:44Z","lastTransitionTime":"2025-10-02T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.518712 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.518758 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.518769 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.518787 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.518799 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:44Z","lastTransitionTime":"2025-10-02T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.621681 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.621726 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.621737 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.621752 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.621764 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:44Z","lastTransitionTime":"2025-10-02T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.725072 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.725138 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.725153 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.725182 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.725210 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:44Z","lastTransitionTime":"2025-10-02T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.806521 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.806560 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.806570 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.806585 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.806594 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:44Z","lastTransitionTime":"2025-10-02T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:44 crc kubenswrapper[4766]: E1002 10:52:44.820618 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.824413 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.824452 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.824462 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.824477 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.824487 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:44Z","lastTransitionTime":"2025-10-02T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:44 crc kubenswrapper[4766]: E1002 10:52:44.835422 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.838751 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.838795 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.838806 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.838825 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.838836 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:44Z","lastTransitionTime":"2025-10-02T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:44 crc kubenswrapper[4766]: E1002 10:52:44.851205 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.855103 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.855154 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.855167 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.855184 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.855197 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:44Z","lastTransitionTime":"2025-10-02T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:44 crc kubenswrapper[4766]: E1002 10:52:44.866817 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.871002 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.871041 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
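Every status-update retry above fails at the same point: the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, well before the node's current clock of 2025-10-02. A minimal Go sketch for confirming the certificate window from the node itself; the address comes from the log line, while the timeout handling and output format are purely illustrative:

```go
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Skip verification on purpose: the goal is to inspect the expired
	// certificate, not to trust it.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatalf("dial 127.0.0.1:9743: %v", err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Println("NotBefore:", cert.NotBefore.UTC().Format(time.RFC3339))
	fmt.Println("NotAfter: ", cert.NotAfter.UTC().Format(time.RFC3339))
	fmt.Println("expired:  ", now.After(cert.NotAfter)) // expect true per the log
}
```

If the window is indeed past, the usual recovery is to let the cluster rotate its internal certificates (for CRC, typically a fresh `crc start`); the sketch only confirms the failure mode, it does not fix it.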
event="NodeHasNoDiskPressure" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.871052 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.871067 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.871080 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:44Z","lastTransitionTime":"2025-10-02T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.880241 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.880340 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:44 crc kubenswrapper[4766]: E1002 10:52:44.880389 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:44 crc kubenswrapper[4766]: E1002 10:52:44.880493 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:44 crc kubenswrapper[4766]: E1002 10:52:44.883885 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:44Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:44 crc kubenswrapper[4766]: E1002 10:52:44.884166 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.885972 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.886078 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.886167 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.886245 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.886312 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:44Z","lastTransitionTime":"2025-10-02T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.988672 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.988742 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.988754 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.988772 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:44 crc kubenswrapper[4766]: I1002 10:52:44.988784 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:44Z","lastTransitionTime":"2025-10-02T10:52:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.091475 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.091549 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.091558 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.091572 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.091581 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:45Z","lastTransitionTime":"2025-10-02T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.194234 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.194273 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.194282 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.194295 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.194305 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:45Z","lastTransitionTime":"2025-10-02T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.298186 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.298242 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.298257 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.298283 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.298299 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:45Z","lastTransitionTime":"2025-10-02T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.401743 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.401783 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.401791 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.401807 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.401817 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:45Z","lastTransitionTime":"2025-10-02T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.504069 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.504099 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.504109 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.504123 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.504179 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:45Z","lastTransitionTime":"2025-10-02T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.606570 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.606598 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.606607 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.606620 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.606630 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:45Z","lastTransitionTime":"2025-10-02T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.708946 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.709000 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.709014 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.709030 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.709042 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:45Z","lastTransitionTime":"2025-10-02T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.811113 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.811161 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.811171 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.811188 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.811200 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:45Z","lastTransitionTime":"2025-10-02T10:52:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.881281 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.881311 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:45 crc kubenswrapper[4766]: E1002 10:52:45.881416 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:45 crc kubenswrapper[4766]: E1002 10:52:45.881592 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.895794 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff77e4fb340919ea122bf7a3ecdab638bcb0d9dde19ec12b466f14a2cf2e753f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:37Z\\\",\\\"message\\\":\\\"2025-10-02T10:51:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe\\\\n2025-10-02T10:51:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe to /host/opt/cni/bin/\\\\n2025-10-02T10:51:52Z [verbose] multus-daemon started\\\\n2025-10-02T10:51:52Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.913631 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.913666 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.913679 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.913695 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.913707 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:45Z","lastTransitionTime":"2025-10-02T10:52:45Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.919300 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:21Z\\\",\\\"message\\\":\\\"Config(nil)\\\\nI1002 10:52:20.754739 6530 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:52:20.754765 6530 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754774 6530 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754769 6530 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 
request\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.935589 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.947742 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"070e0a3b-5963-4adc-a4f5-020999886339\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79274b877cf8ff3109da1dc078ee5f0b19a541fb7026b110c1969e0c2d341cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae830f784230e44d237e8f6a6606c9969e54046b8010f6dcd0ca446ea264676f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f754efe5b2b5464759b533ec933dc5834a1be242592ca9375184f5fc24a72f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.961700 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:45Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.975732 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:45 crc kubenswrapper[4766]: I1002 10:52:45.990086 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:45Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.003182 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.015688 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.015744 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.015758 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.015772 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.015781 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:46Z","lastTransitionTime":"2025-10-02T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.016789 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.027706 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.038926 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.049395 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.061073 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39e
cfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.071092 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.088258 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.105854 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
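
The patch payloads in these "failed to patch status" entries are readable once the escaping is peeled off: the journal shows each patch quoted twice, once by klog's %q formatting of the err field and once because the patch is itself an embedded JSON string. A minimal Go sketch of the decoding, using an abbreviated, hypothetical stand-in for one err value (the UID is the one from the network-metrics-daemon-klg2z entry above; only the metadata stanza is kept so the escaping stays visible):

```go
package main

import (
	"encoding/json"
	"fmt"
	"strconv"
	"strings"
)

func main() {
	// Abbreviated stand-in for one err="..." field from the log above.
	quoted := `"failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"}}\" for pod"`

	// First unquote undoes klog's %q quoting of the whole err value.
	errMsg, err := strconv.Unquote(quoted)
	if err != nil {
		panic(err)
	}

	// The patch is itself a quoted string inside errMsg; unquote again.
	start := strings.Index(errMsg, `"`)
	end := strings.LastIndex(errMsg, `"`)
	patchJSON, err := strconv.Unquote(errMsg[start : end+1])
	if err != nil {
		panic(err)
	}

	var patch map[string]any
	if err := json.Unmarshal([]byte(patchJSON), &patch); err != nil {
		panic(err)
	}
	pretty, _ := json.MarshalIndent(patch, "", "  ")
	fmt.Println(string(pretty))
}
```

The recovered payloads are strategic merge patches, which is why every one of them carries the "$setElementOrder/conditions" directive alongside the conditions list itself.
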
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.118840 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.118882 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.118890 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.118906 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.118914 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:46Z","lastTransitionTime":"2025-10-02T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.119367 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:46Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.221059 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.221437 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.221571 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.221692 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.221806 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:46Z","lastTransitionTime":"2025-10-02T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.324844 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.324888 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.324897 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.324917 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.324936 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:46Z","lastTransitionTime":"2025-10-02T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.427246 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.428469 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.428523 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.428542 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.428554 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:46Z","lastTransitionTime":"2025-10-02T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.530774 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.530814 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.530826 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.530844 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.530856 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:46Z","lastTransitionTime":"2025-10-02T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.633579 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.633632 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.633643 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.633661 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.633673 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:46Z","lastTransitionTime":"2025-10-02T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.737077 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.737120 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.737130 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.737147 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.737157 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:46Z","lastTransitionTime":"2025-10-02T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.840167 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.840204 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.840219 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.840237 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.840247 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:46Z","lastTransitionTime":"2025-10-02T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.881179 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.881255 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:46 crc kubenswrapper[4766]: E1002 10:52:46.881359 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:46 crc kubenswrapper[4766]: E1002 10:52:46.881410 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.942581 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.942938 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.942949 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.942967 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:46 crc kubenswrapper[4766]: I1002 10:52:46.942980 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:46Z","lastTransitionTime":"2025-10-02T10:52:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.045952 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.046001 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.046014 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.046031 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.046043 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:47Z","lastTransitionTime":"2025-10-02T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.150574 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.150642 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.150656 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.150677 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.150697 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:47Z","lastTransitionTime":"2025-10-02T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.254612 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.254646 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.254655 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.254668 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.254678 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:47Z","lastTransitionTime":"2025-10-02T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.356799 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.356847 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.356859 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.356875 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.356890 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:47Z","lastTransitionTime":"2025-10-02T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.459367 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.459432 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.459441 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.459457 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.459467 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:47Z","lastTransitionTime":"2025-10-02T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.562302 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.562339 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.562348 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.562381 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.562393 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:47Z","lastTransitionTime":"2025-10-02T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.664619 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.664652 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.664661 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.664679 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.664691 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:47Z","lastTransitionTime":"2025-10-02T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.767548 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.767590 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.767598 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.767614 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.767624 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:47Z","lastTransitionTime":"2025-10-02T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.870396 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.870444 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.870452 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.870467 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.870478 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:47Z","lastTransitionTime":"2025-10-02T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.881023 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:47 crc kubenswrapper[4766]: E1002 10:52:47.881141 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.881377 4766 util.go:30] "No sandbox for pod can be found. 
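
The "No sandbox for pod can be found" / "Error syncing pod, skipping" pairs show exactly which pods the network gate blocks: only those that need a pod-network sandbox (network-metrics-daemon-klg2z, network-check-source, network-check-target, networking-console-plugin). Host-network pods such as node-resolver-5vgtz and machine-config-daemon-l99lx keep running, which is why their statuses above show Running even while the patches still fail. A simplified sketch of that gate, with hypothetical types (the real logic lives in the kubelet's pod workers):

```go
package main

import "fmt"

// Pod is a hypothetical stand-in; the kubelet consults the pod spec's
// hostNetwork field and the runtime's NetworkReady status.
type Pod struct {
	Name        string
	HostNetwork bool
}

// canSync blocks pods that need pod networking while the runtime
// network is not ready; host-network pods proceed regardless.
func canSync(p Pod, networkReady bool) error {
	if !networkReady && !p.HostNetwork {
		return fmt.Errorf("network is not ready: container runtime network not ready: NetworkReady=false")
	}
	return nil
}

func main() {
	fmt.Println(canSync(Pod{Name: "network-metrics-daemon-klg2z"}, false))       // blocked
	fmt.Println(canSync(Pod{Name: "node-resolver-5vgtz", HostNetwork: true}, false)) // proceeds: <nil>
}
```
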
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:47 crc kubenswrapper[4766]: E1002 10:52:47.881567 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.972763 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.972814 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.972825 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.972845 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:47 crc kubenswrapper[4766]: I1002 10:52:47.972859 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:47Z","lastTransitionTime":"2025-10-02T10:52:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.075610 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.075637 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.075646 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.075659 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.075668 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:48Z","lastTransitionTime":"2025-10-02T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.179090 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.179160 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.179175 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.179203 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.179220 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:48Z","lastTransitionTime":"2025-10-02T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.281707 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.281766 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.281785 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.281816 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.281838 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:48Z","lastTransitionTime":"2025-10-02T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.384281 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.384331 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.384348 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.384362 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.384372 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:48Z","lastTransitionTime":"2025-10-02T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.486761 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.486822 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.486835 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.486851 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.486860 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:48Z","lastTransitionTime":"2025-10-02T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.589573 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.589618 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.589635 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.589652 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.589664 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:48Z","lastTransitionTime":"2025-10-02T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.692288 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.692324 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.692335 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.692350 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.692362 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:48Z","lastTransitionTime":"2025-10-02T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.794554 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.794607 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.794619 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.794637 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.794650 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:48Z","lastTransitionTime":"2025-10-02T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.880614 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.880729 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:48 crc kubenswrapper[4766]: E1002 10:52:48.880756 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:48 crc kubenswrapper[4766]: E1002 10:52:48.880924 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.881811 4766 scope.go:117] "RemoveContainer" containerID="adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.897426 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.897476 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.897493 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.897530 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:48 crc kubenswrapper[4766]: I1002 10:52:48.897544 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:48Z","lastTransitionTime":"2025-10-02T10:52:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.000731 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.000791 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.000802 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.000822 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.000834 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:49Z","lastTransitionTime":"2025-10-02T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.103570 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.103634 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.103649 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.103669 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.103684 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:49Z","lastTransitionTime":"2025-10-02T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.205846 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.205888 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.205903 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.205921 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.205935 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:49Z","lastTransitionTime":"2025-10-02T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.308634 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.308661 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.308671 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.308684 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.308694 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:49Z","lastTransitionTime":"2025-10-02T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.414493 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.414578 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.414589 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.414606 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.414622 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:49Z","lastTransitionTime":"2025-10-02T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.441544 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovnkube-controller/2.log" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.444204 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerStarted","Data":"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560"} Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.444641 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.456819 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.466301 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.480094 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.490910 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.503617 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.517228 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.517494 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.517773 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.517970 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.518163 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:49Z","lastTransitionTime":"2025-10-02T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.517127 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:49Z is after 2025-08-24T17:21:41Z"
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"070e0a3b-5963-4adc-a4f5-020999886339\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79274b877cf8ff3109da1dc078ee5f0b19a541fb7026b110c1969e0c2d341cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae830f784230e44d237e8f6a6606c9969e54046b8010f6dcd0ca446ea264676f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f754efe5b2b5464759b533ec933dc5834a1be242592ca9375184f5fc24a72f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.543495 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:49Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.558705 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.577648 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff77e4fb340919ea122bf7a3ecdab638bcb0d9dde19ec12b466f14a2cf2e753f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:37Z\\\",\\\"message\\\":\\\"2025-10-02T10:51:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe\\\\n2025-10-02T10:51:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe to /host/opt/cni/bin/\\\\n2025-10-02T10:51:52Z [verbose] multus-daemon started\\\\n2025-10-02T10:51:52Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.608976 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:21Z\\\",\\\"message\\\":\\\"Config(nil)\\\\nI1002 10:52:20.754739 6530 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:52:20.754765 6530 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754774 6530 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754769 6530 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 
request\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.621081 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.621132 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.621144 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.621165 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.621178 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:49Z","lastTransitionTime":"2025-10-02T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.624195 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.639664 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.652230 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.663264 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.679944 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.690656 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:49Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.723487 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.723639 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.723653 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.723672 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.723689 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:49Z","lastTransitionTime":"2025-10-02T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.826775 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.826837 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.826856 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.826881 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.826899 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:49Z","lastTransitionTime":"2025-10-02T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.880736 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:49 crc kubenswrapper[4766]: E1002 10:52:49.880982 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.881132 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:49 crc kubenswrapper[4766]: E1002 10:52:49.881239 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.930210 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.930302 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.930327 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.930362 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:49 crc kubenswrapper[4766]: I1002 10:52:49.930385 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:49Z","lastTransitionTime":"2025-10-02T10:52:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.033485 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.033607 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.033628 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.033657 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.033676 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:50Z","lastTransitionTime":"2025-10-02T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.137544 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.137634 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.137659 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.137695 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.137719 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:50Z","lastTransitionTime":"2025-10-02T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.240902 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.240949 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.240960 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.240975 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.240986 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:50Z","lastTransitionTime":"2025-10-02T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.343333 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.343380 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.343388 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.343402 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.343411 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:50Z","lastTransitionTime":"2025-10-02T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.446061 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.446126 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.446139 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.446154 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.446164 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:50Z","lastTransitionTime":"2025-10-02T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.448462 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovnkube-controller/3.log" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.448943 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovnkube-controller/2.log" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.451850 4766 generic.go:334] "Generic (PLEG): container finished" podID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerID="a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560" exitCode=1 Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.451900 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerDied","Data":"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560"} Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.451950 4766 scope.go:117] "RemoveContainer" containerID="adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.452583 4766 scope.go:117] "RemoveContainer" containerID="a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560" Oct 02 10:52:50 crc kubenswrapper[4766]: E1002 10:52:50.452843 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.468195 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.479245 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.492282 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.504731 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.519450 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.534727 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.553001 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.553894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.553941 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.553952 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.553971 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.553984 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:50Z","lastTransitionTime":"2025-10-02T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.567741 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff77e4fb340919ea122bf7a3ecdab638bcb0d9dde19ec12b466f14a2cf2e753f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:37Z\\\",\\\"message\\\":\\\"2025-10-02T10:51:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe\\\\n2025-10-02T10:51:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe to /host/opt/cni/bin/\\\\n2025-10-02T10:51:52Z [verbose] multus-daemon started\\\\n2025-10-02T10:51:52Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.589524 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://adb51219e2aa5ee0bca123c3488ad1adbb25edb92ec091796623d3934ad4aaa6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:21Z\\\",\\\"message\\\":\\\"Config(nil)\\\\nI1002 10:52:20.754739 6530 services_controller.go:451] Built service openshift-service-ca-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.40\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:52:20.754765 6530 services_controller.go:452] Built service openshift-service-ca-operator/metrics per-node LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754774 6530 services_controller.go:453] Built service openshift-service-ca-operator/metrics template LB for network=default: []services.LB{}\\\\nI1002 10:52:20.754769 6530 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:04 10.217.0.4]} options:{GoMap:map[iface-id-ver:3b6479f0-333b-4a96-9adf-2099afdc2447 request\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:50Z\\\",\\\"message\\\":\\\"ervice\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.41\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:52:49.802554 6927 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-l99lx in node crc\\\\nI1002 10:52:49.802544 6927 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group 
Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 10:52:49.802562 6927 services_controller.go:452] Built service openshift-machine-api/control-plane-machine-set-operator per-node LB for network=default: []services.LB{}\\\\nF1002 10:52:49.801767 6927 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.607419 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.623081 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"070e0a3b-5963-4adc-a4f5-020999886339\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79274b877cf8ff3109da1dc078ee5f0b19a541fb7026b110c1969e0c2d341cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae830f784230e44d237e8f6a6606c9969e54046b8010f6dcd0ca446ea264676f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f754efe5b2b5464759b533ec933dc5834a1be242592ca9375184f5fc24a72f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.635033 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.648864 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.656983 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.657047 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.657062 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.657087 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.657103 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:50Z","lastTransitionTime":"2025-10-02T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.665720 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.677819 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.689140 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.700432 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:50Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.760463 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.760553 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.760570 4766 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.760593 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.760608 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:50Z","lastTransitionTime":"2025-10-02T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.864014 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.864049 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.864057 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.864070 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.864079 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:50Z","lastTransitionTime":"2025-10-02T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.880613 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.880634 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:50 crc kubenswrapper[4766]: E1002 10:52:50.880819 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:50 crc kubenswrapper[4766]: E1002 10:52:50.880729 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.966993 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.967027 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.967037 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.967050 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:50 crc kubenswrapper[4766]: I1002 10:52:50.967059 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:50Z","lastTransitionTime":"2025-10-02T10:52:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.069121 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.069172 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.069181 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.069197 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.069207 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:51Z","lastTransitionTime":"2025-10-02T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.171298 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.171338 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.171349 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.171364 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.171375 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:51Z","lastTransitionTime":"2025-10-02T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.273718 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.273758 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.273771 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.273787 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.273800 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:51Z","lastTransitionTime":"2025-10-02T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.376467 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.376626 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.376646 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.376660 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.376670 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:51Z","lastTransitionTime":"2025-10-02T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.455995 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovnkube-controller/3.log" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.459170 4766 scope.go:117] "RemoveContainer" containerID="a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560" Oct 02 10:52:51 crc kubenswrapper[4766]: E1002 10:52:51.459323 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.468274 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.479179 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.479235 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.479246 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.479262 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.479276 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:51Z","lastTransitionTime":"2025-10-02T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.480488 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e
95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.495791 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\"
:\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"
cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.507214 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.520020 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.535423 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.549392 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.560326 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.571700 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.582361 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.582426 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.582438 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.582455 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.582468 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:51Z","lastTransitionTime":"2025-10-02T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.583163 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.593815 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 2025-08-24T17:21:41Z" Oct 02 
10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.606560 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.619325 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.631043 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff77e4fb340919ea122bf7a3ecdab638bcb0d9dde19ec12b466f14a2cf2e753f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:37Z\\\",\\\"message\\\":\\\"2025-10-02T10:51:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe\\\\n2025-10-02T10:51:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe to /host/opt/cni/bin/\\\\n2025-10-02T10:51:52Z [verbose] multus-daemon started\\\\n2025-10-02T10:51:52Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.646985 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:50Z\\\",\\\"message\\\":\\\"ervice\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.41\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:52:49.802554 6927 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-l99lx in node crc\\\\nI1002 10:52:49.802544 6927 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 10:52:49.802562 6927 services_controller.go:452] Built service openshift-machine-api/control-plane-machine-set-operator per-node LB for network=default: []services.LB{}\\\\nF1002 10:52:49.801767 6927 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.658589 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.669788 4766 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"070e0a3b-5963-4adc-a4f5-020999886339\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79274b877cf8ff3109da1dc078ee5f0b19a541fb7026b110c1969e0c2d341cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae830f784230e44d237e8f6a6606c9969e54046b8010f6dcd0ca446ea264676f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f754efe5b2b5464759b533ec933dc5834a1be242592ca9375184f5fc24a72f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containe
rID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:51Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.684211 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.684237 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.684245 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.684258 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.684267 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:51Z","lastTransitionTime":"2025-10-02T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.786803 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.786840 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.786850 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.786865 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.786876 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:51Z","lastTransitionTime":"2025-10-02T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.820578 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.820687 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.820716 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.820736 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:51 crc kubenswrapper[4766]: E1002 10:52:51.820792 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.820758348 +0000 UTC m=+150.763629292 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:52:51 crc kubenswrapper[4766]: E1002 10:52:51.820803 4766 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:52:51 crc kubenswrapper[4766]: E1002 10:52:51.820836 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.820829011 +0000 UTC m=+150.763699955 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.820831 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:51 crc kubenswrapper[4766]: E1002 10:52:51.820850 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:52:51 crc kubenswrapper[4766]: E1002 10:52:51.820871 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:52:51 crc kubenswrapper[4766]: E1002 10:52:51.820879 4766 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:52:51 crc kubenswrapper[4766]: E1002 10:52:51.820883 4766 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:52:51 crc kubenswrapper[4766]: E1002 10:52:51.820904 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.820897963 +0000 UTC m=+150.763768907 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 02 10:52:51 crc kubenswrapper[4766]: E1002 10:52:51.820914 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.820909733 +0000 UTC m=+150.763780677 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:52:51 crc kubenswrapper[4766]: E1002 10:52:51.820998 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 02 10:52:51 crc kubenswrapper[4766]: E1002 10:52:51.821052 4766 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 02 10:52:51 crc kubenswrapper[4766]: E1002 10:52:51.821072 4766 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:52:51 crc kubenswrapper[4766]: E1002 10:52:51.821130 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.8211124 +0000 UTC m=+150.763983344 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.880910 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.880984 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:51 crc kubenswrapper[4766]: E1002 10:52:51.881084 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:51 crc kubenswrapper[4766]: E1002 10:52:51.881271 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.889403 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.889434 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.889442 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.889456 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.889465 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:51Z","lastTransitionTime":"2025-10-02T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.991580 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.991625 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.991636 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.991653 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:51 crc kubenswrapper[4766]: I1002 10:52:51.991672 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:51Z","lastTransitionTime":"2025-10-02T10:52:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.093587 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.093817 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.093928 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.094071 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.094177 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:52Z","lastTransitionTime":"2025-10-02T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.197107 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.197137 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.197146 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.197157 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.197166 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:52Z","lastTransitionTime":"2025-10-02T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.298983 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.299013 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.299021 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.299034 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.299043 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:52Z","lastTransitionTime":"2025-10-02T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.401610 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.401653 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.401663 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.401679 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.401690 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:52Z","lastTransitionTime":"2025-10-02T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.504216 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.504249 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.504260 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.504274 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.504286 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:52Z","lastTransitionTime":"2025-10-02T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.606227 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.606266 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.606277 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.606291 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.606301 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:52Z","lastTransitionTime":"2025-10-02T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.709001 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.709046 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.709056 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.709073 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.709086 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:52Z","lastTransitionTime":"2025-10-02T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.812616 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.812664 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.812682 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.812718 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.812741 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:52Z","lastTransitionTime":"2025-10-02T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.880387 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:52 crc kubenswrapper[4766]: E1002 10:52:52.880628 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.880387 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:52 crc kubenswrapper[4766]: E1002 10:52:52.880972 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.916029 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.916087 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.916109 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.916182 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:52 crc kubenswrapper[4766]: I1002 10:52:52.916402 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:52Z","lastTransitionTime":"2025-10-02T10:52:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.019367 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.019429 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.019448 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.019470 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.019489 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:53Z","lastTransitionTime":"2025-10-02T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.122226 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.122258 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.122267 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.122281 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.122291 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:53Z","lastTransitionTime":"2025-10-02T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.225314 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.225379 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.225396 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.225412 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.225422 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:53Z","lastTransitionTime":"2025-10-02T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.327785 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.327830 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.327838 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.327852 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.327862 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:53Z","lastTransitionTime":"2025-10-02T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.430237 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.430276 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.430284 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.430298 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.430307 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:53Z","lastTransitionTime":"2025-10-02T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.532763 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.532800 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.532809 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.532822 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.532831 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:53Z","lastTransitionTime":"2025-10-02T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.636067 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.636132 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.636148 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.636171 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.636186 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:53Z","lastTransitionTime":"2025-10-02T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.738268 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.738314 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.738325 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.738338 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.738347 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:53Z","lastTransitionTime":"2025-10-02T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.840778 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.840818 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.840828 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.840846 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.840857 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:53Z","lastTransitionTime":"2025-10-02T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.880675 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.880732 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:53 crc kubenswrapper[4766]: E1002 10:52:53.880874 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:53 crc kubenswrapper[4766]: E1002 10:52:53.881062 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.943953 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.944452 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.944466 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.944490 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:53 crc kubenswrapper[4766]: I1002 10:52:53.944521 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:53Z","lastTransitionTime":"2025-10-02T10:52:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.046223 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.046259 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.046270 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.046285 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.046297 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:54Z","lastTransitionTime":"2025-10-02T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.148687 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.148727 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.148741 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.148758 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.148767 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:54Z","lastTransitionTime":"2025-10-02T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.251524 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.251566 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.251575 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.251592 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.251604 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:54Z","lastTransitionTime":"2025-10-02T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.354346 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.354378 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.354387 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.354400 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.354408 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:54Z","lastTransitionTime":"2025-10-02T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.456969 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.457007 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.457018 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.457033 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.457045 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:54Z","lastTransitionTime":"2025-10-02T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.561339 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.561398 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.561415 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.561435 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.561445 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:54Z","lastTransitionTime":"2025-10-02T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.663603 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.663660 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.663673 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.663690 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.663701 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:54Z","lastTransitionTime":"2025-10-02T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.766815 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.766888 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.766905 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.766930 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.766949 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:54Z","lastTransitionTime":"2025-10-02T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.869804 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.869853 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.869866 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.869884 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.869898 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:54Z","lastTransitionTime":"2025-10-02T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.880903 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:54 crc kubenswrapper[4766]: E1002 10:52:54.881043 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.881103 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:54 crc kubenswrapper[4766]: E1002 10:52:54.881319 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.972748 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.972826 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.972841 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.972864 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:54 crc kubenswrapper[4766]: I1002 10:52:54.972881 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:54Z","lastTransitionTime":"2025-10-02T10:52:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.076202 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.076279 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.076300 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.076331 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.076352 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:55Z","lastTransitionTime":"2025-10-02T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.143333 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.143419 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.143443 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.143482 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.143531 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:55Z","lastTransitionTime":"2025-10-02T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:55 crc kubenswrapper[4766]: E1002 10:52:55.156880 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:55Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.161354 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.161438 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.161460 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.161492 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.161541 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:55Z","lastTransitionTime":"2025-10-02T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:55 crc kubenswrapper[4766]: E1002 10:52:55.180521 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:55Z is after 
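Every "Node became not ready" record above carries the same root cause: the kubelet finds no CNI configuration file in /etc/kubernetes/cni/net.d/, so the Ready condition stays False while the retries continue. Below is a minimal Go sketch of that check (illustrative only, not kubelet code; the directory path is taken verbatim from the message above):

package main

import (
	"fmt"
	"os"
)

func main() {
	// Directory the kubelet reports as empty in the NotReady message above.
	const cniConfDir = "/etc/kubernetes/cni/net.d"

	entries, err := os.ReadDir(cniConfDir)
	if err != nil {
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}
	if len(entries) == 0 {
		// This is the state the log reports: the network plugin has not
		// written its configuration yet, so the node cannot become Ready.
		fmt.Println("no CNI configuration files found")
		return
	}
	for _, e := range entries {
		fmt.Println("found CNI config:", e.Name())
	}
}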
2025-08-24T17:21:41Z" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.185211 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.185241 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.185250 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.185264 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.185279 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:55Z","lastTransitionTime":"2025-10-02T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:55 crc kubenswrapper[4766]: E1002 10:52:55.198629 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:55Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.201741 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.201773 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.201781 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.201794 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.201804 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:55Z","lastTransitionTime":"2025-10-02T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:55 crc kubenswrapper[4766]: E1002 10:52:55.219413 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:55Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.224026 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.224062 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.224071 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.224117 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.224126 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:55Z","lastTransitionTime":"2025-10-02T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:55 crc kubenswrapper[4766]: E1002 10:52:55.237867 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:55Z is after 
2025-08-24T17:21:41Z" Oct 02 10:52:55 crc kubenswrapper[4766]: E1002 10:52:55.237973 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.239273 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.239315 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.239330 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.239347 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.239359 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:55Z","lastTransitionTime":"2025-10-02T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.341995 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.342035 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.342044 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.342061 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.342072 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:55Z","lastTransitionTime":"2025-10-02T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
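The status-update failures above share a single root cause: the serving certificate for the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z while the node clock reads 2025-10-02T10:52:55Z, so the API server's admission call to the webhook fails TLS verification and every node-status patch is rejected until the retry count is exhausted. A minimal stdlib-Python sketch to confirm the expiry from the node itself (it assumes the webhook is still listening on that port and does not insist on TLS client authentication; no certificate validation is performed):

import ssl
from datetime import datetime, timezone

# Fetch the webhook's serving certificate in PEM form, without validating it.
# Endpoint taken from the webhook URL in the log records above. The PEM can be
# decoded with any X.509 viewer to confirm the notAfter of 2025-08-24T17:21:41Z.
pem = ssl.get_server_certificate(("127.0.0.1", 9743))
print(pem)

# The kubelet error compares notAfter against the local clock, so print that too.
print("local clock:", datetime.now(timezone.utc).isoformat())
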
Has your network provider started?"} Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.444358 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.444401 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.444412 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.444427 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.444440 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:55Z","lastTransitionTime":"2025-10-02T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.547484 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.547570 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.547583 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.547601 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.547613 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:55Z","lastTransitionTime":"2025-10-02T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.649388 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.649435 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.649445 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.649463 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.649475 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:55Z","lastTransitionTime":"2025-10-02T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.752915 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.752951 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.752961 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.752980 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.752996 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:55Z","lastTransitionTime":"2025-10-02T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.855441 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.855494 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.855535 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.855561 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.855579 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:55Z","lastTransitionTime":"2025-10-02T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.880683 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.881093 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:52:55 crc kubenswrapper[4766]: E1002 10:52:55.881795 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:52:55 crc kubenswrapper[4766]: E1002 10:52:55.881326 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.892638 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac
0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.893969 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.901703 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.910799 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.920042 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.929364 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.941802 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift
-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.950729 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"070e0a3b-5963-4adc-a4f5-020999886339\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79274b877cf8ff3109da1dc078ee5f0b19a541fb7026b110c1969e0c2d341cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae830f784230e44d237e8f6a6606c9969e54046b8010f6dcd0ca446ea264676f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f754efe5b2b5464759b533ec933dc5834a1be242592ca9375184f5fc24a72f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.957724 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.957773 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.957789 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.957805 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.957815 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:55Z","lastTransitionTime":"2025-10-02T10:52:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.962194 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.972375 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:55 crc kubenswrapper[4766]: I1002 10:52:55.985078 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff77e4fb340919ea122bf7a3ecdab638bcb0d9dde19ec12b466f14a2cf2e753f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:37Z\\\",\\\"message\\\":\\\"2025-10-02T10:51:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe\\\\n2025-10-02T10:51:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe to /host/opt/cni/bin/\\\\n2025-10-02T10:51:52Z [verbose] multus-daemon started\\\\n2025-10-02T10:51:52Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.001624 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:50Z\\\",\\\"message\\\":\\\"ervice\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.41\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:52:49.802554 6927 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-l99lx in node crc\\\\nI1002 10:52:49.802544 6927 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 10:52:49.802562 6927 services_controller.go:452] Built service openshift-machine-api/control-plane-machine-set-operator per-node LB for network=default: []services.LB{}\\\\nF1002 10:52:49.801767 6927 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:55Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.012675 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.023731 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.032154 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.042109 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.053404 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.060273 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.060312 4766 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.060320 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.060334 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.060343 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:56Z","lastTransitionTime":"2025-10-02T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.063101 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:52:56Z is after 2025-08-24T17:21:41Z" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.163161 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.163227 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.163241 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.163261 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.163284 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:56Z","lastTransitionTime":"2025-10-02T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.265885 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.265926 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.265939 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.265955 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.265967 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:56Z","lastTransitionTime":"2025-10-02T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.368455 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.368492 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.368573 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.368594 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.368603 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:56Z","lastTransitionTime":"2025-10-02T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.470771 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.470813 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.470824 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.470842 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.470856 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:56Z","lastTransitionTime":"2025-10-02T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.572991 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.573026 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.573039 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.573056 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.573071 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:56Z","lastTransitionTime":"2025-10-02T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.675602 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.675649 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.675661 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.675679 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.675691 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:56Z","lastTransitionTime":"2025-10-02T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.778581 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.778990 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.779056 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.779208 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.779288 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:56Z","lastTransitionTime":"2025-10-02T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.880876 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.880888 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:52:56 crc kubenswrapper[4766]: E1002 10:52:56.881172 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:52:56 crc kubenswrapper[4766]: E1002 10:52:56.881301 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.883913 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.883963 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.883976 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.883999 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.884014 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:56Z","lastTransitionTime":"2025-10-02T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.894698 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.987124 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.987172 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.987185 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.987205 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:56 crc kubenswrapper[4766]: I1002 10:52:56.987218 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:56Z","lastTransitionTime":"2025-10-02T10:52:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.089278 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.089329 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.089340 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.089357 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.089368 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:57Z","lastTransitionTime":"2025-10-02T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.191987 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.192030 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.192039 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.192053 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.192063 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:57Z","lastTransitionTime":"2025-10-02T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.294804 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.294838 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.294846 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.294860 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.294869 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:57Z","lastTransitionTime":"2025-10-02T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.397165 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.397221 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.397230 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.397244 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.397254 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:57Z","lastTransitionTime":"2025-10-02T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.498774 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.498807 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.498815 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.498828 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.498837 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:57Z","lastTransitionTime":"2025-10-02T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.600960 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.601022 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.601033 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.601048 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.601057 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:57Z","lastTransitionTime":"2025-10-02T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.703546 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.703579 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.703587 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.703601 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.703611 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:57Z","lastTransitionTime":"2025-10-02T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.805732 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.805767 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.805776 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.805790 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.805800 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:57Z","lastTransitionTime":"2025-10-02T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.880428 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.880491 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 10:52:57 crc kubenswrapper[4766]: E1002 10:52:57.880569 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 10:52:57 crc kubenswrapper[4766]: E1002 10:52:57.880630 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.907549 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.907599 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.907612 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.907629 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:57 crc kubenswrapper[4766]: I1002 10:52:57.907641 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:57Z","lastTransitionTime":"2025-10-02T10:52:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.010437 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.010478 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.010491 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.010528 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.010544 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:58Z","lastTransitionTime":"2025-10-02T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.112697 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.112731 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.112741 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.112756 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.112766 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:58Z","lastTransitionTime":"2025-10-02T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.214688 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.214731 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.214742 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.214767 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.214780 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:58Z","lastTransitionTime":"2025-10-02T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.318838 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.319051 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.319078 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.319108 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.319142 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:58Z","lastTransitionTime":"2025-10-02T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.421528 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.421590 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.421603 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.421621 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.421634 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:58Z","lastTransitionTime":"2025-10-02T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.524560 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.524641 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.524684 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.524716 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.524738 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:58Z","lastTransitionTime":"2025-10-02T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.627821 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.627910 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.627931 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.627967 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.627989 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:58Z","lastTransitionTime":"2025-10-02T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.730279 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.730345 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.730357 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.730391 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.730401 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:58Z","lastTransitionTime":"2025-10-02T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.833000 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.833054 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.833065 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.833082 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.833093 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:58Z","lastTransitionTime":"2025-10-02T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.880616 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.880617 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 10:52:58 crc kubenswrapper[4766]: E1002 10:52:58.880770 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e"
Oct 02 10:52:58 crc kubenswrapper[4766]: E1002 10:52:58.880850 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.936251 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.936328 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.936343 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.936366 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:58 crc kubenswrapper[4766]: I1002 10:52:58.936379 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:58Z","lastTransitionTime":"2025-10-02T10:52:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.039168 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.039213 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.039226 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.039242 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.039253 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:59Z","lastTransitionTime":"2025-10-02T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.141682 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.141743 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.141753 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.141768 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.141777 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:59Z","lastTransitionTime":"2025-10-02T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.244913 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.244969 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.244985 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.245004 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.245021 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:59Z","lastTransitionTime":"2025-10-02T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.348190 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.348244 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.348252 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.348269 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.348279 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:59Z","lastTransitionTime":"2025-10-02T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.450801 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.450858 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.450872 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.450893 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.450910 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:59Z","lastTransitionTime":"2025-10-02T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.554552 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.554599 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.554613 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.554631 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.554644 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:59Z","lastTransitionTime":"2025-10-02T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.656342 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.656372 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.656380 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.656394 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.656402 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:59Z","lastTransitionTime":"2025-10-02T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.758926 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.758962 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.758973 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.758989 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.759002 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:59Z","lastTransitionTime":"2025-10-02T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.861364 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.861412 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.861428 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.861454 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.861475 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:59Z","lastTransitionTime":"2025-10-02T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.880721 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.880800 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 10:52:59 crc kubenswrapper[4766]: E1002 10:52:59.880987 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 10:52:59 crc kubenswrapper[4766]: E1002 10:52:59.881101 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.963671 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.963720 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.963737 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.963756 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:52:59 crc kubenswrapper[4766]: I1002 10:52:59.963769 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:52:59Z","lastTransitionTime":"2025-10-02T10:52:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.066230 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.066265 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.066274 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.066302 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.066313 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:00Z","lastTransitionTime":"2025-10-02T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.169051 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.169101 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.169117 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.169145 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.169167 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:00Z","lastTransitionTime":"2025-10-02T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.271165 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.271205 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.271217 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.271231 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.271240 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:00Z","lastTransitionTime":"2025-10-02T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.373966 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.373997 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.374006 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.374038 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.374047 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:00Z","lastTransitionTime":"2025-10-02T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.476458 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.476528 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.476537 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.476552 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.476563 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:00Z","lastTransitionTime":"2025-10-02T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.579175 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.579231 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.579244 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.579263 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.579274 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:00Z","lastTransitionTime":"2025-10-02T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.681654 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.681695 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.681708 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.681726 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.681738 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:00Z","lastTransitionTime":"2025-10-02T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.784330 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.784367 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.784380 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.784396 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.784407 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:00Z","lastTransitionTime":"2025-10-02T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.880285 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.880285 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 10:53:00 crc kubenswrapper[4766]: E1002 10:53:00.880648 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e"
Oct 02 10:53:00 crc kubenswrapper[4766]: E1002 10:53:00.880694 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.886126 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.886169 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.886180 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.886194 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.886207 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:00Z","lastTransitionTime":"2025-10-02T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.987955 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.987982 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.987989 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.988001 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:00 crc kubenswrapper[4766]: I1002 10:53:00.988009 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:00Z","lastTransitionTime":"2025-10-02T10:53:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.089999 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.090150 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.090172 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.090204 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.090225 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:01Z","lastTransitionTime":"2025-10-02T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.192349 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.192415 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.192427 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.192449 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.192462 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:01Z","lastTransitionTime":"2025-10-02T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.295466 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.295526 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.295539 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.295555 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.295568 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:01Z","lastTransitionTime":"2025-10-02T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.398381 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.398424 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.398436 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.398452 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.398464 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:01Z","lastTransitionTime":"2025-10-02T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.500948 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.501004 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.501020 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.501042 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.501058 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:01Z","lastTransitionTime":"2025-10-02T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.603135 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.603184 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.603196 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.603210 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.603220 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:01Z","lastTransitionTime":"2025-10-02T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.705987 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.706040 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.706050 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.706065 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.706075 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:01Z","lastTransitionTime":"2025-10-02T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.809216 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.809266 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.809280 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.809298 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.809311 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:01Z","lastTransitionTime":"2025-10-02T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.880862 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.880884 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 10:53:01 crc kubenswrapper[4766]: E1002 10:53:01.881011 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 10:53:01 crc kubenswrapper[4766]: E1002 10:53:01.881378 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.881694 4766 scope.go:117] "RemoveContainer" containerID="a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560"
Oct 02 10:53:01 crc kubenswrapper[4766]: E1002 10:53:01.881880 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.911495 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.911556 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.911571 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.911587 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:01 crc kubenswrapper[4766]: I1002 10:53:01.911599 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:01Z","lastTransitionTime":"2025-10-02T10:53:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:02 crc kubenswrapper[4766]: I1002 10:53:02.014289 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:02 crc kubenswrapper[4766]: I1002 10:53:02.014326 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:02 crc kubenswrapper[4766]: I1002 10:53:02.014336 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:02 crc kubenswrapper[4766]: I1002 10:53:02.014392 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:02 crc kubenswrapper[4766]: I1002 10:53:02.014402 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:02Z","lastTransitionTime":"2025-10-02T10:53:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:02 crc kubenswrapper[4766]: I1002 10:53:02.880447 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 10:53:02 crc kubenswrapper[4766]: E1002 10:53:02.880614 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 10:53:02 crc kubenswrapper[4766]: I1002 10:53:02.880763 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:53:02 crc kubenswrapper[4766]: E1002 10:53:02.880930 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e"
Oct 02 10:53:03 crc kubenswrapper[4766]: I1002 10:53:03.880586 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 10:53:03 crc kubenswrapper[4766]: I1002 10:53:03.880586 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 10:53:03 crc kubenswrapper[4766]: E1002 10:53:03.880717 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 10:53:03 crc kubenswrapper[4766]: E1002 10:53:03.880782 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 10:53:04 crc kubenswrapper[4766]: I1002 10:53:04.880614 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 10:53:04 crc kubenswrapper[4766]: E1002 10:53:04.880978 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 10:53:04 crc kubenswrapper[4766]: I1002 10:53:04.880865 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:53:04 crc kubenswrapper[4766]: E1002 10:53:04.881209 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e"
Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.253524 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs\") pod \"network-metrics-daemon-klg2z\" (UID: \"6d68573a-5250-4407-8631-2199a3de7e9e\") " pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:53:05 crc kubenswrapper[4766]: E1002 10:53:05.253646 4766 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 02 10:53:05 crc kubenswrapper[4766]: E1002 10:53:05.253700 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs podName:6d68573a-5250-4407-8631-2199a3de7e9e nodeName:}" failed. No retries permitted until 2025-10-02 10:54:09.253684463 +0000 UTC m=+164.196555407 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs") pod "network-metrics-daemon-klg2z" (UID: "6d68573a-5250-4407-8631-2199a3de7e9e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.345115 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.345155 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.345171 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.345186 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.345197 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:05Z","lastTransitionTime":"2025-10-02T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 02 10:53:05 crc kubenswrapper[4766]: E1002 10:53:05.357449 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:05Z is after 
2025-08-24T17:21:41Z" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.363488 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.363596 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.363618 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.363647 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.363666 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:05Z","lastTransitionTime":"2025-10-02T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:05 crc kubenswrapper[4766]: E1002 10:53:05.378753 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:05Z is after 
2025-08-24T17:21:41Z" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.386464 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.386523 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.386535 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.386551 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.386563 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:05Z","lastTransitionTime":"2025-10-02T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:05 crc kubenswrapper[4766]: E1002 10:53:05.402066 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:05Z is after 
2025-08-24T17:21:41Z" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.407400 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.407447 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.407458 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.407473 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.407485 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:05Z","lastTransitionTime":"2025-10-02T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:05 crc kubenswrapper[4766]: E1002 10:53:05.418364 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:05Z is after 
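Each of the status-update retries above carries a byte-identical merge patch; only the timestamps advance. When reading the payload, note that it reaches the journal quoted twice: once when the patch was embedded as a string inside the error, and once more when the kubelet's structured logger quoted the whole err value. One strconv.Unquote round per quoting level recovers plain JSON. A minimal Go sketch of the unescaping step, using a short hypothetical stand-in fragment rather than the real payload:

```go
package main

import (
	"fmt"
	"strconv"
)

func main() {
	// Stand-in for a fragment of the patch as it appears inside err="..."
	// with one level of backslash escaping still applied. The real payload
	// (allocatable, capacity, conditions, images, nodeInfo, runtimeHandlers)
	// is much larger; this only demonstrates the unescaping step.
	escaped := `{\"status\":{\"conditions\":[{\"type\":\"Ready\",\"status\":\"False\"}]}}`

	// strconv.Unquote undoes exactly one level of Go string quoting per call,
	// so the surrounding double quotes must be put back first.
	plain, err := strconv.Unquote(`"` + escaped + `"`)
	if err != nil {
		panic(err)
	}
	fmt.Println(plain) // {"status":{"conditions":[{"type":"Ready","status":"False"}]}}
}
```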
2025-08-24T17:21:41Z" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.421455 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.421482 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.421490 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.421517 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.421529 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:05Z","lastTransitionTime":"2025-10-02T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:05 crc kubenswrapper[4766]: E1002 10:53:05.431918 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:05Z is after 
2025-08-24T17:21:41Z" Oct 02 10:53:05 crc kubenswrapper[4766]: E1002 10:53:05.432050 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.433260 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.433359 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.433455 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.433600 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.433688 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:05Z","lastTransitionTime":"2025-10-02T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.535979 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.536025 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.536058 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.536077 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.536091 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:05Z","lastTransitionTime":"2025-10-02T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.638454 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.638493 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.638563 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.638583 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.638598 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:05Z","lastTransitionTime":"2025-10-02T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.740968 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.741007 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.741019 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.741035 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.741044 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:05Z","lastTransitionTime":"2025-10-02T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.843668 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.843710 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.843721 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.843738 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.843749 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:05Z","lastTransitionTime":"2025-10-02T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
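The quintet of node events repeats on every sync attempt while the Ready condition stays False. The condition payload logged by setters.go is plain JSON; a reduced Go sketch that decodes it follows. The struct mirrors only the fields visible in the log line, not the full Kubernetes NodeCondition API type:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// nodeCondition covers just the keys present in the condition={...}
// payload above; an illustrative reduction, not the upstream struct.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Condition JSON copied verbatim from the setters.go record above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:05Z","lastTransitionTime":"2025-10-02T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	fmt.Printf("%s=%s (%s): %s\n", c.Type, c.Status, c.Reason, c.Message)
}
```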
Has your network provider started?"} Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.880461 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.880461 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:05 crc kubenswrapper[4766]: E1002 10:53:05.880637 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:53:05 crc kubenswrapper[4766]: E1002 10:53:05.880706 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.905829 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
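Both stuck pods fail for the reason the Ready condition already reports: there is no CNI config in /etc/kubernetes/cni/net.d/ yet, because ovnkube-controller, which would write it, is itself failing. A rough Go stand-in for that readiness check; the directory comes from the error text, while the glob patterns are assumptions rather than the kubelet's actual probe:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory taken from the kubelet error above; on this node it stays
	// empty until the network plugin writes its config, hence NotReady.
	dir := "/etc/kubernetes/cni/net.d"
	var confs []string
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		m, _ := filepath.Glob(filepath.Join(dir, pat))
		confs = append(confs, m...)
	}
	if len(confs) == 0 {
		fmt.Println("no CNI configuration file found - network plugin not ready")
		os.Exit(1)
	}
	fmt.Println("CNI configs:", confs)
}
```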
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.926703 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.937490 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.945962 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.945988 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.945997 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.946010 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.946020 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:05Z","lastTransitionTime":"2025-10-02T10:53:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
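The same pair of timestamps recurs in every failed webhook call. A two-line Go check of how stale the certificate was at the time of these records, with both timestamps copied verbatim from the errors above:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// notAfter and "current time" exactly as printed in the x509 errors.
	notAfter, _ := time.Parse(time.RFC3339, "2025-08-24T17:21:41Z")
	now, _ := time.Parse(time.RFC3339, "2025-10-02T10:53:05Z")
	fmt.Printf("certificate expired %s before these calls\n", now.Sub(notAfter))
}
```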
Has your network provider started?"} Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.954791 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:50Z\\\",\\\"message\\\":\\\"ervice\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.41\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:52:49.802554 6927 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-l99lx in node crc\\\\nI1002 10:52:49.802544 6927 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 10:52:49.802562 6927 services_controller.go:452] Built service openshift-machine-api/control-plane-machine-set-operator per-node LB for network=default: []services.LB{}\\\\nF1002 10:52:49.801767 6927 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.964197 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd9b14-a240-4037-8252-b5723195f266\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2863522cb0c674d69bc44013b1f0ee9f9adb918a9119fc4d05d3b19cf4ed73e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21316d8de77cacc4f5ec3d1ea00e5d572853ccd2796b7deaed6cb4e8a450f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21316d8de77cacc4f5ec3d1ea00e5d572853ccd2796b7deaed6cb4e8a450f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.976577 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:05 crc kubenswrapper[4766]: I1002 10:53:05.990997 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"070e0a3b-5963-4adc-a4f5-020999886339\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79274b877cf8ff3109da1dc078ee5f0b19a541fb7026b110c1969e0c2d341cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae830f784230e44d237e8f6a6606c9969e54046b8010f6dcd0ca446ea264676f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f754efe5b2b5464759b533ec933dc5834a1be242592ca9375184f5fc24a72f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:05Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.003278 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:06Z is after 
2025-08-24T17:21:41Z" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.013324 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.025436 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff77e4fb340919ea122bf7a3ecdab638bcb0d9dde19ec12b466f14a2cf2e753f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:37Z\\\",\\\"message\\\":\\\"2025-10-02T10:51:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe\\\\n2025-10-02T10:51:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe to /host/opt/cni/bin/\\\\n2025-10-02T10:51:52Z [verbose] multus-daemon started\\\\n2025-10-02T10:51:52Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.035352 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.048738 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.048776 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.048787 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.048803 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.048814 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:06Z","lastTransitionTime":"2025-10-02T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.053260 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"793605fd-5a26-4f9e-9d07-98bde1606926\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51b2f1d11586bb63a7a938d67ffe2d79e0737bd5cfd5843516729a47bb8e10dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81594c415f91899483dd0aaa9b4c7579954111d17a09fe4cd1772a1b7e30e828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fec99f47c0eecc37f54abcdeb52db64e8c976a72f51715f62886146e0828a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e713fc811a7629ea42b01bdb34bdfbe8a623903dc45e3c25ace5c4b3f1ab478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35460bafc4d9ad68045d3879f7dba6690b2dca730672f61bafa0e3a8d82c02c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87cf73671615c0ee67ed0a4e457f8c298516c3abead405cbef99b7925552b974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87cf73671615c0ee67ed0a4e457f8c298516c3abead405cbef99b7925552b974\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://371309b1c4a030c7bd0c872fa5b6afe2aac6f7a2a1bf379c5b11533e142aa72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371309b1c4a030c7bd0c872fa5b6afe2aac6f7a2a1bf379c5b11533e142aa72a\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-10-02T10:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://be763462d66bc78259e0217785ee9280b921fb1da9d576731b0a68d82a0fe3e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be763462d66bc78259e0217785ee9280b921fb1da9d576731b0a68d82a0fe3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.063349 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.075070 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},
{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.093363 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.105830 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.120830 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.132076 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.141678 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:06Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.151188 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.151230 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.151241 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.151258 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.151269 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:06Z","lastTransitionTime":"2025-10-02T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.253772 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.253828 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.253841 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.253858 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.253871 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:06Z","lastTransitionTime":"2025-10-02T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.355849 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.355910 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.355926 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.355945 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.355958 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:06Z","lastTransitionTime":"2025-10-02T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.458054 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.458097 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.458109 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.458151 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.458166 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:06Z","lastTransitionTime":"2025-10-02T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.561027 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.561084 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.561093 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.561108 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.561118 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:06Z","lastTransitionTime":"2025-10-02T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.664474 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.665226 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.665326 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.665438 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.665579 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:06Z","lastTransitionTime":"2025-10-02T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.768036 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.768664 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.768771 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.769204 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.769310 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:06Z","lastTransitionTime":"2025-10-02T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.872194 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.872243 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.872252 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.872269 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.872281 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:06Z","lastTransitionTime":"2025-10-02T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.880623 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.880691 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 10:53:06 crc kubenswrapper[4766]: E1002 10:53:06.880824 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e"
Oct 02 10:53:06 crc kubenswrapper[4766]: E1002 10:53:06.881119 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.975469 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.975544 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.975556 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.975574 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:06 crc kubenswrapper[4766]: I1002 10:53:06.975586 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:06Z","lastTransitionTime":"2025-10-02T10:53:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.077753 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.077796 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.077809 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.077827 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.077839 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:07Z","lastTransitionTime":"2025-10-02T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.181283 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.181313 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.181323 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.181343 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.181371 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:07Z","lastTransitionTime":"2025-10-02T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.284576 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.284612 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.284623 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.284638 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.284649 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:07Z","lastTransitionTime":"2025-10-02T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.386441 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.386489 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.386518 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.386539 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.386552 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:07Z","lastTransitionTime":"2025-10-02T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.489049 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.489103 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.489115 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.489140 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.489183 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:07Z","lastTransitionTime":"2025-10-02T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.592162 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.592197 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.592205 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.592220 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.592231 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:07Z","lastTransitionTime":"2025-10-02T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.694895 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.694943 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.694957 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.694974 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.694994 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:07Z","lastTransitionTime":"2025-10-02T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.798959 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.798997 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.799008 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.799024 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.799035 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:07Z","lastTransitionTime":"2025-10-02T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.880928 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.881006 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 10:53:07 crc kubenswrapper[4766]: E1002 10:53:07.881142 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 10:53:07 crc kubenswrapper[4766]: E1002 10:53:07.884276 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.901706 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.901743 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.901753 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.901768 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:07 crc kubenswrapper[4766]: I1002 10:53:07.901779 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:07Z","lastTransitionTime":"2025-10-02T10:53:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.004985 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.005045 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.005063 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.005089 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.005107 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:08Z","lastTransitionTime":"2025-10-02T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.108274 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.108590 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.108739 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.109059 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.109146 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:08Z","lastTransitionTime":"2025-10-02T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.211469 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.211529 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.211556 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.211573 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.211585 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:08Z","lastTransitionTime":"2025-10-02T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.313712 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.313749 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.313760 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.313775 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.313786 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:08Z","lastTransitionTime":"2025-10-02T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.416072 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.416405 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.416478 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.416579 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.416648 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:08Z","lastTransitionTime":"2025-10-02T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.518369 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.518725 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.518831 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.518937 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.519025 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:08Z","lastTransitionTime":"2025-10-02T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.621375 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.621413 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.621423 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.621438 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.621449 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:08Z","lastTransitionTime":"2025-10-02T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.723941 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.724193 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.724262 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.724425 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.724518 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:08Z","lastTransitionTime":"2025-10-02T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.827240 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.827274 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.827284 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.827301 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.827310 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:08Z","lastTransitionTime":"2025-10-02T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.881013 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.881082 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 10:53:08 crc kubenswrapper[4766]: E1002 10:53:08.881162 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e"
Oct 02 10:53:08 crc kubenswrapper[4766]: E1002 10:53:08.881267 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.930549 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.930881 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.930982 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.931088 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:08 crc kubenswrapper[4766]: I1002 10:53:08.931175 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:08Z","lastTransitionTime":"2025-10-02T10:53:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.033609 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.033656 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.033667 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.033686 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.033697 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:09Z","lastTransitionTime":"2025-10-02T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.136420 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.136812 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.136904 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.136996 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.137081 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:09Z","lastTransitionTime":"2025-10-02T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.240110 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.240173 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.240200 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.240230 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.240253 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:09Z","lastTransitionTime":"2025-10-02T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.342892 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.342949 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.342963 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.342986 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.343000 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:09Z","lastTransitionTime":"2025-10-02T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.445204 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.445545 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.445654 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.445738 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.445802 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:09Z","lastTransitionTime":"2025-10-02T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.549407 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.549460 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.549475 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.549529 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.549548 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:09Z","lastTransitionTime":"2025-10-02T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.652818 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.653334 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.653480 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.653654 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.653795 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:09Z","lastTransitionTime":"2025-10-02T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.756288 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.756340 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.756352 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.756369 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.756380 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:09Z","lastTransitionTime":"2025-10-02T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.859148 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.859185 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.859195 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.859210 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.859221 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:09Z","lastTransitionTime":"2025-10-02T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.880794 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.880794 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 10:53:09 crc kubenswrapper[4766]: E1002 10:53:09.881027 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 10:53:09 crc kubenswrapper[4766]: E1002 10:53:09.881159 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.961324 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.961358 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.961369 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.961385 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:09 crc kubenswrapper[4766]: I1002 10:53:09.961396 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:09Z","lastTransitionTime":"2025-10-02T10:53:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.063205 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.063247 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.063261 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.063280 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.063291 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:10Z","lastTransitionTime":"2025-10-02T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.165377 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.165410 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.165421 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.165436 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.165448 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:10Z","lastTransitionTime":"2025-10-02T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.267217 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.267764 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.267838 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.267918 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.268019 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:10Z","lastTransitionTime":"2025-10-02T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.371327 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.371378 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.371392 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.371409 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.371422 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:10Z","lastTransitionTime":"2025-10-02T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.474338 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.474755 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.474835 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.474915 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.475043 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:10Z","lastTransitionTime":"2025-10-02T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.577349 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.577379 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.577387 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.577400 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.577408 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:10Z","lastTransitionTime":"2025-10-02T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.680172 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.680213 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.680223 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.680242 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.680254 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:10Z","lastTransitionTime":"2025-10-02T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.782442 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.782494 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.782520 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.782535 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.782547 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:10Z","lastTransitionTime":"2025-10-02T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.880695 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 10:53:10 crc kubenswrapper[4766]: E1002 10:53:10.880899 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.881171 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:53:10 crc kubenswrapper[4766]: E1002 10:53:10.881420 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.885837 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.885889 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.885906 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.885933 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.885949 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:10Z","lastTransitionTime":"2025-10-02T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.988343 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.988376 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.988385 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.988398 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:10 crc kubenswrapper[4766]: I1002 10:53:10.988409 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:10Z","lastTransitionTime":"2025-10-02T10:53:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.091438 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.091486 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.091496 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.091531 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.091542 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:11Z","lastTransitionTime":"2025-10-02T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.193425 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.193463 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.193491 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.193528 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.193543 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:11Z","lastTransitionTime":"2025-10-02T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.295712 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.295762 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.295774 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.295791 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.295804 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:11Z","lastTransitionTime":"2025-10-02T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.398561 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.398599 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.398609 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.398624 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.398634 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:11Z","lastTransitionTime":"2025-10-02T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.501372 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.501417 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.501426 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.501441 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.501450 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:11Z","lastTransitionTime":"2025-10-02T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.603894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.603934 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.603943 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.603959 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.603974 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:11Z","lastTransitionTime":"2025-10-02T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.706903 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.707176 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.707245 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.707312 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.707386 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:11Z","lastTransitionTime":"2025-10-02T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.809556 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.809600 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.809609 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.809623 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.809632 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:11Z","lastTransitionTime":"2025-10-02T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.880545 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.880693 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 10:53:11 crc kubenswrapper[4766]: E1002 10:53:11.880774 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 10:53:11 crc kubenswrapper[4766]: E1002 10:53:11.880880 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.911807 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.912038 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.912111 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.912186 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:11 crc kubenswrapper[4766]: I1002 10:53:11.912283 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:11Z","lastTransitionTime":"2025-10-02T10:53:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.015759 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.015921 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.015957 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.015998 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.016028 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:12Z","lastTransitionTime":"2025-10-02T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.119134 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.119382 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.119490 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.119609 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.119700 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:12Z","lastTransitionTime":"2025-10-02T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.222672 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.222737 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.222767 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.222797 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.222813 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:12Z","lastTransitionTime":"2025-10-02T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.326389 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.326487 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.326548 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.326593 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.326604 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:12Z","lastTransitionTime":"2025-10-02T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.428722 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.428757 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.428765 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.428778 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.428787 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:12Z","lastTransitionTime":"2025-10-02T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.531611 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.531710 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.531730 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.531751 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.531770 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:12Z","lastTransitionTime":"2025-10-02T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.634010 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.634062 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.634074 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.634094 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.634108 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:12Z","lastTransitionTime":"2025-10-02T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.736249 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.736324 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.736338 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.736354 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.736365 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:12Z","lastTransitionTime":"2025-10-02T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.838868 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.838932 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.838942 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.838991 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.839000 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:12Z","lastTransitionTime":"2025-10-02T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.881250 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.881336 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:53:12 crc kubenswrapper[4766]: E1002 10:53:12.881423 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:53:12 crc kubenswrapper[4766]: E1002 10:53:12.881538 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.941626 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.941664 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.941672 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.941709 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:12 crc kubenswrapper[4766]: I1002 10:53:12.941720 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:12Z","lastTransitionTime":"2025-10-02T10:53:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:13 crc kubenswrapper[4766]: I1002 10:53:13.044095 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:13 crc kubenswrapper[4766]: I1002 10:53:13.044148 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:13 crc kubenswrapper[4766]: I1002 10:53:13.044161 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:13 crc kubenswrapper[4766]: I1002 10:53:13.044182 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:13 crc kubenswrapper[4766]: I1002 10:53:13.044194 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:13Z","lastTransitionTime":"2025-10-02T10:53:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[... heartbeat group repeats eight more times between 10:53:13.146 and 10:53:13.864 ...]
Oct 02 10:53:13 crc kubenswrapper[4766]: I1002 10:53:13.880445 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 10:53:13 crc kubenswrapper[4766]: I1002 10:53:13.880527 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 10:53:13 crc kubenswrapper[4766]: E1002 10:53:13.880632 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 10:53:13 crc kubenswrapper[4766]: E1002 10:53:13.880712 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[... heartbeat group repeats nine more times between 10:53:13.966 and 10:53:14.787 ...]
Oct 02 10:53:14 crc kubenswrapper[4766]: I1002 10:53:14.880235 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:53:14 crc kubenswrapper[4766]: I1002 10:53:14.880292 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 10:53:14 crc kubenswrapper[4766]: E1002 10:53:14.880391 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e"
Oct 02 10:53:14 crc kubenswrapper[4766]: E1002 10:53:14.880438 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 10:53:14 crc kubenswrapper[4766]: I1002 10:53:14.881135 4766 scope.go:117] "RemoveContainer" containerID="a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560"
Oct 02 10:53:14 crc kubenswrapper[4766]: E1002 10:53:14.881300 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c"
[... heartbeat group repeats twice more at 10:53:14.890 and 10:53:14.992 ...]
[... heartbeat group repeats seven more times between 10:53:15.094 and 10:53:15.700 ...]
Oct 02 10:53:15 crc kubenswrapper[4766]: E1002 10:53:15.714298 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...container image inventory (~48 name/sizeBytes records) elided... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:15Z is after 2025-08-24T17:21:41Z"
event="NodeHasNoDiskPressure" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.718358 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.718373 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.718383 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:15Z","lastTransitionTime":"2025-10-02T10:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:15 crc kubenswrapper[4766]: E1002 10:53:15.733575 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.738153 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.738196 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.738204 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.738220 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.738230 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:15Z","lastTransitionTime":"2025-10-02T10:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:15 crc kubenswrapper[4766]: E1002 10:53:15.749259 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.753890 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.753938 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
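Every retry in this burst fails for the same root cause, stated at the end of each attempt: the serving certificate of the network-node-identity webhook on 127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-10-02T10:53:15Z. A quick way to confirm this from the node is to read the certificate straight off the listener; the sketch below is illustrative (standard library only, not taken from this log) and skips verification deliberately so the expired chain can still be inspected.

```go
// inspect_cert.go: print the validity window of the certificate served on
// 127.0.0.1:9743, the webhook endpoint named in the errors above.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		// Inspection only: we want to read the expired certificate, not
		// trust it. Never use this to bypass verification in a real client.
		InsecureSkipVerify: true,
	})
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	certs := conn.ConnectionState().PeerCertificates
	if len(certs) == 0 {
		log.Fatal("no peer certificate presented")
	}
	leaf := certs[0]
	fmt.Printf("subject:   %s\n", leaf.Subject)
	fmt.Printf("notBefore: %s\n", leaf.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", leaf.NotAfter.Format(time.RFC3339))
	fmt.Printf("expired:   %v\n", time.Now().After(leaf.NotAfter))
}
```

On a CRC cluster an expiry like this is typical after the VM has been powered off for longer than the certificate's lifetime; the kubelet keeps retrying while the cluster's certificate rotation catches up.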
event="NodeHasNoDiskPressure" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.753949 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.753969 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.753982 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:15Z","lastTransitionTime":"2025-10-02T10:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:15 crc kubenswrapper[4766]: E1002 10:53:15.771062 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.775299 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.775342 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
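For the record, the body the kubelet sends in each err="failed to patch status …" entry is a strategic merge patch against the Node object: the $setElementOrder/conditions directive pins the ordering of the conditions list, and each condition is merged into the existing list by its type key rather than replacing the whole array. A minimal sketch of that payload shape (placeholder values, standard library only):

```go
// patch_shape.go: build a strategic-merge-patch body of the same shape as
// the node status patch in this log (the values here are placeholders).
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	patch := map[string]any{
		"status": map[string]any{
			// Pins the order in which the merged "conditions" entries appear.
			"$setElementOrder/conditions": []map[string]string{
				{"type": "MemoryPressure"},
				{"type": "DiskPressure"},
				{"type": "PIDPressure"},
				{"type": "Ready"},
			},
			// Each entry is merged into the existing list by its "type" key.
			"conditions": []map[string]string{
				{"type": "Ready", "status": "False", "reason": "KubeletNotReady"},
			},
		},
	}
	body, err := json.Marshal(patch)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(body))
}
```

Because the kubelet includes the node's full image inventory in the same patch, every failed attempt re-sends several kilobytes of identical data, which is why repeated payloads dominate this log.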
event="NodeHasNoDiskPressure" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.775352 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.775368 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.775379 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:15Z","lastTransitionTime":"2025-10-02T10:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:15 crc kubenswrapper[4766]: E1002 10:53:15.788089 4766 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-02T10:53:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0c3177b4-52e1-4f6e-a9c9-0faf43eec636\\\",\\\"systemUUID\\\":\\\"d3914833-6e1d-48ec-a496-ffff0864ff9c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:15 crc kubenswrapper[4766]: E1002 10:53:15.788203 4766 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.790833 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.791016 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.791096 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.791172 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.791240 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:15Z","lastTransitionTime":"2025-10-02T10:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.880800 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:15 crc kubenswrapper[4766]: E1002 10:53:15.880957 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.881179 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:15 crc kubenswrapper[4766]: E1002 10:53:15.881634 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.892050 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"15dd9b14-a240-4037-8252-b5723195f266\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2863522cb0c674d69bc44013b1f0ee9f9adb918a9119fc4d05d3b19cf4ed73e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb21316d8de77cacc4f5ec3d1ea00e5d572853ccd2796b7deaed6cb4e8a450f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb21316d8de77cacc4f5ec3d1ea00e5d572853ccd2796b7deaed6cb4e8a450f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:15Z is after 
2025-08-24T17:21:41Z" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.894038 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.894332 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.894414 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.894520 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.894601 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:15Z","lastTransitionTime":"2025-10-02T10:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.905686 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ad24ce48-49de-4072-9cb1-f0de084fa21a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fab566a44570d6fcdc3a0e271dc87a187dca132a246df57cf88a78fd56e49ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c95228ff6d5d8ef9c1c963d63f90e80be85a49162ed6587e8cb17d120b45ec7d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a
4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c22f7192a3b3999dffbc76c5203d0c21b50b79e52d5971ba4f7adade02745bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db1b31cf5b3a3c2d5db6987c95841eaac93c85f963cc21330203294486ae64d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed40e12c744d67c37435036b84e7f2ba626901490cdb453fd06b11f831ef136a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"file observer\\\\nW1002 10:51:47.152191 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1002 10:51:47.152335 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1002 10:51:47.153128 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-160580138/tls.crt::/tmp/serving-cert-160580138/tls.key\\\\\\\"\\\\nI1002 10:51:47.459407 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1002 10:51:47.466420 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1002 10:51:47.466448 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1002 10:51:47.466467 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1002 10:51:47.466472 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1002 10:51:47.490916 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1002 10:51:47.490954 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490961 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1002 10:51:47.490986 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1002 10:51:47.490990 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1002 
10:51:47.490993 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1002 10:51:47.490997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1002 10:51:47.491079 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1002 10:51:47.494893 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15ee3c51661c04c9ed8d60e956a78a4ba1225c7444274521e32ee1dec61879df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b854c2e40f069d3440fa8895f2855fe8cd44e34cfc4d9600279411ee2aa2ff97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.917602 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"070e0a3b-5963-4adc-a4f5-020999886339\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79274b877cf8ff3109da1dc078ee5f0b19a541fb7026b110c1969e0c2d341cb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae830f784230e44d237e8f6a6606c9969e54046b8010f6dcd0ca446ea264676f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f754efe5b2b5464759b533ec933dc5834a1be242592ca9375184f5fc24a72f29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77281d574bf0fe88c50d9baa76c7a7aa65f4ad689eec88df76f25d1c2d333840\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.930896 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7ad9743a8a9488b2191e6ac9273fe1bbdc858822cbaf813a430345ac0475de7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:15Z is after 
2025-08-24T17:21:41Z" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.943010 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd6c493afddcc0c9c10a39c437aa9a97f984e9a7feb91615c289e0bb2d52d1fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.959854 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-2jxdg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6aa81c2-8c87-43df-badb-7b9dbef84ccf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff77e4fb340919ea122bf7a3ecdab638bcb0d9dde19ec12b466f14a2cf2e753f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:37Z\\\",\\\"message\\\":\\\"2025-10-02T10:51:51+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe\\\\n2025-10-02T10:51:51+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ddf0df9a-45d3-44f1-895d-08f3b6b2cebe to /host/opt/cni/bin/\\\\n2025-10-02T10:51:52Z [verbose] multus-daemon started\\\\n2025-10-02T10:51:52Z [verbose] Readiness Indicator file check\\\\n2025-10-02T10:52:37Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwgcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-2jxdg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.977889 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11cc785e-5bdc-4827-913a-4d899eb5a83c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-02T10:52:50Z\\\",\\\"message\\\":\\\"ervice\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.41\\\\\\\", Port:9443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1002 10:52:49.802554 6927 ovn.go:134] Ensuring zone local for Pod openshift-machine-config-operator/machine-config-daemon-l99lx in node crc\\\\nI1002 10:52:49.802544 6927 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1002 10:52:49.802562 6927 services_controller.go:452] Built service openshift-machine-api/control-plane-machine-set-operator per-node LB for network=default: []services.LB{}\\\\nF1002 10:52:49.801767 6927 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-02T10:52:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9fb7h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-27vgl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.997269 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.997314 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.997227 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"793605fd-5a26-4f9e-9d07-98bde1606926\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51b2f1d11586bb63a7a938d67ffe2d79e0737bd5cfd5843516729a47bb8e10dc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81594c415f91899483dd0aaa9b4c7579954111d17a09fe4cd1772a1b7e30e828\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fec99f47c0eecc37f54abcdeb52db64e8c976a72f51715f62886146e0828a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e713fc811a7629ea42b01bdb34bdfbe8a62390
3dc45e3c25ace5c4b3f1ab478\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35460bafc4d9ad68045d3879f7dba6690b2dca730672f61bafa0e3a8d82c02c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87cf73671615c0ee67ed0a4e457f8c298516c3abead405cbef99b7925552b974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87cf73671615c0ee67ed0a4e457f8c298516c3abead405cbef99b7925552b974\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://371309b1c4a030c7bd0c872fa5b6afe2aac6f7a2a1bf379c5b11533e142aa72a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://371309b1c4a030c7bd0c872fa5b6afe2aac6f7a2a1bf379c5b11533e142aa72a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://be763462d66bc78259e0217785ee9280b921fb1da9d576731b0a68d82a0fe3e6\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be763462d66bc78259e0217785ee9280b921fb1da9d576731b0a68d82a0fe3e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:15Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.997326 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.998469 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:15 crc kubenswrapper[4766]: I1002 10:53:15.998492 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:15Z","lastTransitionTime":"2025-10-02T10:53:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.009660 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.021364 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc860589929095e450a9e3db43d5c05ab74fc902663b8da319f9a31d3da9ed05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8084fd68244c0bc851102450a03c34c93d6c31b2e3a68ceb09e6e82a3c81da0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.031770 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5vgtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"317f8a30-cdef-4f82-832c-4f3bc2674379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://374c4744e2e8aab50a7eacae8f96fb9b737fb8246a9955198492c36f1bb57620\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlxkt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5vgtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.042639 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cd484f43-26b6-4e55-b872-7502e8d6e8c7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2c3e3e291d550f26731818f8bcafde1dbebdcb2a3bb0c6d9e9be734cc43029b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5pn5s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-l99lx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.056630 4766 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-4wx78" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b897c4d4-6c9c-4d4b-a684-d66c59d00190\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13945c3df6ddf4de368889a5451bc362e6dcac445d87b5763c9e065aca77f495\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4cbda612758be3d64d80215257118aeed34902fc15fa6b3d0d0fbf3ff87f14f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6b8ba340183e2fb510d58a8549aac2effbee99e41b1d6ad52013950f494f420\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1c8dc07bb26ed639f9f5006116d43a57b785fc21ed98012582df5f7357b0878f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bfa4f96dcdb5e51c4846b8addcf8fc1b4aa8fdee69019f3e1fe9edd311d5e875\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8ffa210b129e00f4b04a8803dd21a919ab995a2b91cb7ae908e371f8dd1de9d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fde89ced628efe452c27f994b207e245de13757b408bd534d5a52a644fbc14f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-02T10:51:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-02T10:51:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rx6lz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-4wx78\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.066947 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-klg2z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d68573a-5250-4407-8631-2199a3de7e9e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w2xzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-klg2z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.079065 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4cfce279-4871-4b98-ba66-7fc76ef4dd6f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a742043473028dcb5579a05dd454ce0c633b294b26d3e113112e219b0bb0b1ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d7a62981488788ce1cce5b1abfd56755e7ad43ad70f58e7c205bccd2e9f5e58\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53d0a9bc662cc1c5a49e9b331fcfc62e426294e31f43f4b177644c97a92c402a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://02252bb00d32dc75c6e349fc39ecfef247af6fdde7dbe73aa182aff34dab954a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.090417 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-w4c82" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"343775d7-8fd1-4ce4-b05c-ab27e9406a9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc231c78ad8d5e6f8b3a642e4846d2ee5439611d15c8f1ac16d9e71ea1db0a02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:51:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhcvn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase
\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:51:49Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-w4c82\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.101151 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.101193 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.101208 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.101225 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.101236 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:16Z","lastTransitionTime":"2025-10-02T10:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.103010 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.116524 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-02T10:51:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.127619 4766 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c99ede6-74b7-406b-8195-c9364efc146f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-02T10:52:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56342f7323e1879313a041af648acc5fd5f6b81ec5ad9dbf3ec8d9bd1166a8da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8319ad82b8df02fc316afdfa3089257939f6c48062f26c4e092e03f0d70c9727\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-02T10:52:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r4jpv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-02T10:52:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cll7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-02T10:53:16Z is after 2025-08-24T17:21:41Z" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.203625 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.203651 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.203659 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.203673 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.203681 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:16Z","lastTransitionTime":"2025-10-02T10:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.306446 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.306488 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.306518 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.306535 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.306545 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:16Z","lastTransitionTime":"2025-10-02T10:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.408441 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.408489 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.408531 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.408549 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.408561 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:16Z","lastTransitionTime":"2025-10-02T10:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.511839 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.511881 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.511897 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.511915 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.511929 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:16Z","lastTransitionTime":"2025-10-02T10:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.613925 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.613992 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.614008 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.614030 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.614044 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:16Z","lastTransitionTime":"2025-10-02T10:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.716519 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.716564 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.716577 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.716593 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.716621 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:16Z","lastTransitionTime":"2025-10-02T10:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.818833 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.818868 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.818876 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.818887 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.818896 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:16Z","lastTransitionTime":"2025-10-02T10:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.880809 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.880875 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:53:16 crc kubenswrapper[4766]: E1002 10:53:16.880969 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:53:16 crc kubenswrapper[4766]: E1002 10:53:16.881065 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.921361 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.921436 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.921448 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.921465 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:16 crc kubenswrapper[4766]: I1002 10:53:16.921478 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:16Z","lastTransitionTime":"2025-10-02T10:53:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.023644 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.023693 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.023705 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.023722 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.023735 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:17Z","lastTransitionTime":"2025-10-02T10:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.126275 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.126307 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.126318 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.126330 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.126339 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:17Z","lastTransitionTime":"2025-10-02T10:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.228889 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.228924 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.228933 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.228946 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.228956 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:17Z","lastTransitionTime":"2025-10-02T10:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.331583 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.331873 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.331985 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.332083 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.332210 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:17Z","lastTransitionTime":"2025-10-02T10:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.434547 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.434585 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.434593 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.434608 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.434618 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:17Z","lastTransitionTime":"2025-10-02T10:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.536002 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.536052 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.536067 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.536087 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.536103 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:17Z","lastTransitionTime":"2025-10-02T10:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.638333 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.638375 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.638386 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.638403 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.638415 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:17Z","lastTransitionTime":"2025-10-02T10:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.740593 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.740635 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.740649 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.740668 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.740683 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:17Z","lastTransitionTime":"2025-10-02T10:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.842642 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.842691 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.842703 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.842716 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.842724 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:17Z","lastTransitionTime":"2025-10-02T10:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.881767 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.881914 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:17 crc kubenswrapper[4766]: E1002 10:53:17.882049 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:53:17 crc kubenswrapper[4766]: E1002 10:53:17.882194 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.944730 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.944785 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.944793 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.944807 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:17 crc kubenswrapper[4766]: I1002 10:53:17.944816 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:17Z","lastTransitionTime":"2025-10-02T10:53:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.047889 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.047937 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.047948 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.047967 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.047980 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:18Z","lastTransitionTime":"2025-10-02T10:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.150657 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.150693 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.150702 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.150714 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.150723 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:18Z","lastTransitionTime":"2025-10-02T10:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.253119 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.253156 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.253164 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.253183 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.253197 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:18Z","lastTransitionTime":"2025-10-02T10:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.355894 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.355939 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.355946 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.355960 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.355968 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:18Z","lastTransitionTime":"2025-10-02T10:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.457979 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.458012 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.458022 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.458037 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.458048 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:18Z","lastTransitionTime":"2025-10-02T10:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.560443 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.560525 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.560541 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.560566 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.560582 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:18Z","lastTransitionTime":"2025-10-02T10:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.662460 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.662527 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.662540 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.662558 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.662570 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:18Z","lastTransitionTime":"2025-10-02T10:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.765225 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.765270 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.765280 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.765305 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.765317 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:18Z","lastTransitionTime":"2025-10-02T10:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.866980 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.867006 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.867014 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.867026 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.867036 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:18Z","lastTransitionTime":"2025-10-02T10:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.880246 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.880263 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:18 crc kubenswrapper[4766]: E1002 10:53:18.880472 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:53:18 crc kubenswrapper[4766]: E1002 10:53:18.880556 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.969407 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.969458 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.969471 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.969487 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:18 crc kubenswrapper[4766]: I1002 10:53:18.969497 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:18Z","lastTransitionTime":"2025-10-02T10:53:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.071712 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.071797 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.071807 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.071823 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.071835 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:19Z","lastTransitionTime":"2025-10-02T10:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.174475 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.174545 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.174561 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.174581 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.174594 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:19Z","lastTransitionTime":"2025-10-02T10:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.277312 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.277371 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.277385 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.277406 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.277421 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:19Z","lastTransitionTime":"2025-10-02T10:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.380396 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.380440 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.380456 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.380473 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.380483 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:19Z","lastTransitionTime":"2025-10-02T10:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.482931 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.482977 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.482994 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.483010 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.483023 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:19Z","lastTransitionTime":"2025-10-02T10:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.585667 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.585705 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.585718 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.585735 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.585746 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:19Z","lastTransitionTime":"2025-10-02T10:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.688488 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.688546 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.688555 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.688571 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.688583 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:19Z","lastTransitionTime":"2025-10-02T10:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.790857 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.790897 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.790909 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.790925 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.790937 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:19Z","lastTransitionTime":"2025-10-02T10:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.880814 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.880972 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:19 crc kubenswrapper[4766]: E1002 10:53:19.881073 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:53:19 crc kubenswrapper[4766]: E1002 10:53:19.881189 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.893118 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.893176 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.893188 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.893207 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.893221 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:19Z","lastTransitionTime":"2025-10-02T10:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.995996 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.996039 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.996075 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.996094 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:19 crc kubenswrapper[4766]: I1002 10:53:19.996108 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:19Z","lastTransitionTime":"2025-10-02T10:53:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.098576 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.098629 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.098649 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.098680 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.098696 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:20Z","lastTransitionTime":"2025-10-02T10:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.200937 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.200969 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.200979 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.200993 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.201006 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:20Z","lastTransitionTime":"2025-10-02T10:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.303275 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.303322 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.303337 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.303355 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.303365 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:20Z","lastTransitionTime":"2025-10-02T10:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.406178 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.406230 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.406245 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.406268 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.406286 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:20Z","lastTransitionTime":"2025-10-02T10:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.508580 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.508617 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.508629 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.508646 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.508658 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:20Z","lastTransitionTime":"2025-10-02T10:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.611471 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.611553 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.611566 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.611580 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.611589 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:20Z","lastTransitionTime":"2025-10-02T10:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.713543 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.713585 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.713595 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.713611 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.713622 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:20Z","lastTransitionTime":"2025-10-02T10:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.815761 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.815830 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.815862 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.815879 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.815890 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:20Z","lastTransitionTime":"2025-10-02T10:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.881312 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.881324 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:20 crc kubenswrapper[4766]: E1002 10:53:20.881484 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:53:20 crc kubenswrapper[4766]: E1002 10:53:20.881565 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.918274 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.918310 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.918323 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.918336 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:20 crc kubenswrapper[4766]: I1002 10:53:20.918345 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:20Z","lastTransitionTime":"2025-10-02T10:53:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.020641 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.020695 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.020711 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.020733 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.020749 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:21Z","lastTransitionTime":"2025-10-02T10:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.123134 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.123226 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.123254 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.123322 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.123404 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:21Z","lastTransitionTime":"2025-10-02T10:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.227644 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.227689 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.227701 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.227718 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.227731 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:21Z","lastTransitionTime":"2025-10-02T10:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.330875 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.330934 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.330946 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.330965 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.330978 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:21Z","lastTransitionTime":"2025-10-02T10:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.433689 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.433737 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.433748 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.433762 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.433771 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:21Z","lastTransitionTime":"2025-10-02T10:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.535777 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.535818 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.535879 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.535893 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.535902 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:21Z","lastTransitionTime":"2025-10-02T10:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.638273 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.638320 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.638330 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.638345 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.638357 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:21Z","lastTransitionTime":"2025-10-02T10:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.740493 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.740558 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.740575 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.740592 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.740603 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:21Z","lastTransitionTime":"2025-10-02T10:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.842742 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.842784 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.842797 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.842812 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.842829 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:21Z","lastTransitionTime":"2025-10-02T10:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.880651 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.880699 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:21 crc kubenswrapper[4766]: E1002 10:53:21.880871 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:53:21 crc kubenswrapper[4766]: E1002 10:53:21.881035 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.944327 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.944365 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.944374 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.944387 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:21 crc kubenswrapper[4766]: I1002 10:53:21.944398 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:21Z","lastTransitionTime":"2025-10-02T10:53:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:22 crc kubenswrapper[4766]: I1002 10:53:22.047345 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:22 crc kubenswrapper[4766]: I1002 10:53:22.047671 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:22 crc kubenswrapper[4766]: I1002 10:53:22.047685 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:22 crc kubenswrapper[4766]: I1002 10:53:22.047700 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:22 crc kubenswrapper[4766]: I1002 10:53:22.047709 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:22Z","lastTransitionTime":"2025-10-02T10:53:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 02 10:53:22 crc kubenswrapper[4766]: I1002 10:53:22.880926 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:53:22 crc kubenswrapper[4766]: E1002 10:53:22.881075 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e"
Oct 02 10:53:22 crc kubenswrapper[4766]: I1002 10:53:22.880937 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 10:53:22 crc kubenswrapper[4766]: E1002 10:53:22.881260 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 10:53:23 crc kubenswrapper[4766]: I1002 10:53:23.880937 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 10:53:23 crc kubenswrapper[4766]: I1002 10:53:23.881021 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 10:53:23 crc kubenswrapper[4766]: E1002 10:53:23.881092 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 10:53:23 crc kubenswrapper[4766]: E1002 10:53:23.881136 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.556680 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2jxdg_a6aa81c2-8c87-43df-badb-7b9dbef84ccf/kube-multus/1.log"
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.557227 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2jxdg_a6aa81c2-8c87-43df-badb-7b9dbef84ccf/kube-multus/0.log"
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.557280 4766 generic.go:334] "Generic (PLEG): container finished" podID="a6aa81c2-8c87-43df-badb-7b9dbef84ccf" containerID="ff77e4fb340919ea122bf7a3ecdab638bcb0d9dde19ec12b466f14a2cf2e753f" exitCode=1
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.557311 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2jxdg" event={"ID":"a6aa81c2-8c87-43df-badb-7b9dbef84ccf","Type":"ContainerDied","Data":"ff77e4fb340919ea122bf7a3ecdab638bcb0d9dde19ec12b466f14a2cf2e753f"}
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.557348 4766 scope.go:117] "RemoveContainer" containerID="da500d11366febbd34b3246b556a671942547bf3bc77761b28399a01579b1071"
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.557752 4766 scope.go:117] "RemoveContainer" containerID="ff77e4fb340919ea122bf7a3ecdab638bcb0d9dde19ec12b466f14a2cf2e753f"
Oct 02 10:53:24 crc kubenswrapper[4766]: E1002 10:53:24.557946 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-2jxdg_openshift-multus(a6aa81c2-8c87-43df-badb-7b9dbef84ccf)\"" pod="openshift-multus/multus-2jxdg" podUID="a6aa81c2-8c87-43df-badb-7b9dbef84ccf"
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.613949 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5vgtz" podStartSLOduration=97.613914685 podStartE2EDuration="1m37.613914685s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:24.601307091 +0000 UTC m=+119.544178045" watchObservedRunningTime="2025-10-02 10:53:24.613914685 +0000 UTC m=+119.556785629"
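The CrashLoopBackOff message ("back-off 10s") reflects the kubelet's per-container restart back-off, which starts at 10s and doubles on each failed restart up to a cap (300s in default kubelet builds; treat both constants here as assumptions rather than values taken from this log). A sketch of that schedule:

    // backoff.go — the restart back-off schedule implied by
    // "back-off 10s restarting failed container".
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const initial = 10 * time.Second  // first logged back-off
        const max = 300 * time.Second     // assumed default cap
        delay := initial
        for attempt := 1; attempt <= 8; attempt++ {
            fmt.Printf("restart attempt %d: wait %s\n", attempt, delay)
            delay *= 2
            if delay > max {
                delay = max
            }
        }
    }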
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.632581 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4wx78" podStartSLOduration=97.632564793 podStartE2EDuration="1m37.632564793s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:24.631968002 +0000 UTC m=+119.574838966" watchObservedRunningTime="2025-10-02 10:53:24.632564793 +0000 UTC m=+119.575435737"
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.632786 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podStartSLOduration=97.63278072 podStartE2EDuration="1m37.63278072s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:24.614314069 +0000 UTC m=+119.557185003" watchObservedRunningTime="2025-10-02 10:53:24.63278072 +0000 UTC m=+119.575651664"
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.685279 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=29.685258095000002 podStartE2EDuration="29.685258095s" podCreationTimestamp="2025-10-02 10:52:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:24.673590873 +0000 UTC m=+119.616461837" watchObservedRunningTime="2025-10-02 10:53:24.685258095 +0000 UTC m=+119.628129039"
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.697748 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=95.697729605 podStartE2EDuration="1m35.697729605s" podCreationTimestamp="2025-10-02 10:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:24.697326962 +0000 UTC m=+119.640197926" watchObservedRunningTime="2025-10-02 10:53:24.697729605 +0000 UTC m=+119.640600549"
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.706171 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-w4c82" podStartSLOduration=97.706154599 podStartE2EDuration="1m37.706154599s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:24.705206287 +0000 UTC m=+119.648077241" watchObservedRunningTime="2025-10-02 10:53:24.706154599 +0000 UTC m=+119.649025543"
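The pod_startup_latency_tracker entries report podStartSLOduration, which excludes image-pull time; with firstStartedPulling and lastFinishedPulling at the zero time, it reduces to watchObservedRunningTime minus podCreationTimestamp. A sketch reproducing the arithmetic for node-resolver-5vgtz, using the timestamps logged above:

    // sloduration.go — podStartSLOduration arithmetic for node-resolver-5vgtz.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        created, _ := time.Parse(time.RFC3339, "2025-10-02T10:51:47Z")
        running, _ := time.Parse(time.RFC3339, "2025-10-02T10:53:24.613914685Z")
        fmt.Println(running.Sub(created)) // 1m37.613914685s, matching the log
    }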
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.732591 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cll7n" podStartSLOduration=97.732574877 podStartE2EDuration="1m37.732574877s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:24.716960692 +0000 UTC m=+119.659831646" watchObservedRunningTime="2025-10-02 10:53:24.732574877 +0000 UTC m=+119.675445821"
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.774978 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=65.774960133 podStartE2EDuration="1m5.774960133s" podCreationTimestamp="2025-10-02 10:52:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:24.758427397 +0000 UTC m=+119.701298341" watchObservedRunningTime="2025-10-02 10:53:24.774960133 +0000 UTC m=+119.717831077"
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.842821 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=28.842799796 podStartE2EDuration="28.842799796s" podCreationTimestamp="2025-10-02 10:52:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:24.842293659 +0000 UTC m=+119.785164603" watchObservedRunningTime="2025-10-02 10:53:24.842799796 +0000 UTC m=+119.785670740"
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.856244 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=96.856227307 podStartE2EDuration="1m36.856227307s" podCreationTimestamp="2025-10-02 10:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:24.856062522 +0000 UTC m=+119.798933466" watchObservedRunningTime="2025-10-02 10:53:24.856227307 +0000 UTC m=+119.799098251"
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.880361 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.880412 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:53:24 crc kubenswrapper[4766]: E1002 10:53:24.880495 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 02 10:53:24 crc kubenswrapper[4766]: E1002 10:53:24.880641 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e"
pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.925134 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.925175 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.925187 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.925206 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:24 crc kubenswrapper[4766]: I1002 10:53:24.925218 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:24Z","lastTransitionTime":"2025-10-02T10:53:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.027687 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.027731 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.027740 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.027752 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.027761 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:25Z","lastTransitionTime":"2025-10-02T10:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.130474 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.130587 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.130603 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.130621 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.130633 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:25Z","lastTransitionTime":"2025-10-02T10:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.232441 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.232488 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.232523 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.232540 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.232552 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:25Z","lastTransitionTime":"2025-10-02T10:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.335469 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.335561 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.335576 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.335594 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.335607 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:25Z","lastTransitionTime":"2025-10-02T10:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.438104 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.438138 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.438146 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.438159 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.438169 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:25Z","lastTransitionTime":"2025-10-02T10:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.540020 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.540064 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.540077 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.540092 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.540104 4766 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-02T10:53:25Z","lastTransitionTime":"2025-10-02T10:53:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.561736 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2jxdg_a6aa81c2-8c87-43df-badb-7b9dbef84ccf/kube-multus/1.log"
Oct 02 10:53:25 crc kubenswrapper[4766]: E1002 10:53:25.846075 4766 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.880237 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 10:53:25 crc kubenswrapper[4766]: E1002 10:53:25.881182 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 02 10:53:25 crc kubenswrapper[4766]: I1002 10:53:25.881276 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 10:53:25 crc kubenswrapper[4766]: E1002 10:53:25.881392 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 02 10:53:25 crc kubenswrapper[4766]: E1002 10:53:25.990727 4766 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.019490 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz"]
Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.019886 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz"
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.022269 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.022403 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.022432 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.022523 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.169089 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4e6f7d36-40f8-4faa-93ba-40a535c59581-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-p64cz\" (UID: \"4e6f7d36-40f8-4faa-93ba-40a535c59581\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.169154 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e6f7d36-40f8-4faa-93ba-40a535c59581-service-ca\") pod \"cluster-version-operator-5c965bbfc6-p64cz\" (UID: \"4e6f7d36-40f8-4faa-93ba-40a535c59581\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.169195 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e6f7d36-40f8-4faa-93ba-40a535c59581-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-p64cz\" (UID: \"4e6f7d36-40f8-4faa-93ba-40a535c59581\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.169215 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e6f7d36-40f8-4faa-93ba-40a535c59581-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-p64cz\" (UID: \"4e6f7d36-40f8-4faa-93ba-40a535c59581\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.169280 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4e6f7d36-40f8-4faa-93ba-40a535c59581-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-p64cz\" (UID: \"4e6f7d36-40f8-4faa-93ba-40a535c59581\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.270166 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4e6f7d36-40f8-4faa-93ba-40a535c59581-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-p64cz\" (UID: \"4e6f7d36-40f8-4faa-93ba-40a535c59581\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 
10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.270206 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e6f7d36-40f8-4faa-93ba-40a535c59581-service-ca\") pod \"cluster-version-operator-5c965bbfc6-p64cz\" (UID: \"4e6f7d36-40f8-4faa-93ba-40a535c59581\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.270243 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e6f7d36-40f8-4faa-93ba-40a535c59581-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-p64cz\" (UID: \"4e6f7d36-40f8-4faa-93ba-40a535c59581\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.270266 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e6f7d36-40f8-4faa-93ba-40a535c59581-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-p64cz\" (UID: \"4e6f7d36-40f8-4faa-93ba-40a535c59581\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.270290 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4e6f7d36-40f8-4faa-93ba-40a535c59581-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-p64cz\" (UID: \"4e6f7d36-40f8-4faa-93ba-40a535c59581\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.270372 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4e6f7d36-40f8-4faa-93ba-40a535c59581-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-p64cz\" (UID: \"4e6f7d36-40f8-4faa-93ba-40a535c59581\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.271255 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4e6f7d36-40f8-4faa-93ba-40a535c59581-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-p64cz\" (UID: \"4e6f7d36-40f8-4faa-93ba-40a535c59581\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.271492 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4e6f7d36-40f8-4faa-93ba-40a535c59581-service-ca\") pod \"cluster-version-operator-5c965bbfc6-p64cz\" (UID: \"4e6f7d36-40f8-4faa-93ba-40a535c59581\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.276118 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4e6f7d36-40f8-4faa-93ba-40a535c59581-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-p64cz\" (UID: \"4e6f7d36-40f8-4faa-93ba-40a535c59581\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.285392 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/4e6f7d36-40f8-4faa-93ba-40a535c59581-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-p64cz\" (UID: \"4e6f7d36-40f8-4faa-93ba-40a535c59581\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.331606 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.565121 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" event={"ID":"4e6f7d36-40f8-4faa-93ba-40a535c59581","Type":"ContainerStarted","Data":"add4e5ad807b596e8964263b91f7ab4199e356e588b5ecadcf38feb1c8227fba"} Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.565162 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" event={"ID":"4e6f7d36-40f8-4faa-93ba-40a535c59581","Type":"ContainerStarted","Data":"4d0d03291231e44abb57907baf1e8869fcc1e0f17132dda512df06e519e1a9cf"} Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.880815 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.880885 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:53:26 crc kubenswrapper[4766]: E1002 10:53:26.881029 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:53:26 crc kubenswrapper[4766]: E1002 10:53:26.881265 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:53:26 crc kubenswrapper[4766]: I1002 10:53:26.882131 4766 scope.go:117] "RemoveContainer" containerID="a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560" Oct 02 10:53:26 crc kubenswrapper[4766]: E1002 10:53:26.882351 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-27vgl_openshift-ovn-kubernetes(11cc785e-5bdc-4827-913a-4d899eb5a83c)\"" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" Oct 02 10:53:27 crc kubenswrapper[4766]: I1002 10:53:27.583473 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-p64cz" podStartSLOduration=100.583448175 podStartE2EDuration="1m40.583448175s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:27.580830057 +0000 UTC m=+122.523701061" watchObservedRunningTime="2025-10-02 10:53:27.583448175 +0000 UTC m=+122.526319149" Oct 02 10:53:27 crc kubenswrapper[4766]: I1002 10:53:27.880593 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:27 crc kubenswrapper[4766]: I1002 10:53:27.880645 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:27 crc kubenswrapper[4766]: E1002 10:53:27.880753 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:53:27 crc kubenswrapper[4766]: E1002 10:53:27.880887 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:53:28 crc kubenswrapper[4766]: I1002 10:53:28.881263 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:53:28 crc kubenswrapper[4766]: I1002 10:53:28.881358 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:28 crc kubenswrapper[4766]: E1002 10:53:28.881436 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:53:28 crc kubenswrapper[4766]: E1002 10:53:28.881604 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:53:29 crc kubenswrapper[4766]: I1002 10:53:29.880789 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:29 crc kubenswrapper[4766]: I1002 10:53:29.880860 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:29 crc kubenswrapper[4766]: E1002 10:53:29.880933 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:53:29 crc kubenswrapper[4766]: E1002 10:53:29.881050 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:53:30 crc kubenswrapper[4766]: I1002 10:53:30.880376 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:30 crc kubenswrapper[4766]: I1002 10:53:30.880401 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:53:30 crc kubenswrapper[4766]: E1002 10:53:30.880791 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:53:30 crc kubenswrapper[4766]: E1002 10:53:30.881118 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:53:30 crc kubenswrapper[4766]: E1002 10:53:30.991937 4766 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 10:53:31 crc kubenswrapper[4766]: I1002 10:53:31.881054 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:31 crc kubenswrapper[4766]: E1002 10:53:31.881198 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:53:31 crc kubenswrapper[4766]: I1002 10:53:31.881287 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:31 crc kubenswrapper[4766]: E1002 10:53:31.881446 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:53:32 crc kubenswrapper[4766]: I1002 10:53:32.881025 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:32 crc kubenswrapper[4766]: E1002 10:53:32.881165 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:53:32 crc kubenswrapper[4766]: I1002 10:53:32.881351 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:53:32 crc kubenswrapper[4766]: E1002 10:53:32.881408 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:53:33 crc kubenswrapper[4766]: I1002 10:53:33.880971 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:33 crc kubenswrapper[4766]: E1002 10:53:33.881158 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:53:33 crc kubenswrapper[4766]: I1002 10:53:33.881427 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:33 crc kubenswrapper[4766]: E1002 10:53:33.881634 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:53:34 crc kubenswrapper[4766]: I1002 10:53:34.880659 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:53:34 crc kubenswrapper[4766]: E1002 10:53:34.880796 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:53:34 crc kubenswrapper[4766]: I1002 10:53:34.880975 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:34 crc kubenswrapper[4766]: E1002 10:53:34.881023 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:53:35 crc kubenswrapper[4766]: I1002 10:53:35.880640 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:35 crc kubenswrapper[4766]: E1002 10:53:35.881480 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:53:35 crc kubenswrapper[4766]: I1002 10:53:35.881590 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:35 crc kubenswrapper[4766]: E1002 10:53:35.881773 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:53:35 crc kubenswrapper[4766]: E1002 10:53:35.992529 4766 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 10:53:36 crc kubenswrapper[4766]: I1002 10:53:36.881143 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:53:36 crc kubenswrapper[4766]: E1002 10:53:36.881292 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:53:36 crc kubenswrapper[4766]: I1002 10:53:36.881146 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:36 crc kubenswrapper[4766]: E1002 10:53:36.881368 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:53:37 crc kubenswrapper[4766]: I1002 10:53:37.881366 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:37 crc kubenswrapper[4766]: I1002 10:53:37.881378 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:37 crc kubenswrapper[4766]: E1002 10:53:37.881590 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:53:37 crc kubenswrapper[4766]: E1002 10:53:37.881640 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:53:38 crc kubenswrapper[4766]: I1002 10:53:38.880531 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:53:38 crc kubenswrapper[4766]: I1002 10:53:38.880599 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:38 crc kubenswrapper[4766]: E1002 10:53:38.881217 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:53:38 crc kubenswrapper[4766]: E1002 10:53:38.881280 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:53:38 crc kubenswrapper[4766]: I1002 10:53:38.881911 4766 scope.go:117] "RemoveContainer" containerID="a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560" Oct 02 10:53:39 crc kubenswrapper[4766]: I1002 10:53:39.606027 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovnkube-controller/3.log" Oct 02 10:53:39 crc kubenswrapper[4766]: I1002 10:53:39.608421 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerStarted","Data":"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8"} Oct 02 10:53:39 crc kubenswrapper[4766]: I1002 10:53:39.608860 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 10:53:39 crc kubenswrapper[4766]: I1002 10:53:39.633925 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podStartSLOduration=112.633908744 podStartE2EDuration="1m52.633908744s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:39.633438119 +0000 UTC m=+134.576309083" watchObservedRunningTime="2025-10-02 10:53:39.633908744 +0000 UTC m=+134.576779688" Oct 02 10:53:39 crc kubenswrapper[4766]: I1002 10:53:39.868182 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-klg2z"] Oct 02 10:53:39 crc kubenswrapper[4766]: I1002 10:53:39.868327 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:53:39 crc kubenswrapper[4766]: E1002 10:53:39.868449 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:53:39 crc kubenswrapper[4766]: I1002 10:53:39.881055 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:39 crc kubenswrapper[4766]: I1002 10:53:39.881261 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:39 crc kubenswrapper[4766]: E1002 10:53:39.881388 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:53:39 crc kubenswrapper[4766]: E1002 10:53:39.881494 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:53:39 crc kubenswrapper[4766]: I1002 10:53:39.881759 4766 scope.go:117] "RemoveContainer" containerID="ff77e4fb340919ea122bf7a3ecdab638bcb0d9dde19ec12b466f14a2cf2e753f" Oct 02 10:53:40 crc kubenswrapper[4766]: I1002 10:53:40.612577 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2jxdg_a6aa81c2-8c87-43df-badb-7b9dbef84ccf/kube-multus/1.log" Oct 02 10:53:40 crc kubenswrapper[4766]: I1002 10:53:40.612642 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2jxdg" event={"ID":"a6aa81c2-8c87-43df-badb-7b9dbef84ccf","Type":"ContainerStarted","Data":"1f29fcf0f6187d7194dae698016fffc20d300b88e7467e1fcd6a97ddd9243ac7"} Oct 02 10:53:40 crc kubenswrapper[4766]: I1002 10:53:40.634118 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-2jxdg" podStartSLOduration=113.634095367 podStartE2EDuration="1m53.634095367s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:40.633094484 +0000 UTC m=+135.575965458" watchObservedRunningTime="2025-10-02 10:53:40.634095367 +0000 UTC m=+135.576966331" Oct 02 10:53:40 crc kubenswrapper[4766]: I1002 10:53:40.880512 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:40 crc kubenswrapper[4766]: E1002 10:53:40.880627 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:53:40 crc kubenswrapper[4766]: E1002 10:53:40.993934 4766 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 02 10:53:41 crc kubenswrapper[4766]: I1002 10:53:41.881232 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:41 crc kubenswrapper[4766]: I1002 10:53:41.881264 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:53:41 crc kubenswrapper[4766]: E1002 10:53:41.881418 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:53:41 crc kubenswrapper[4766]: E1002 10:53:41.881533 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:53:41 crc kubenswrapper[4766]: I1002 10:53:41.881294 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:41 crc kubenswrapper[4766]: E1002 10:53:41.881609 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:53:42 crc kubenswrapper[4766]: I1002 10:53:42.880697 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:42 crc kubenswrapper[4766]: E1002 10:53:42.881009 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:53:43 crc kubenswrapper[4766]: I1002 10:53:43.880570 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:43 crc kubenswrapper[4766]: E1002 10:53:43.880720 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:53:43 crc kubenswrapper[4766]: I1002 10:53:43.880826 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:53:43 crc kubenswrapper[4766]: E1002 10:53:43.880912 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:53:43 crc kubenswrapper[4766]: I1002 10:53:43.880824 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:43 crc kubenswrapper[4766]: E1002 10:53:43.881001 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:53:44 crc kubenswrapper[4766]: I1002 10:53:44.880640 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:44 crc kubenswrapper[4766]: E1002 10:53:44.880758 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 02 10:53:45 crc kubenswrapper[4766]: I1002 10:53:45.880829 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:53:45 crc kubenswrapper[4766]: I1002 10:53:45.880846 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:45 crc kubenswrapper[4766]: E1002 10:53:45.881991 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-klg2z" podUID="6d68573a-5250-4407-8631-2199a3de7e9e" Oct 02 10:53:45 crc kubenswrapper[4766]: I1002 10:53:45.882009 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:45 crc kubenswrapper[4766]: E1002 10:53:45.882068 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 02 10:53:45 crc kubenswrapper[4766]: E1002 10:53:45.882122 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.278173 4766 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.366379 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.366962 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.371387 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.372012 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.372550 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.374768 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.374794 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.376810 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.376881 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.382255 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.383052 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.383531 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.384388 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.384834 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.385117 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.385312 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.385425 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.385608 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.385726 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.385824 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.385907 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.386011 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.386032 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.386106 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.386241 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.386669 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.387780 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-44mnp"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.388163 4766 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.389072 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qhc8r"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.390191 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.393709 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tmkn8"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.394080 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.394425 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.394741 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.395099 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.395107 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zkc9n"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.399262 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.399681 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.400150 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.400764 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.400884 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.400993 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401006 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.400851 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401468 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zllgh\" (UniqueName: \"kubernetes.io/projected/6e1d0411-55ac-4287-b19f-d6c46444434b-kube-api-access-zllgh\") pod \"machine-config-controller-84d6567774-7tk8w\" (UID: \"6e1d0411-55ac-4287-b19f-d6c46444434b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401523 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6a20802-d5f8-4b5f-8655-410dc9bd8aa7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-k8v6c\" (UID: \"c6a20802-d5f8-4b5f-8655-410dc9bd8aa7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401595 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/900aedf6-0ce4-429f-9d04-2776a8625593-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8x5jh\" (UID: \"900aedf6-0ce4-429f-9d04-2776a8625593\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401618 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff4a13b-a07e-4031-a1fb-ba29027332e8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mm6nw\" (UID: \"1ff4a13b-a07e-4031-a1fb-ba29027332e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401647 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tznn8\" (UniqueName: \"kubernetes.io/projected/fc982890-ee1e-4482-8c17-0c5b11583ce2-kube-api-access-tznn8\") pod \"machine-config-operator-74547568cd-qklg5\" (UID: \"fc982890-ee1e-4482-8c17-0c5b11583ce2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401690 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/900aedf6-0ce4-429f-9d04-2776a8625593-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8x5jh\" (UID: \"900aedf6-0ce4-429f-9d04-2776a8625593\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401712 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6a20802-d5f8-4b5f-8655-410dc9bd8aa7-config\") pod \"kube-controller-manager-operator-78b949d7b-k8v6c\" (UID: \"c6a20802-d5f8-4b5f-8655-410dc9bd8aa7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401757 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/900aedf6-0ce4-429f-9d04-2776a8625593-config\") pod \"kube-apiserver-operator-766d6c64bb-8x5jh\" (UID: \"900aedf6-0ce4-429f-9d04-2776a8625593\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401783 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/786673c7-6fb8-4b0d-864a-ea29fa681de6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g6jjj\" (UID: \"786673c7-6fb8-4b0d-864a-ea29fa681de6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401819 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff4a13b-a07e-4031-a1fb-ba29027332e8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mm6nw\" (UID: \"1ff4a13b-a07e-4031-a1fb-ba29027332e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401843 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ff4a13b-a07e-4031-a1fb-ba29027332e8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mm6nw\" (UID: \"1ff4a13b-a07e-4031-a1fb-ba29027332e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401864 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e1d0411-55ac-4287-b19f-d6c46444434b-proxy-tls\") pod \"machine-config-controller-84d6567774-7tk8w\" (UID: \"6e1d0411-55ac-4287-b19f-d6c46444434b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401885 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fc982890-ee1e-4482-8c17-0c5b11583ce2-images\") pod \"machine-config-operator-74547568cd-qklg5\" (UID: \"fc982890-ee1e-4482-8c17-0c5b11583ce2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401909 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e1d0411-55ac-4287-b19f-d6c46444434b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7tk8w\" (UID: \"6e1d0411-55ac-4287-b19f-d6c46444434b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401933 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p4d4\" (UniqueName: \"kubernetes.io/projected/786673c7-6fb8-4b0d-864a-ea29fa681de6-kube-api-access-8p4d4\") pod \"ingress-operator-5b745b69d9-g6jjj\" (UID: \"786673c7-6fb8-4b0d-864a-ea29fa681de6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401975 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc982890-ee1e-4482-8c17-0c5b11583ce2-proxy-tls\") pod \"machine-config-operator-74547568cd-qklg5\" (UID: \"fc982890-ee1e-4482-8c17-0c5b11583ce2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.401996 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/786673c7-6fb8-4b0d-864a-ea29fa681de6-trusted-ca\") pod \"ingress-operator-5b745b69d9-g6jjj\" (UID: \"786673c7-6fb8-4b0d-864a-ea29fa681de6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.402015 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6a20802-d5f8-4b5f-8655-410dc9bd8aa7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-k8v6c\" (UID: \"c6a20802-d5f8-4b5f-8655-410dc9bd8aa7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.402051 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/786673c7-6fb8-4b0d-864a-ea29fa681de6-metrics-tls\") pod \"ingress-operator-5b745b69d9-g6jjj\" (UID: \"786673c7-6fb8-4b0d-864a-ea29fa681de6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.402072 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-config-volume\") pod \"collect-profiles-29323365-wmmns\" (UID: \"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.402094 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5zcf\" (UniqueName: \"kubernetes.io/projected/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-kube-api-access-n5zcf\") pod \"collect-profiles-29323365-wmmns\" (UID: \"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.402115 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc982890-ee1e-4482-8c17-0c5b11583ce2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qklg5\" (UID: \"fc982890-ee1e-4482-8c17-0c5b11583ce2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.402140 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-secret-volume\") pod \"collect-profiles-29323365-wmmns\" (UID: \"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.403194 4766 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.404707 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.406566 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-d5v2m"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.407205 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.407350 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zvx8j"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.408078 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zvx8j" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.410567 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k2sqh"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.411080 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.411467 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.411770 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.411913 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.412361 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.415177 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.415269 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.415198 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.415559 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.416178 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.416309 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.416595 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.416882 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.420769 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.420999 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.421256 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.421597 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.421670 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.421970 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.422522 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.422625 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.424192 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.425321 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.426533 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.427810 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.427810 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.429371 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.437774 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.437954 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.438001 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.437965 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.446789 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.447210 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.447408 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.447521 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.448818 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.448936 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.449045 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.449795 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.451871 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jmttf"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.454536 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmttf" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.449918 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.449994 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450131 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450168 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450214 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450237 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.455305 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450276 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450317 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450352 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450390 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450410 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450426 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450592 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.455886 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450622 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450747 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450805 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450850 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450898 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450930 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450947 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450966 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.450984 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.451014 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.451034 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.451044 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.451073 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.451116 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.451149 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.451192 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.451230 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.451269 4766 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.451728 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.451773 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.452023 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.452170 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.452246 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.452542 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.452657 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.452765 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.452828 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.452897 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.458300 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.458475 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.458608 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.458474 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.458725 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.458837 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.458837 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.458898 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.458978 
4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.459061 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.459705 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.461262 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.462031 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.462817 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-xqcd7"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.463039 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.463540 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-xqcd7" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.467520 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.468089 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nkh48"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.468533 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nkh48" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.468551 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-lrsk8"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.468683 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.469186 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lrsk8" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.469491 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.469996 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.470119 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.471657 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwrx"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.472308 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwrx" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.473099 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.473955 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-knkk5"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.474529 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.474805 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nm9mf"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.479748 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rlrhw"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.479992 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nm9mf" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.481383 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qprrh"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.482838 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rlrhw" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.489051 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.489296 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.497395 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.500200 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m2kx2"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.501316 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vkm2v"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.502330 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.502353 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-8z2x2"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.502934 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ttsj7"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.503693 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.504753 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qprrh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.505332 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ttsj7" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.505857 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-q2szc"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.505882 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.506098 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.536869 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/786673c7-6fb8-4b0d-864a-ea29fa681de6-metrics-tls\") pod \"ingress-operator-5b745b69d9-g6jjj\" (UID: \"786673c7-6fb8-4b0d-864a-ea29fa681de6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.536915 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-config-volume\") pod \"collect-profiles-29323365-wmmns\" (UID: \"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.536945 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8fb9e589-a259-4ff9-9d1b-198b57fb179b-metrics-tls\") pod \"dns-default-zvx8j\" (UID: \"8fb9e589-a259-4ff9-9d1b-198b57fb179b\") " pod="openshift-dns/dns-default-zvx8j" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.536963 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-encryption-config\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.536994 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5zcf\" (UniqueName: \"kubernetes.io/projected/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-kube-api-access-n5zcf\") pod \"collect-profiles-29323365-wmmns\" (UID: \"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537009 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc982890-ee1e-4482-8c17-0c5b11583ce2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qklg5\" (UID: \"fc982890-ee1e-4482-8c17-0c5b11583ce2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537030 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-audit\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537046 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf5sz\" (UniqueName: \"kubernetes.io/projected/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-kube-api-access-hf5sz\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537074 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/641e2cb7-16f3-4339-ace8-1a5d6b921841-serving-cert\") pod \"console-operator-58897d9998-nm9mf\" (UID: \"641e2cb7-16f3-4339-ace8-1a5d6b921841\") " pod="openshift-console-operator/console-operator-58897d9998-nm9mf" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537090 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fb9e589-a259-4ff9-9d1b-198b57fb179b-config-volume\") pod \"dns-default-zvx8j\" (UID: \"8fb9e589-a259-4ff9-9d1b-198b57fb179b\") " pod="openshift-dns/dns-default-zvx8j" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537105 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-etcd-client\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537123 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-secret-volume\") pod \"collect-profiles-29323365-wmmns\" (UID: \"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537141 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-config\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537168 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkr55\" (UniqueName: \"kubernetes.io/projected/6d87787d-4605-4895-a5bd-a3820dd38fae-kube-api-access-dkr55\") pod \"olm-operator-6b444d44fb-qz9xl\" (UID: \"6d87787d-4605-4895-a5bd-a3820dd38fae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537188 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143c6751-acab-4e56-9e54-b0e4dc6ae562-config\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh" Oct 02 10:53:46 crc 
kubenswrapper[4766]: I1002 10:53:46.537209 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae-machine-approver-tls\") pod \"machine-approver-56656f9798-cpxbq\" (UID: \"730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537230 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zllgh\" (UniqueName: \"kubernetes.io/projected/6e1d0411-55ac-4287-b19f-d6c46444434b-kube-api-access-zllgh\") pod \"machine-config-controller-84d6567774-7tk8w\" (UID: \"6e1d0411-55ac-4287-b19f-d6c46444434b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537247 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6a20802-d5f8-4b5f-8655-410dc9bd8aa7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-k8v6c\" (UID: \"c6a20802-d5f8-4b5f-8655-410dc9bd8aa7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537263 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-node-pullsecrets\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537282 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/900aedf6-0ce4-429f-9d04-2776a8625593-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8x5jh\" (UID: \"900aedf6-0ce4-429f-9d04-2776a8625593\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537299 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537322 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff4a13b-a07e-4031-a1fb-ba29027332e8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mm6nw\" (UID: \"1ff4a13b-a07e-4031-a1fb-ba29027332e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537340 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtsfw\" (UniqueName: \"kubernetes.io/projected/ff3698e3-0db7-4a46-8244-ec9486c9ed48-kube-api-access-dtsfw\") pod \"package-server-manager-789f6589d5-z5bg6\" (UID: \"ff3698e3-0db7-4a46-8244-ec9486c9ed48\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6" Oct 02 10:53:46 crc 
kubenswrapper[4766]: I1002 10:53:46.537358 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae-auth-proxy-config\") pod \"machine-approver-56656f9798-cpxbq\" (UID: \"730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537375 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff3698e3-0db7-4a46-8244-ec9486c9ed48-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-z5bg6\" (UID: \"ff3698e3-0db7-4a46-8244-ec9486c9ed48\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537395 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/641e2cb7-16f3-4339-ace8-1a5d6b921841-trusted-ca\") pod \"console-operator-58897d9998-nm9mf\" (UID: \"641e2cb7-16f3-4339-ace8-1a5d6b921841\") " pod="openshift-console-operator/console-operator-58897d9998-nm9mf" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537410 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-serving-cert\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537426 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f27jk\" (UniqueName: \"kubernetes.io/projected/641e2cb7-16f3-4339-ace8-1a5d6b921841-kube-api-access-f27jk\") pod \"console-operator-58897d9998-nm9mf\" (UID: \"641e2cb7-16f3-4339-ace8-1a5d6b921841\") " pod="openshift-console-operator/console-operator-58897d9998-nm9mf" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537441 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6cht\" (UniqueName: \"kubernetes.io/projected/8fb9e589-a259-4ff9-9d1b-198b57fb179b-kube-api-access-k6cht\") pod \"dns-default-zvx8j\" (UID: \"8fb9e589-a259-4ff9-9d1b-198b57fb179b\") " pod="openshift-dns/dns-default-zvx8j" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537460 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tznn8\" (UniqueName: \"kubernetes.io/projected/fc982890-ee1e-4482-8c17-0c5b11583ce2-kube-api-access-tznn8\") pod \"machine-config-operator-74547568cd-qklg5\" (UID: \"fc982890-ee1e-4482-8c17-0c5b11583ce2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537476 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-image-import-ca\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537491 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/143c6751-acab-4e56-9e54-b0e4dc6ae562-etcd-service-ca\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537520 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/143c6751-acab-4e56-9e54-b0e4dc6ae562-etcd-client\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537567 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900aedf6-0ce4-429f-9d04-2776a8625593-config\") pod \"kube-apiserver-operator-766d6c64bb-8x5jh\" (UID: \"900aedf6-0ce4-429f-9d04-2776a8625593\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537573 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537598 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537583 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/900aedf6-0ce4-429f-9d04-2776a8625593-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8x5jh\" (UID: \"900aedf6-0ce4-429f-9d04-2776a8625593\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537646 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6a20802-d5f8-4b5f-8655-410dc9bd8aa7-config\") pod \"kube-controller-manager-operator-78b949d7b-k8v6c\" (UID: \"c6a20802-d5f8-4b5f-8655-410dc9bd8aa7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537716 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/786673c7-6fb8-4b0d-864a-ea29fa681de6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g6jjj\" (UID: \"786673c7-6fb8-4b0d-864a-ea29fa681de6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537745 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae-config\") pod \"machine-approver-56656f9798-cpxbq\" (UID: \"730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537778 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8ljt\" (UniqueName: 
\"kubernetes.io/projected/143c6751-acab-4e56-9e54-b0e4dc6ae562-kube-api-access-f8ljt\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537802 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/df3eaaf0-38f3-4334-adaa-bcdc6b4409bc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9hwrx\" (UID: \"df3eaaf0-38f3-4334-adaa-bcdc6b4409bc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwrx" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537826 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-audit-dir\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537844 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/143c6751-acab-4e56-9e54-b0e4dc6ae562-serving-cert\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537866 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d87787d-4605-4895-a5bd-a3820dd38fae-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qz9xl\" (UID: \"6d87787d-4605-4895-a5bd-a3820dd38fae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537889 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff4a13b-a07e-4031-a1fb-ba29027332e8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mm6nw\" (UID: \"1ff4a13b-a07e-4031-a1fb-ba29027332e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537907 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ff4a13b-a07e-4031-a1fb-ba29027332e8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mm6nw\" (UID: \"1ff4a13b-a07e-4031-a1fb-ba29027332e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537923 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dql4\" (UniqueName: \"kubernetes.io/projected/730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae-kube-api-access-4dql4\") pod \"machine-approver-56656f9798-cpxbq\" (UID: \"730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537951 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/6e1d0411-55ac-4287-b19f-d6c46444434b-proxy-tls\") pod \"machine-config-controller-84d6567774-7tk8w\" (UID: \"6e1d0411-55ac-4287-b19f-d6c46444434b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537967 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fc982890-ee1e-4482-8c17-0c5b11583ce2-images\") pod \"machine-config-operator-74547568cd-qklg5\" (UID: \"fc982890-ee1e-4482-8c17-0c5b11583ce2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.537982 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e1d0411-55ac-4287-b19f-d6c46444434b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7tk8w\" (UID: \"6e1d0411-55ac-4287-b19f-d6c46444434b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.538001 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p4d4\" (UniqueName: \"kubernetes.io/projected/786673c7-6fb8-4b0d-864a-ea29fa681de6-kube-api-access-8p4d4\") pod \"ingress-operator-5b745b69d9-g6jjj\" (UID: \"786673c7-6fb8-4b0d-864a-ea29fa681de6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.538017 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d87787d-4605-4895-a5bd-a3820dd38fae-srv-cert\") pod \"olm-operator-6b444d44fb-qz9xl\" (UID: \"6d87787d-4605-4895-a5bd-a3820dd38fae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.538031 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-etcd-serving-ca\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.538069 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc982890-ee1e-4482-8c17-0c5b11583ce2-proxy-tls\") pod \"machine-config-operator-74547568cd-qklg5\" (UID: \"fc982890-ee1e-4482-8c17-0c5b11583ce2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.538083 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/143c6751-acab-4e56-9e54-b0e4dc6ae562-etcd-ca\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.538099 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/786673c7-6fb8-4b0d-864a-ea29fa681de6-trusted-ca\") pod \"ingress-operator-5b745b69d9-g6jjj\" (UID: 
\"786673c7-6fb8-4b0d-864a-ea29fa681de6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.538140 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jv7l\" (UniqueName: \"kubernetes.io/projected/df3eaaf0-38f3-4334-adaa-bcdc6b4409bc-kube-api-access-7jv7l\") pod \"cluster-samples-operator-665b6dd947-9hwrx\" (UID: \"df3eaaf0-38f3-4334-adaa-bcdc6b4409bc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwrx" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.538161 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6a20802-d5f8-4b5f-8655-410dc9bd8aa7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-k8v6c\" (UID: \"c6a20802-d5f8-4b5f-8655-410dc9bd8aa7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.538179 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/641e2cb7-16f3-4339-ace8-1a5d6b921841-config\") pod \"console-operator-58897d9998-nm9mf\" (UID: \"641e2cb7-16f3-4339-ace8-1a5d6b921841\") " pod="openshift-console-operator/console-operator-58897d9998-nm9mf" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.538477 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6a20802-d5f8-4b5f-8655-410dc9bd8aa7-config\") pod \"kube-controller-manager-operator-78b949d7b-k8v6c\" (UID: \"c6a20802-d5f8-4b5f-8655-410dc9bd8aa7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.539488 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ff4a13b-a07e-4031-a1fb-ba29027332e8-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mm6nw\" (UID: \"1ff4a13b-a07e-4031-a1fb-ba29027332e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.542603 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6a20802-d5f8-4b5f-8655-410dc9bd8aa7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-k8v6c\" (UID: \"c6a20802-d5f8-4b5f-8655-410dc9bd8aa7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.543264 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-config-volume\") pod \"collect-profiles-29323365-wmmns\" (UID: \"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.544049 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/786673c7-6fb8-4b0d-864a-ea29fa681de6-metrics-tls\") pod \"ingress-operator-5b745b69d9-g6jjj\" (UID: \"786673c7-6fb8-4b0d-864a-ea29fa681de6\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.538706 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-q2szc" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.544299 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/900aedf6-0ce4-429f-9d04-2776a8625593-config\") pod \"kube-apiserver-operator-766d6c64bb-8x5jh\" (UID: \"900aedf6-0ce4-429f-9d04-2776a8625593\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.544465 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fc982890-ee1e-4482-8c17-0c5b11583ce2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qklg5\" (UID: \"fc982890-ee1e-4482-8c17-0c5b11583ce2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.544553 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.544701 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/900aedf6-0ce4-429f-9d04-2776a8625593-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8x5jh\" (UID: \"900aedf6-0ce4-429f-9d04-2776a8625593\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.544752 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.544782 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.546536 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/786673c7-6fb8-4b0d-864a-ea29fa681de6-trusted-ca\") pod \"ingress-operator-5b745b69d9-g6jjj\" (UID: \"786673c7-6fb8-4b0d-864a-ea29fa681de6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.546807 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.547541 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-secret-volume\") pod \"collect-profiles-29323365-wmmns\" (UID: \"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.547629 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e1d0411-55ac-4287-b19f-d6c46444434b-proxy-tls\") pod \"machine-config-controller-84d6567774-7tk8w\" (UID: \"6e1d0411-55ac-4287-b19f-d6c46444434b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w" Oct 02 
10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.547763 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6e1d0411-55ac-4287-b19f-d6c46444434b-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-7tk8w\" (UID: \"6e1d0411-55ac-4287-b19f-d6c46444434b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.547880 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fc982890-ee1e-4482-8c17-0c5b11583ce2-proxy-tls\") pod \"machine-config-operator-74547568cd-qklg5\" (UID: \"fc982890-ee1e-4482-8c17-0c5b11583ce2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.548397 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fc982890-ee1e-4482-8c17-0c5b11583ce2-images\") pod \"machine-config-operator-74547568cd-qklg5\" (UID: \"fc982890-ee1e-4482-8c17-0c5b11583ce2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.550947 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ff4a13b-a07e-4031-a1fb-ba29027332e8-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mm6nw\" (UID: \"1ff4a13b-a07e-4031-a1fb-ba29027332e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.555812 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.562908 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.567980 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.571526 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qhc8r"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.574692 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.574950 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tmkn8"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.576958 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.578542 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.580300 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nkh48"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.581680 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.583058 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jmttf"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.584431 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.585843 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zkc9n"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.587207 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rbh2d"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.588297 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.588562 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-qkm7n"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.589152 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qkm7n" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.591386 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.592807 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.594223 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lrsk8"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.595532 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.595662 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nm9mf"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.597031 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-d5v2m"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.598538 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xqcd7"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.599889 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.601323 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.602749 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rlrhw"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.604171 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwrx"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.605581 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k2sqh"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.607120 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.608277 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-knkk5"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.609229 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vkm2v"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.610209 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zvx8j"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.611181 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.612221 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qprrh"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.613167 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.614420 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8z2x2"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.614895 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.619838 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rbh2d"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.623168 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-q2szc"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.624656 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m2kx2"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.626110 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ttsj7"] Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.634555 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.638497 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae-machine-approver-tls\") pod \"machine-approver-56656f9798-cpxbq\" (UID: \"730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.638553 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkr55\" (UniqueName: \"kubernetes.io/projected/6d87787d-4605-4895-a5bd-a3820dd38fae-kube-api-access-dkr55\") pod 
\"olm-operator-6b444d44fb-qz9xl\" (UID: \"6d87787d-4605-4895-a5bd-a3820dd38fae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.638573 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143c6751-acab-4e56-9e54-b0e4dc6ae562-config\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.638611 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-node-pullsecrets\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.638625 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.638689 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-node-pullsecrets\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.638727 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff3698e3-0db7-4a46-8244-ec9486c9ed48-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-z5bg6\" (UID: \"ff3698e3-0db7-4a46-8244-ec9486c9ed48\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.638745 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtsfw\" (UniqueName: \"kubernetes.io/projected/ff3698e3-0db7-4a46-8244-ec9486c9ed48-kube-api-access-dtsfw\") pod \"package-server-manager-789f6589d5-z5bg6\" (UID: \"ff3698e3-0db7-4a46-8244-ec9486c9ed48\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.639353 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/143c6751-acab-4e56-9e54-b0e4dc6ae562-config\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.639672 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-trusted-ca-bundle\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.638760 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae-auth-proxy-config\") pod \"machine-approver-56656f9798-cpxbq\" (UID: \"730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640092 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/641e2cb7-16f3-4339-ace8-1a5d6b921841-trusted-ca\") pod \"console-operator-58897d9998-nm9mf\" (UID: \"641e2cb7-16f3-4339-ace8-1a5d6b921841\") " pod="openshift-console-operator/console-operator-58897d9998-nm9mf" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640144 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-serving-cert\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640162 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f27jk\" (UniqueName: \"kubernetes.io/projected/641e2cb7-16f3-4339-ace8-1a5d6b921841-kube-api-access-f27jk\") pod \"console-operator-58897d9998-nm9mf\" (UID: \"641e2cb7-16f3-4339-ace8-1a5d6b921841\") " pod="openshift-console-operator/console-operator-58897d9998-nm9mf" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640564 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6cht\" (UniqueName: \"kubernetes.io/projected/8fb9e589-a259-4ff9-9d1b-198b57fb179b-kube-api-access-k6cht\") pod \"dns-default-zvx8j\" (UID: \"8fb9e589-a259-4ff9-9d1b-198b57fb179b\") " pod="openshift-dns/dns-default-zvx8j" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640597 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-image-import-ca\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640617 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/143c6751-acab-4e56-9e54-b0e4dc6ae562-etcd-service-ca\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640637 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/143c6751-acab-4e56-9e54-b0e4dc6ae562-etcd-client\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh" Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640719 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae-config\") pod \"machine-approver-56656f9798-cpxbq\" (UID: \"730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq" 
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640742 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8ljt\" (UniqueName: \"kubernetes.io/projected/143c6751-acab-4e56-9e54-b0e4dc6ae562-kube-api-access-f8ljt\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640766 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/df3eaaf0-38f3-4334-adaa-bcdc6b4409bc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9hwrx\" (UID: \"df3eaaf0-38f3-4334-adaa-bcdc6b4409bc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwrx"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640786 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-audit-dir\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640803 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/143c6751-acab-4e56-9e54-b0e4dc6ae562-serving-cert\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640824 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d87787d-4605-4895-a5bd-a3820dd38fae-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qz9xl\" (UID: \"6d87787d-4605-4895-a5bd-a3820dd38fae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640859 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dql4\" (UniqueName: \"kubernetes.io/projected/730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae-kube-api-access-4dql4\") pod \"machine-approver-56656f9798-cpxbq\" (UID: \"730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640899 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d87787d-4605-4895-a5bd-a3820dd38fae-srv-cert\") pod \"olm-operator-6b444d44fb-qz9xl\" (UID: \"6d87787d-4605-4895-a5bd-a3820dd38fae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640923 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-etcd-serving-ca\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640957 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/143c6751-acab-4e56-9e54-b0e4dc6ae562-etcd-ca\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.640984 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jv7l\" (UniqueName: \"kubernetes.io/projected/df3eaaf0-38f3-4334-adaa-bcdc6b4409bc-kube-api-access-7jv7l\") pod \"cluster-samples-operator-665b6dd947-9hwrx\" (UID: \"df3eaaf0-38f3-4334-adaa-bcdc6b4409bc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwrx"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.641017 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/641e2cb7-16f3-4339-ace8-1a5d6b921841-config\") pod \"console-operator-58897d9998-nm9mf\" (UID: \"641e2cb7-16f3-4339-ace8-1a5d6b921841\") " pod="openshift-console-operator/console-operator-58897d9998-nm9mf"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.641050 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8fb9e589-a259-4ff9-9d1b-198b57fb179b-metrics-tls\") pod \"dns-default-zvx8j\" (UID: \"8fb9e589-a259-4ff9-9d1b-198b57fb179b\") " pod="openshift-dns/dns-default-zvx8j"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.641073 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-encryption-config\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.641104 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/641e2cb7-16f3-4339-ace8-1a5d6b921841-serving-cert\") pod \"console-operator-58897d9998-nm9mf\" (UID: \"641e2cb7-16f3-4339-ace8-1a5d6b921841\") " pod="openshift-console-operator/console-operator-58897d9998-nm9mf"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.641124 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fb9e589-a259-4ff9-9d1b-198b57fb179b-config-volume\") pod \"dns-default-zvx8j\" (UID: \"8fb9e589-a259-4ff9-9d1b-198b57fb179b\") " pod="openshift-dns/dns-default-zvx8j"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.641143 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-audit\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.641145 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/143c6751-acab-4e56-9e54-b0e4dc6ae562-etcd-service-ca\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.641183 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf5sz\" (UniqueName: \"kubernetes.io/projected/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-kube-api-access-hf5sz\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.641210 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-etcd-client\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.641235 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-config\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.641274 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-audit-dir\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.641800 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-etcd-serving-ca\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.641979 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-image-import-ca\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.642016 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-config\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.642518 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/143c6751-acab-4e56-9e54-b0e4dc6ae562-etcd-ca\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.642768 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8fb9e589-a259-4ff9-9d1b-198b57fb179b-config-volume\") pod \"dns-default-zvx8j\" (UID: \"8fb9e589-a259-4ff9-9d1b-198b57fb179b\") " pod="openshift-dns/dns-default-zvx8j"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.642771 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-audit\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.643964 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-serving-cert\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.644175 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/143c6751-acab-4e56-9e54-b0e4dc6ae562-serving-cert\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.644370 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-etcd-client\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.644747 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/143c6751-acab-4e56-9e54-b0e4dc6ae562-etcd-client\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.644818 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6d87787d-4605-4895-a5bd-a3820dd38fae-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qz9xl\" (UID: \"6d87787d-4605-4895-a5bd-a3820dd38fae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.645024 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8fb9e589-a259-4ff9-9d1b-198b57fb179b-metrics-tls\") pod \"dns-default-zvx8j\" (UID: \"8fb9e589-a259-4ff9-9d1b-198b57fb179b\") " pod="openshift-dns/dns-default-zvx8j"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.645067 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-encryption-config\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.654946 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.674489 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.694415 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.714557 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.735123 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.775185 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.794890 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.814597 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.834396 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.854535 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.873783 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.880880 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.882137 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff3698e3-0db7-4a46-8244-ec9486c9ed48-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-z5bg6\" (UID: \"ff3698e3-0db7-4a46-8244-ec9486c9ed48\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.895159 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.904959 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6d87787d-4605-4895-a5bd-a3820dd38fae-srv-cert\") pod \"olm-operator-6b444d44fb-qz9xl\" (UID: \"6d87787d-4605-4895-a5bd-a3820dd38fae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.914519 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.934020 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.954572 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.974692 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Oct 02 10:53:46 crc kubenswrapper[4766]: I1002 10:53:46.994727 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.014771 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.034917 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.054595 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.075825 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.081809 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae-auth-proxy-config\") pod \"machine-approver-56656f9798-cpxbq\" (UID: \"730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.095166 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.102872 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae-machine-approver-tls\") pod \"machine-approver-56656f9798-cpxbq\" (UID: \"730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.114704 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.134635 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.155420 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.174252 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.195070 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.216121 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.234853 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.242353 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae-config\") pod \"machine-approver-56656f9798-cpxbq\" (UID: \"730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.254805 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.274859 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.294786 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.305352 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/df3eaaf0-38f3-4334-adaa-bcdc6b4409bc-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-9hwrx\" (UID: \"df3eaaf0-38f3-4334-adaa-bcdc6b4409bc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwrx"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.314887 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.335469 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.355233 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.375862 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.403293 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.411490 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/641e2cb7-16f3-4339-ace8-1a5d6b921841-trusted-ca\") pod \"console-operator-58897d9998-nm9mf\" (UID: \"641e2cb7-16f3-4339-ace8-1a5d6b921841\") " pod="openshift-console-operator/console-operator-58897d9998-nm9mf"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.415715 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.444466 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.453994 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.455411 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/641e2cb7-16f3-4339-ace8-1a5d6b921841-serving-cert\") pod \"console-operator-58897d9998-nm9mf\" (UID: \"641e2cb7-16f3-4339-ace8-1a5d6b921841\") " pod="openshift-console-operator/console-operator-58897d9998-nm9mf"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.475032 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.482959 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/641e2cb7-16f3-4339-ace8-1a5d6b921841-config\") pod \"console-operator-58897d9998-nm9mf\" (UID: \"641e2cb7-16f3-4339-ace8-1a5d6b921841\") " pod="openshift-console-operator/console-operator-58897d9998-nm9mf"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.493211 4766 request.go:700] Waited for 1.012155678s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/secrets?fieldSelector=metadata.name%3Dconsole-operator-dockercfg-4xjcr&limit=500&resourceVersion=0
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.494685 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.514419 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.535396 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.575337 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.595781 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.615995 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.634393 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.669967 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.678998 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.695244 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.715026 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.735307 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.755063 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.775058 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.794312 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.814311 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.834709 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.854524 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.875227 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.880777 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.880842 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.881008 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.895715 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.921598 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.935169 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.955145 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.981622 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Oct 02 10:53:47 crc kubenswrapper[4766]: I1002 10:53:47.994913 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.015554 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.034706 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.055552 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.075427 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.094721 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.114912 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.146925 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.154735 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.175024 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.220041 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/900aedf6-0ce4-429f-9d04-2776a8625593-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8x5jh\" (UID: \"900aedf6-0ce4-429f-9d04-2776a8625593\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.232107 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/786673c7-6fb8-4b0d-864a-ea29fa681de6-bound-sa-token\") pod \"ingress-operator-5b745b69d9-g6jjj\" (UID: \"786673c7-6fb8-4b0d-864a-ea29fa681de6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.235784 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.272748 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5zcf\" (UniqueName: \"kubernetes.io/projected/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-kube-api-access-n5zcf\") pod \"collect-profiles-29323365-wmmns\" (UID: \"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.288864 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.293816 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tznn8\" (UniqueName: \"kubernetes.io/projected/fc982890-ee1e-4482-8c17-0c5b11583ce2-kube-api-access-tznn8\") pod \"machine-config-operator-74547568cd-qklg5\" (UID: \"fc982890-ee1e-4482-8c17-0c5b11583ce2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.294944 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.315668 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.335913 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.378087 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.379627 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ff4a13b-a07e-4031-a1fb-ba29027332e8-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-mm6nw\" (UID: \"1ff4a13b-a07e-4031-a1fb-ba29027332e8\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.391455 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zllgh\" (UniqueName: \"kubernetes.io/projected/6e1d0411-55ac-4287-b19f-d6c46444434b-kube-api-access-zllgh\") pod \"machine-config-controller-84d6567774-7tk8w\" (UID: \"6e1d0411-55ac-4287-b19f-d6c46444434b\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.394811 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.435159 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c6a20802-d5f8-4b5f-8655-410dc9bd8aa7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-k8v6c\" (UID: \"c6a20802-d5f8-4b5f-8655-410dc9bd8aa7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.455347 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p4d4\" (UniqueName: \"kubernetes.io/projected/786673c7-6fb8-4b0d-864a-ea29fa681de6-kube-api-access-8p4d4\") pod \"ingress-operator-5b745b69d9-g6jjj\" (UID: \"786673c7-6fb8-4b0d-864a-ea29fa681de6\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.458145 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.475837 4766 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.493600 4766 request.go:700] Waited for 1.905013647s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.495709 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.496071 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.505648 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.517573 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.535804 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.556384 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.578021 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.583724 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns"]
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.584861 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh"]
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.619668 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkr55\" (UniqueName: \"kubernetes.io/projected/6d87787d-4605-4895-a5bd-a3820dd38fae-kube-api-access-dkr55\") pod \"olm-operator-6b444d44fb-qz9xl\" (UID: \"6d87787d-4605-4895-a5bd-a3820dd38fae\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.630328 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtsfw\" (UniqueName: \"kubernetes.io/projected/ff3698e3-0db7-4a46-8244-ec9486c9ed48-kube-api-access-dtsfw\") pod \"package-server-manager-789f6589d5-z5bg6\" (UID: \"ff3698e3-0db7-4a46-8244-ec9486c9ed48\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.645327 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.646178 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh" event={"ID":"900aedf6-0ce4-429f-9d04-2776a8625593","Type":"ContainerStarted","Data":"ddce299eb668853bf94523e7902270b2fc23e88383553e1b8d9626050a585604"}
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.647322 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns" event={"ID":"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac","Type":"ContainerStarted","Data":"209d8066f862f184c1745c95f5ad6e9fc936dacc79685e924f5cd49f38d0f554"}
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.663840 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f27jk\" (UniqueName: \"kubernetes.io/projected/641e2cb7-16f3-4339-ace8-1a5d6b921841-kube-api-access-f27jk\") pod \"console-operator-58897d9998-nm9mf\" (UID: \"641e2cb7-16f3-4339-ace8-1a5d6b921841\") " pod="openshift-console-operator/console-operator-58897d9998-nm9mf"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.670110 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.671780 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5"]
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.672826 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6cht\" (UniqueName: \"kubernetes.io/projected/8fb9e589-a259-4ff9-9d1b-198b57fb179b-kube-api-access-k6cht\") pod \"dns-default-zvx8j\" (UID: \"8fb9e589-a259-4ff9-9d1b-198b57fb179b\") " pod="openshift-dns/dns-default-zvx8j"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.689734 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8ljt\" (UniqueName: \"kubernetes.io/projected/143c6751-acab-4e56-9e54-b0e4dc6ae562-kube-api-access-f8ljt\") pod \"etcd-operator-b45778765-k2sqh\" (UID: \"143c6751-acab-4e56-9e54-b0e4dc6ae562\") " pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.689900 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw"]
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.692759 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj"
Oct 02 10:53:48 crc kubenswrapper[4766]: W1002 10:53:48.695002 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc982890_ee1e_4482_8c17_0c5b11583ce2.slice/crio-f876d2c95de834c70cda08f114a7c7420bdf15239044b3b83c90a8a2f09e3677 WatchSource:0}: Error finding container f876d2c95de834c70cda08f114a7c7420bdf15239044b3b83c90a8a2f09e3677: Status 404 returned error can't find the container with id f876d2c95de834c70cda08f114a7c7420bdf15239044b3b83c90a8a2f09e3677
Oct 02 10:53:48 crc kubenswrapper[4766]: W1002 10:53:48.698415 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ff4a13b_a07e_4031_a1fb_ba29027332e8.slice/crio-3ba7ac379616df09fb634de6ac783ec794b2c60f3ae62cc87c57405d03d03460 WatchSource:0}: Error finding container 3ba7ac379616df09fb634de6ac783ec794b2c60f3ae62cc87c57405d03d03460: Status 404 returned error can't find the container with id 3ba7ac379616df09fb634de6ac783ec794b2c60f3ae62cc87c57405d03d03460
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.709887 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dql4\" (UniqueName: \"kubernetes.io/projected/730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae-kube-api-access-4dql4\") pod \"machine-approver-56656f9798-cpxbq\" (UID: \"730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.729098 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf5sz\" (UniqueName: \"kubernetes.io/projected/e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3-kube-api-access-hf5sz\") pod \"apiserver-76f77b778f-qhc8r\" (UID: \"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3\") " pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.748941 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jv7l\" (UniqueName: \"kubernetes.io/projected/df3eaaf0-38f3-4334-adaa-bcdc6b4409bc-kube-api-access-7jv7l\") pod \"cluster-samples-operator-665b6dd947-9hwrx\" (UID: \"df3eaaf0-38f3-4334-adaa-bcdc6b4409bc\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwrx"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.776202 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.780823 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zvx8j"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.795973 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.796992 4766 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.835546 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.855574 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.863075 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.863411 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.863624 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.869802 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875115 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875147 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz4lb\" (UniqueName: \"kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-kube-api-access-bz4lb\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875165 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgh4q\" (UniqueName: \"kubernetes.io/projected/066a8b65-b3f1-42c3-a989-33409b41f8dc-kube-api-access-cgh4q\") pod \"openshift-controller-manager-operator-756b6f6bc6-d59w5\" (UID: \"066a8b65-b3f1-42c3-a989-33409b41f8dc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875202 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/066a8b65-b3f1-42c3-a989-33409b41f8dc-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d59w5\" (UID: \"066a8b65-b3f1-42c3-a989-33409b41f8dc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875220 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fdd489b9-775f-4a9a-b3de-8ac4d8fcf8fe-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rlrhw\" (UID: 
\"fdd489b9-775f-4a9a-b3de-8ac4d8fcf8fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rlrhw" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875244 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fcce406-28bd-4526-9a8e-fe2381ce20a2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-flmqc\" (UID: \"5fcce406-28bd-4526-9a8e-fe2381ce20a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875259 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/97952e7f-5262-40a4-8a14-0a881ce34703-stats-auth\") pod \"router-default-5444994796-44mnp\" (UID: \"97952e7f-5262-40a4-8a14-0a881ce34703\") " pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875278 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c4653f10-7eaa-450c-881b-e074e4038d2f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875297 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2qfd\" (UniqueName: \"kubernetes.io/projected/55374426-4ab4-4ce6-a180-6f449961e26d-kube-api-access-x2qfd\") pod \"openshift-config-operator-7777fb866f-6pxm5\" (UID: \"55374426-4ab4-4ce6-a180-6f449961e26d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875310 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fcce406-28bd-4526-9a8e-fe2381ce20a2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-flmqc\" (UID: \"5fcce406-28bd-4526-9a8e-fe2381ce20a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875352 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c30c3a5c-a29e-48a7-b446-b68f9cce2742-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zkc9n\" (UID: \"c30c3a5c-a29e-48a7-b446-b68f9cce2742\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875367 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6wk6\" (UniqueName: \"kubernetes.io/projected/c30c3a5c-a29e-48a7-b446-b68f9cce2742-kube-api-access-s6wk6\") pod \"machine-api-operator-5694c8668f-zkc9n\" (UID: \"c30c3a5c-a29e-48a7-b446-b68f9cce2742\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875382 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/a7440e0b-298a-4118-afc5-3c8fcb11eed7-srv-cert\") pod \"catalog-operator-68c6474976-7r26x\" (UID: \"a7440e0b-298a-4118-afc5-3c8fcb11eed7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875397 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kzhj\" (UniqueName: \"kubernetes.io/projected/d0980a54-cd9d-4daa-a5ac-7f86e447f646-kube-api-access-2kzhj\") pod \"route-controller-manager-6576b87f9c-f2c9k\" (UID: \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875412 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx7fr\" (UniqueName: \"kubernetes.io/projected/743fe4dc-299d-4f28-9448-644d12db4af7-kube-api-access-sx7fr\") pod \"downloads-7954f5f757-xqcd7\" (UID: \"743fe4dc-299d-4f28-9448-644d12db4af7\") " pod="openshift-console/downloads-7954f5f757-xqcd7" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875428 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5102f73f-dc76-4e60-9ed8-cc12efc46860-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mltn4\" (UID: \"5102f73f-dc76-4e60-9ed8-cc12efc46860\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875461 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afb77d71-ded6-4158-a3fe-461336cece71-serving-cert\") pod \"controller-manager-879f6c89f-tmkn8\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.875485 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tmkn8\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.876391 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.876600 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c30c3a5c-a29e-48a7-b446-b68f9cce2742-config\") pod \"machine-api-operator-5694c8668f-zkc9n\" (UID: \"c30c3a5c-a29e-48a7-b446-b68f9cce2742\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.876743 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c4653f10-7eaa-450c-881b-e074e4038d2f-audit-dir\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 
10:53:48.876780 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30c9f678-ccef-4c4d-b172-c5853e15ddd4-serving-cert\") pod \"authentication-operator-69f744f599-d5v2m\" (UID: \"30c9f678-ccef-4c4d-b172-c5853e15ddd4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.876916 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5102f73f-dc76-4e60-9ed8-cc12efc46860-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mltn4\" (UID: \"5102f73f-dc76-4e60-9ed8-cc12efc46860\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.876955 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c4653f10-7eaa-450c-881b-e074e4038d2f-etcd-client\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.876974 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4653f10-7eaa-450c-881b-e074e4038d2f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877014 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/347022cb-d24b-4f67-900e-c2b858cc49fc-trusted-ca\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877031 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/97952e7f-5262-40a4-8a14-0a881ce34703-default-certificate\") pod \"router-default-5444994796-44mnp\" (UID: \"97952e7f-5262-40a4-8a14-0a881ce34703\") " pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877048 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-bound-sa-token\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877072 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c4653f10-7eaa-450c-881b-e074e4038d2f-audit-policies\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877091 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55374426-4ab4-4ce6-a180-6f449961e26d-serving-cert\") pod \"openshift-config-operator-7777fb866f-6pxm5\" (UID: \"55374426-4ab4-4ce6-a180-6f449961e26d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877111 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5102f73f-dc76-4e60-9ed8-cc12efc46860-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mltn4\" (UID: \"5102f73f-dc76-4e60-9ed8-cc12efc46860\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877173 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/347022cb-d24b-4f67-900e-c2b858cc49fc-registry-certificates\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877196 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0980a54-cd9d-4daa-a5ac-7f86e447f646-client-ca\") pod \"route-controller-manager-6576b87f9c-f2c9k\" (UID: \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877227 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0980a54-cd9d-4daa-a5ac-7f86e447f646-config\") pod \"route-controller-manager-6576b87f9c-f2c9k\" (UID: \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877273 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q88f7\" (UniqueName: \"kubernetes.io/projected/30c9f678-ccef-4c4d-b172-c5853e15ddd4-kube-api-access-q88f7\") pod \"authentication-operator-69f744f599-d5v2m\" (UID: \"30c9f678-ccef-4c4d-b172-c5853e15ddd4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877291 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq5lh\" (UniqueName: \"kubernetes.io/projected/5fcce406-28bd-4526-9a8e-fe2381ce20a2-kube-api-access-fq5lh\") pod \"kube-storage-version-migrator-operator-b67b599dd-flmqc\" (UID: \"5fcce406-28bd-4526-9a8e-fe2381ce20a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc" Oct 02 10:53:48 crc kubenswrapper[4766]: E1002 10:53:48.877590 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:49.377578061 +0000 UTC m=+144.320449005 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877786 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2x8\" (UniqueName: \"kubernetes.io/projected/c4653f10-7eaa-450c-881b-e074e4038d2f-kube-api-access-vh2x8\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877811 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcsc5\" (UniqueName: \"kubernetes.io/projected/ac2bdbb7-a515-46ed-90c6-5fe5d141fba8-kube-api-access-zcsc5\") pod \"openshift-apiserver-operator-796bbdcf4f-4dx82\" (UID: \"ac2bdbb7-a515-46ed-90c6-5fe5d141fba8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877828 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0980a54-cd9d-4daa-a5ac-7f86e447f646-serving-cert\") pod \"route-controller-manager-6576b87f9c-f2c9k\" (UID: \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877845 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xmhn\" (UniqueName: \"kubernetes.io/projected/78152191-403f-476f-90ee-0342f60ba99c-kube-api-access-5xmhn\") pod \"dns-operator-744455d44c-nkh48\" (UID: \"78152191-403f-476f-90ee-0342f60ba99c\") " pod="openshift-dns-operator/dns-operator-744455d44c-nkh48" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877860 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c4653f10-7eaa-450c-881b-e074e4038d2f-encryption-config\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877879 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/347022cb-d24b-4f67-900e-c2b858cc49fc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877895 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c30c3a5c-a29e-48a7-b446-b68f9cce2742-images\") pod \"machine-api-operator-5694c8668f-zkc9n\" (UID: \"c30c3a5c-a29e-48a7-b446-b68f9cce2742\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n" Oct 02 
10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877914 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxcbp\" (UniqueName: \"kubernetes.io/projected/a7440e0b-298a-4118-afc5-3c8fcb11eed7-kube-api-access-mxcbp\") pod \"catalog-operator-68c6474976-7r26x\" (UID: \"a7440e0b-298a-4118-afc5-3c8fcb11eed7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877951 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8br68\" (UniqueName: \"kubernetes.io/projected/dcbb0f16-6000-4d64-ab71-a61c1b3b7063-kube-api-access-8br68\") pod \"service-ca-operator-777779d784-jmttf\" (UID: \"dcbb0f16-6000-4d64-ab71-a61c1b3b7063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmttf" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877967 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97952e7f-5262-40a4-8a14-0a881ce34703-metrics-certs\") pod \"router-default-5444994796-44mnp\" (UID: \"97952e7f-5262-40a4-8a14-0a881ce34703\") " pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.877993 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8-apiservice-cert\") pod \"packageserver-d55dfcdfc-vqq6s\" (UID: \"f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.878054 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-client-ca\") pod \"controller-manager-879f6c89f-tmkn8\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.878085 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-config\") pod \"controller-manager-879f6c89f-tmkn8\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.878109 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxlz6\" (UniqueName: \"kubernetes.io/projected/89c62d15-b27b-4722-95ec-9b9a76efa5d7-kube-api-access-vxlz6\") pod \"ingress-canary-lrsk8\" (UID: \"89c62d15-b27b-4722-95ec-9b9a76efa5d7\") " pod="openshift-ingress-canary/ingress-canary-lrsk8" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879080 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w"] Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879370 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8-tmpfs\") pod \"packageserver-d55dfcdfc-vqq6s\" (UID: 
\"f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879423 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn4tt\" (UniqueName: \"kubernetes.io/projected/f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8-kube-api-access-xn4tt\") pod \"packageserver-d55dfcdfc-vqq6s\" (UID: \"f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879447 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30c9f678-ccef-4c4d-b172-c5853e15ddd4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-d5v2m\" (UID: \"30c9f678-ccef-4c4d-b172-c5853e15ddd4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879467 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcbb0f16-6000-4d64-ab71-a61c1b3b7063-serving-cert\") pod \"service-ca-operator-777779d784-jmttf\" (UID: \"dcbb0f16-6000-4d64-ab71-a61c1b3b7063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmttf" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879534 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/55374426-4ab4-4ce6-a180-6f449961e26d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6pxm5\" (UID: \"55374426-4ab4-4ce6-a180-6f449961e26d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879574 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a7440e0b-298a-4118-afc5-3c8fcb11eed7-profile-collector-cert\") pod \"catalog-operator-68c6474976-7r26x\" (UID: \"a7440e0b-298a-4118-afc5-3c8fcb11eed7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879596 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcbb0f16-6000-4d64-ab71-a61c1b3b7063-config\") pod \"service-ca-operator-777779d784-jmttf\" (UID: \"dcbb0f16-6000-4d64-ab71-a61c1b3b7063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmttf" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879615 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac2bdbb7-a515-46ed-90c6-5fe5d141fba8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4dx82\" (UID: \"ac2bdbb7-a515-46ed-90c6-5fe5d141fba8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879635 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89c62d15-b27b-4722-95ec-9b9a76efa5d7-cert\") pod \"ingress-canary-lrsk8\" (UID: 
\"89c62d15-b27b-4722-95ec-9b9a76efa5d7\") " pod="openshift-ingress-canary/ingress-canary-lrsk8" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879670 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30c9f678-ccef-4c4d-b172-c5853e15ddd4-service-ca-bundle\") pod \"authentication-operator-69f744f599-d5v2m\" (UID: \"30c9f678-ccef-4c4d-b172-c5853e15ddd4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879714 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8-webhook-cert\") pod \"packageserver-d55dfcdfc-vqq6s\" (UID: \"f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879730 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78152191-403f-476f-90ee-0342f60ba99c-metrics-tls\") pod \"dns-operator-744455d44c-nkh48\" (UID: \"78152191-403f-476f-90ee-0342f60ba99c\") " pod="openshift-dns-operator/dns-operator-744455d44c-nkh48" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879745 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf77d\" (UniqueName: \"kubernetes.io/projected/97952e7f-5262-40a4-8a14-0a881ce34703-kube-api-access-qf77d\") pod \"router-default-5444994796-44mnp\" (UID: \"97952e7f-5262-40a4-8a14-0a881ce34703\") " pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879765 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/347022cb-d24b-4f67-900e-c2b858cc49fc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879782 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac2bdbb7-a515-46ed-90c6-5fe5d141fba8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4dx82\" (UID: \"ac2bdbb7-a515-46ed-90c6-5fe5d141fba8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879800 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97952e7f-5262-40a4-8a14-0a881ce34703-service-ca-bundle\") pod \"router-default-5444994796-44mnp\" (UID: \"97952e7f-5262-40a4-8a14-0a881ce34703\") " pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879873 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30c9f678-ccef-4c4d-b172-c5853e15ddd4-config\") pod \"authentication-operator-69f744f599-d5v2m\" (UID: \"30c9f678-ccef-4c4d-b172-c5853e15ddd4\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879900 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r767l\" (UniqueName: \"kubernetes.io/projected/5102f73f-dc76-4e60-9ed8-cc12efc46860-kube-api-access-r767l\") pod \"cluster-image-registry-operator-dc59b4c8b-mltn4\" (UID: \"5102f73f-dc76-4e60-9ed8-cc12efc46860\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879929 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp62p\" (UniqueName: \"kubernetes.io/projected/2ef5cc8f-ef34-4fd1-8765-7f41500898e6-kube-api-access-rp62p\") pod \"migrator-59844c95c7-qprrh\" (UID: \"2ef5cc8f-ef34-4fd1-8765-7f41500898e6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qprrh" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879950 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wskrf\" (UniqueName: \"kubernetes.io/projected/afb77d71-ded6-4158-a3fe-461336cece71-kube-api-access-wskrf\") pod \"controller-manager-879f6c89f-tmkn8\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.879975 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/066a8b65-b3f1-42c3-a989-33409b41f8dc-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d59w5\" (UID: \"066a8b65-b3f1-42c3-a989-33409b41f8dc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.880026 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4653f10-7eaa-450c-881b-e074e4038d2f-serving-cert\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.880052 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89dqx\" (UniqueName: \"kubernetes.io/projected/fdd489b9-775f-4a9a-b3de-8ac4d8fcf8fe-kube-api-access-89dqx\") pod \"control-plane-machine-set-operator-78cbb6b69f-rlrhw\" (UID: \"fdd489b9-775f-4a9a-b3de-8ac4d8fcf8fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rlrhw" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.880071 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-registry-tls\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.889291 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwrx" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.894315 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.903995 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nm9mf" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.908041 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c"] Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.944640 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj"] Oct 02 10:53:48 crc kubenswrapper[4766]: W1002 10:53:48.960197 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod786673c7_6fb8_4b0d_864a_ea29fa681de6.slice/crio-febfe72ed3eb765bf93db7a04461134e086bce28ac35cb378f0731e5a785736d WatchSource:0}: Error finding container febfe72ed3eb765bf93db7a04461134e086bce28ac35cb378f0731e5a785736d: Status 404 returned error can't find the container with id febfe72ed3eb765bf93db7a04461134e086bce28ac35cb378f0731e5a785736d Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.980749 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981108 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981151 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8-tmpfs\") pod \"packageserver-d55dfcdfc-vqq6s\" (UID: \"f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981184 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn4tt\" (UniqueName: \"kubernetes.io/projected/f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8-kube-api-access-xn4tt\") pod \"packageserver-d55dfcdfc-vqq6s\" (UID: \"f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981235 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30c9f678-ccef-4c4d-b172-c5853e15ddd4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-d5v2m\" (UID: \"30c9f678-ccef-4c4d-b172-c5853e15ddd4\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981264 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcbb0f16-6000-4d64-ab71-a61c1b3b7063-serving-cert\") pod \"service-ca-operator-777779d784-jmttf\" (UID: \"dcbb0f16-6000-4d64-ab71-a61c1b3b7063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmttf" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981313 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/55374426-4ab4-4ce6-a180-6f449961e26d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6pxm5\" (UID: \"55374426-4ab4-4ce6-a180-6f449961e26d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981358 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a7440e0b-298a-4118-afc5-3c8fcb11eed7-profile-collector-cert\") pod \"catalog-operator-68c6474976-7r26x\" (UID: \"a7440e0b-298a-4118-afc5-3c8fcb11eed7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981386 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhhv4\" (UniqueName: \"kubernetes.io/projected/899ef710-299b-4178-850d-1da30747c924-kube-api-access-hhhv4\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981411 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac2bdbb7-a515-46ed-90c6-5fe5d141fba8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4dx82\" (UID: \"ac2bdbb7-a515-46ed-90c6-5fe5d141fba8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981437 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/899ef710-299b-4178-850d-1da30747c924-csi-data-dir\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981462 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89c62d15-b27b-4722-95ec-9b9a76efa5d7-cert\") pod \"ingress-canary-lrsk8\" (UID: \"89c62d15-b27b-4722-95ec-9b9a76efa5d7\") " pod="openshift-ingress-canary/ingress-canary-lrsk8" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981485 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcbb0f16-6000-4d64-ab71-a61c1b3b7063-config\") pod \"service-ca-operator-777779d784-jmttf\" (UID: \"dcbb0f16-6000-4d64-ab71-a61c1b3b7063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmttf" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981547 4766 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30c9f678-ccef-4c4d-b172-c5853e15ddd4-service-ca-bundle\") pod \"authentication-operator-69f744f599-d5v2m\" (UID: \"30c9f678-ccef-4c4d-b172-c5853e15ddd4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981572 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-oauth-config\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981657 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8-webhook-cert\") pod \"packageserver-d55dfcdfc-vqq6s\" (UID: \"f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981689 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf77d\" (UniqueName: \"kubernetes.io/projected/97952e7f-5262-40a4-8a14-0a881ce34703-kube-api-access-qf77d\") pod \"router-default-5444994796-44mnp\" (UID: \"97952e7f-5262-40a4-8a14-0a881ce34703\") " pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981720 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981745 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78152191-403f-476f-90ee-0342f60ba99c-metrics-tls\") pod \"dns-operator-744455d44c-nkh48\" (UID: \"78152191-403f-476f-90ee-0342f60ba99c\") " pod="openshift-dns-operator/dns-operator-744455d44c-nkh48" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981778 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac2bdbb7-a515-46ed-90c6-5fe5d141fba8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4dx82\" (UID: \"ac2bdbb7-a515-46ed-90c6-5fe5d141fba8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981795 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97952e7f-5262-40a4-8a14-0a881ce34703-service-ca-bundle\") pod \"router-default-5444994796-44mnp\" (UID: \"97952e7f-5262-40a4-8a14-0a881ce34703\") " pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981816 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpg74\" (UniqueName: \"kubernetes.io/projected/a57c2a77-db59-4b73-b376-640de2af9a7e-kube-api-access-tpg74\") pod 
\"marketplace-operator-79b997595-vkm2v\" (UID: \"a57c2a77-db59-4b73-b376-640de2af9a7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981844 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/347022cb-d24b-4f67-900e-c2b858cc49fc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981871 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30c9f678-ccef-4c4d-b172-c5853e15ddd4-config\") pod \"authentication-operator-69f744f599-d5v2m\" (UID: \"30c9f678-ccef-4c4d-b172-c5853e15ddd4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981899 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981915 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d24e91e7-dadf-4b67-be1b-a945b1250017-audit-dir\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981943 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r767l\" (UniqueName: \"kubernetes.io/projected/5102f73f-dc76-4e60-9ed8-cc12efc46860-kube-api-access-r767l\") pod \"cluster-image-registry-operator-dc59b4c8b-mltn4\" (UID: \"5102f73f-dc76-4e60-9ed8-cc12efc46860\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.981973 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6rzm\" (UniqueName: \"kubernetes.io/projected/d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3-kube-api-access-m6rzm\") pod \"machine-config-server-qkm7n\" (UID: \"d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3\") " pod="openshift-machine-config-operator/machine-config-server-qkm7n" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.982006 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp62p\" (UniqueName: \"kubernetes.io/projected/2ef5cc8f-ef34-4fd1-8765-7f41500898e6-kube-api-access-rp62p\") pod \"migrator-59844c95c7-qprrh\" (UID: \"2ef5cc8f-ef34-4fd1-8765-7f41500898e6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qprrh" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.982037 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wskrf\" (UniqueName: \"kubernetes.io/projected/afb77d71-ded6-4158-a3fe-461336cece71-kube-api-access-wskrf\") pod \"controller-manager-879f6c89f-tmkn8\" (UID: 
\"afb77d71-ded6-4158-a3fe-461336cece71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.982063 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/066a8b65-b3f1-42c3-a989-33409b41f8dc-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d59w5\" (UID: \"066a8b65-b3f1-42c3-a989-33409b41f8dc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.982114 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-serving-cert\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.982144 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6f1e68c3-4a75-4336-bc1c-a5167d57c28a-signing-cabundle\") pod \"service-ca-9c57cc56f-q2szc\" (UID: \"6f1e68c3-4a75-4336-bc1c-a5167d57c28a\") " pod="openshift-service-ca/service-ca-9c57cc56f-q2szc" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.982178 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.982206 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4653f10-7eaa-450c-881b-e074e4038d2f-serving-cert\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.982274 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89dqx\" (UniqueName: \"kubernetes.io/projected/fdd489b9-775f-4a9a-b3de-8ac4d8fcf8fe-kube-api-access-89dqx\") pod \"control-plane-machine-set-operator-78cbb6b69f-rlrhw\" (UID: \"fdd489b9-775f-4a9a-b3de-8ac4d8fcf8fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rlrhw" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.983919 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/899ef710-299b-4178-850d-1da30747c924-mountpoint-dir\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984037 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-registry-tls\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984069 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984106 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/899ef710-299b-4178-850d-1da30747c924-plugins-dir\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984178 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz4lb\" (UniqueName: \"kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-kube-api-access-bz4lb\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984216 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgh4q\" (UniqueName: \"kubernetes.io/projected/066a8b65-b3f1-42c3-a989-33409b41f8dc-kube-api-access-cgh4q\") pod \"openshift-controller-manager-operator-756b6f6bc6-d59w5\" (UID: \"066a8b65-b3f1-42c3-a989-33409b41f8dc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984247 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984283 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/066a8b65-b3f1-42c3-a989-33409b41f8dc-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d59w5\" (UID: \"066a8b65-b3f1-42c3-a989-33409b41f8dc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984313 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fdd489b9-775f-4a9a-b3de-8ac4d8fcf8fe-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rlrhw\" (UID: \"fdd489b9-775f-4a9a-b3de-8ac4d8fcf8fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rlrhw" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984372 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fcce406-28bd-4526-9a8e-fe2381ce20a2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-flmqc\" (UID: 
\"5fcce406-28bd-4526-9a8e-fe2381ce20a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984412 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/97952e7f-5262-40a4-8a14-0a881ce34703-stats-auth\") pod \"router-default-5444994796-44mnp\" (UID: \"97952e7f-5262-40a4-8a14-0a881ce34703\") " pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984438 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-audit-policies\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984471 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1030ba1a-c14b-4091-8417-b2dcbd287b97-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ttsj7\" (UID: \"1030ba1a-c14b-4091-8417-b2dcbd287b97\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ttsj7" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984515 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984556 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c4653f10-7eaa-450c-881b-e074e4038d2f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984609 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2qfd\" (UniqueName: \"kubernetes.io/projected/55374426-4ab4-4ce6-a180-6f449961e26d-kube-api-access-x2qfd\") pod \"openshift-config-operator-7777fb866f-6pxm5\" (UID: \"55374426-4ab4-4ce6-a180-6f449961e26d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984670 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fcce406-28bd-4526-9a8e-fe2381ce20a2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-flmqc\" (UID: \"5fcce406-28bd-4526-9a8e-fe2381ce20a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984730 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-config\") pod \"console-f9d7485db-8z2x2\" (UID: 
\"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984784 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c30c3a5c-a29e-48a7-b446-b68f9cce2742-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zkc9n\" (UID: \"c30c3a5c-a29e-48a7-b446-b68f9cce2742\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984817 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6wk6\" (UniqueName: \"kubernetes.io/projected/c30c3a5c-a29e-48a7-b446-b68f9cce2742-kube-api-access-s6wk6\") pod \"machine-api-operator-5694c8668f-zkc9n\" (UID: \"c30c3a5c-a29e-48a7-b446-b68f9cce2742\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984853 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a7440e0b-298a-4118-afc5-3c8fcb11eed7-srv-cert\") pod \"catalog-operator-68c6474976-7r26x\" (UID: \"a7440e0b-298a-4118-afc5-3c8fcb11eed7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984881 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-trusted-ca-bundle\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984910 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx7fr\" (UniqueName: \"kubernetes.io/projected/743fe4dc-299d-4f28-9448-644d12db4af7-kube-api-access-sx7fr\") pod \"downloads-7954f5f757-xqcd7\" (UID: \"743fe4dc-299d-4f28-9448-644d12db4af7\") " pod="openshift-console/downloads-7954f5f757-xqcd7" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984945 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5102f73f-dc76-4e60-9ed8-cc12efc46860-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mltn4\" (UID: \"5102f73f-dc76-4e60-9ed8-cc12efc46860\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.984980 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985013 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kzhj\" (UniqueName: \"kubernetes.io/projected/d0980a54-cd9d-4daa-a5ac-7f86e447f646-kube-api-access-2kzhj\") pod \"route-controller-manager-6576b87f9c-f2c9k\" (UID: \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" Oct 02 
10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985038 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afb77d71-ded6-4158-a3fe-461336cece71-serving-cert\") pod \"controller-manager-879f6c89f-tmkn8\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985064 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-service-ca\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985096 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tmkn8\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985199 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a57c2a77-db59-4b73-b376-640de2af9a7e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vkm2v\" (UID: \"a57c2a77-db59-4b73-b376-640de2af9a7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985283 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c30c3a5c-a29e-48a7-b446-b68f9cce2742-config\") pod \"machine-api-operator-5694c8668f-zkc9n\" (UID: \"c30c3a5c-a29e-48a7-b446-b68f9cce2742\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985309 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30c9f678-ccef-4c4d-b172-c5853e15ddd4-serving-cert\") pod \"authentication-operator-69f744f599-d5v2m\" (UID: \"30c9f678-ccef-4c4d-b172-c5853e15ddd4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985347 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c4653f10-7eaa-450c-881b-e074e4038d2f-audit-dir\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985376 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z8w5\" (UniqueName: \"kubernetes.io/projected/1030ba1a-c14b-4091-8417-b2dcbd287b97-kube-api-access-4z8w5\") pod \"multus-admission-controller-857f4d67dd-ttsj7\" (UID: \"1030ba1a-c14b-4091-8417-b2dcbd287b97\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ttsj7" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985415 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/5102f73f-dc76-4e60-9ed8-cc12efc46860-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mltn4\" (UID: \"5102f73f-dc76-4e60-9ed8-cc12efc46860\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985457 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4653f10-7eaa-450c-881b-e074e4038d2f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985491 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4mpr\" (UniqueName: \"kubernetes.io/projected/6f1e68c3-4a75-4336-bc1c-a5167d57c28a-kube-api-access-t4mpr\") pod \"service-ca-9c57cc56f-q2szc\" (UID: \"6f1e68c3-4a75-4336-bc1c-a5167d57c28a\") " pod="openshift-service-ca/service-ca-9c57cc56f-q2szc" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985575 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985607 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c4653f10-7eaa-450c-881b-e074e4038d2f-etcd-client\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985640 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/347022cb-d24b-4f67-900e-c2b858cc49fc-trusted-ca\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985675 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/97952e7f-5262-40a4-8a14-0a881ce34703-default-certificate\") pod \"router-default-5444994796-44mnp\" (UID: \"97952e7f-5262-40a4-8a14-0a881ce34703\") " pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985704 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-bound-sa-token\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985728 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a57c2a77-db59-4b73-b376-640de2af9a7e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vkm2v\" (UID: 
\"a57c2a77-db59-4b73-b376-640de2af9a7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985763 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c4653f10-7eaa-450c-881b-e074e4038d2f-audit-policies\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985805 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55374426-4ab4-4ce6-a180-6f449961e26d-serving-cert\") pod \"openshift-config-operator-7777fb866f-6pxm5\" (UID: \"55374426-4ab4-4ce6-a180-6f449961e26d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985834 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5102f73f-dc76-4e60-9ed8-cc12efc46860-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mltn4\" (UID: \"5102f73f-dc76-4e60-9ed8-cc12efc46860\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985873 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/899ef710-299b-4178-850d-1da30747c924-registration-dir\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.985904 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stmtk\" (UniqueName: \"kubernetes.io/projected/d24e91e7-dadf-4b67-be1b-a945b1250017-kube-api-access-stmtk\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.986005 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/347022cb-d24b-4f67-900e-c2b858cc49fc-registry-certificates\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.986041 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0980a54-cd9d-4daa-a5ac-7f86e447f646-client-ca\") pod \"route-controller-manager-6576b87f9c-f2c9k\" (UID: \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.986087 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0980a54-cd9d-4daa-a5ac-7f86e447f646-config\") pod \"route-controller-manager-6576b87f9c-f2c9k\" (UID: \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.986117 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q88f7\" (UniqueName: \"kubernetes.io/projected/30c9f678-ccef-4c4d-b172-c5853e15ddd4-kube-api-access-q88f7\") pod \"authentication-operator-69f744f599-d5v2m\" (UID: \"30c9f678-ccef-4c4d-b172-c5853e15ddd4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.986173 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq5lh\" (UniqueName: \"kubernetes.io/projected/5fcce406-28bd-4526-9a8e-fe2381ce20a2-kube-api-access-fq5lh\") pod \"kube-storage-version-migrator-operator-b67b599dd-flmqc\" (UID: \"5fcce406-28bd-4526-9a8e-fe2381ce20a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.986212 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8h42\" (UniqueName: \"kubernetes.io/projected/581ea4c4-072a-4bba-afc9-2f82918ac0c9-kube-api-access-f8h42\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.987627 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c4653f10-7eaa-450c-881b-e074e4038d2f-audit-dir\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.989108 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-tmkn8\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.991261 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fdd489b9-775f-4a9a-b3de-8ac4d8fcf8fe-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rlrhw\" (UID: \"fdd489b9-775f-4a9a-b3de-8ac4d8fcf8fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rlrhw"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.992582 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-registry-tls\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.993281 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dcbb0f16-6000-4d64-ab71-a61c1b3b7063-config\") pod \"service-ca-operator-777779d784-jmttf\" (UID: \"dcbb0f16-6000-4d64-ab71-a61c1b3b7063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmttf"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.994077 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8-tmpfs\") pod \"packageserver-d55dfcdfc-vqq6s\" (UID: \"f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.994448 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac2bdbb7-a515-46ed-90c6-5fe5d141fba8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4dx82\" (UID: \"ac2bdbb7-a515-46ed-90c6-5fe5d141fba8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.995830 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c4653f10-7eaa-450c-881b-e074e4038d2f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.996435 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c30c3a5c-a29e-48a7-b446-b68f9cce2742-config\") pod \"machine-api-operator-5694c8668f-zkc9n\" (UID: \"c30c3a5c-a29e-48a7-b446-b68f9cce2742\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.996586 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30c9f678-ccef-4c4d-b172-c5853e15ddd4-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-d5v2m\" (UID: \"30c9f678-ccef-4c4d-b172-c5853e15ddd4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.997042 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5fcce406-28bd-4526-9a8e-fe2381ce20a2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-flmqc\" (UID: \"5fcce406-28bd-4526-9a8e-fe2381ce20a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc"
Oct 02 10:53:48 crc kubenswrapper[4766]: I1002 10:53:48.999203 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zvx8j"]
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.001239 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/55374426-4ab4-4ce6-a180-6f449961e26d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-6pxm5\" (UID: \"55374426-4ab4-4ce6-a180-6f449961e26d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.001621 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4653f10-7eaa-450c-881b-e074e4038d2f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.003614 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5fcce406-28bd-4526-9a8e-fe2381ce20a2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-flmqc\" (UID: \"5fcce406-28bd-4526-9a8e-fe2381ce20a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.004321 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/5102f73f-dc76-4e60-9ed8-cc12efc46860-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-mltn4\" (UID: \"5102f73f-dc76-4e60-9ed8-cc12efc46860\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.004636 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c4653f10-7eaa-450c-881b-e074e4038d2f-etcd-client\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.004687 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dcbb0f16-6000-4d64-ab71-a61c1b3b7063-serving-cert\") pod \"service-ca-operator-777779d784-jmttf\" (UID: \"dcbb0f16-6000-4d64-ab71-a61c1b3b7063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmttf"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.005136 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/89c62d15-b27b-4722-95ec-9b9a76efa5d7-cert\") pod \"ingress-canary-lrsk8\" (UID: \"89c62d15-b27b-4722-95ec-9b9a76efa5d7\") " pod="openshift-ingress-canary/ingress-canary-lrsk8"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.005158 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/a7440e0b-298a-4118-afc5-3c8fcb11eed7-profile-collector-cert\") pod \"catalog-operator-68c6474976-7r26x\" (UID: \"a7440e0b-298a-4118-afc5-3c8fcb11eed7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.005191 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/347022cb-d24b-4f67-900e-c2b858cc49fc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.005671 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac2bdbb7-a515-46ed-90c6-5fe5d141fba8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4dx82\" (UID: \"ac2bdbb7-a515-46ed-90c6-5fe5d141fba8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.005822 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97952e7f-5262-40a4-8a14-0a881ce34703-service-ca-bundle\") pod \"router-default-5444994796-44mnp\" (UID: \"97952e7f-5262-40a4-8a14-0a881ce34703\") " pod="openshift-ingress/router-default-5444994796-44mnp"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.008707 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8-webhook-cert\") pod \"packageserver-d55dfcdfc-vqq6s\" (UID: \"f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.011279 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/c30c3a5c-a29e-48a7-b446-b68f9cce2742-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-zkc9n\" (UID: \"c30c3a5c-a29e-48a7-b446-b68f9cce2742\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.012062 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afb77d71-ded6-4158-a3fe-461336cece71-serving-cert\") pod \"controller-manager-879f6c89f-tmkn8\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.012220 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/a7440e0b-298a-4118-afc5-3c8fcb11eed7-srv-cert\") pod \"catalog-operator-68c6474976-7r26x\" (UID: \"a7440e0b-298a-4118-afc5-3c8fcb11eed7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.005829 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78152191-403f-476f-90ee-0342f60ba99c-metrics-tls\") pod \"dns-operator-744455d44c-nkh48\" (UID: \"78152191-403f-476f-90ee-0342f60ba99c\") " pod="openshift-dns-operator/dns-operator-744455d44c-nkh48"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.024912 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2x8\" (UniqueName: \"kubernetes.io/projected/c4653f10-7eaa-450c-881b-e074e4038d2f-kube-api-access-vh2x8\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.024952 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcsc5\" (UniqueName: \"kubernetes.io/projected/ac2bdbb7-a515-46ed-90c6-5fe5d141fba8-kube-api-access-zcsc5\") pod \"openshift-apiserver-operator-796bbdcf4f-4dx82\" (UID: \"ac2bdbb7-a515-46ed-90c6-5fe5d141fba8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.024982 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xmhn\" (UniqueName: \"kubernetes.io/projected/78152191-403f-476f-90ee-0342f60ba99c-kube-api-access-5xmhn\") pod \"dns-operator-744455d44c-nkh48\" (UID: \"78152191-403f-476f-90ee-0342f60ba99c\") " pod="openshift-dns-operator/dns-operator-744455d44c-nkh48"
\"kubernetes.io/secret/c4653f10-7eaa-450c-881b-e074e4038d2f-encryption-config\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.025045 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0980a54-cd9d-4daa-a5ac-7f86e447f646-serving-cert\") pod \"route-controller-manager-6576b87f9c-f2c9k\" (UID: \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.025071 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c30c3a5c-a29e-48a7-b446-b68f9cce2742-images\") pod \"machine-api-operator-5694c8668f-zkc9n\" (UID: \"c30c3a5c-a29e-48a7-b446-b68f9cce2742\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.025195 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxcbp\" (UniqueName: \"kubernetes.io/projected/a7440e0b-298a-4118-afc5-3c8fcb11eed7-kube-api-access-mxcbp\") pod \"catalog-operator-68c6474976-7r26x\" (UID: \"a7440e0b-298a-4118-afc5-3c8fcb11eed7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.025193 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c4653f10-7eaa-450c-881b-e074e4038d2f-audit-policies\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.025215 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8br68\" (UniqueName: \"kubernetes.io/projected/dcbb0f16-6000-4d64-ab71-a61c1b3b7063-kube-api-access-8br68\") pod \"service-ca-operator-777779d784-jmttf\" (UID: \"dcbb0f16-6000-4d64-ab71-a61c1b3b7063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmttf" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.025234 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97952e7f-5262-40a4-8a14-0a881ce34703-metrics-certs\") pod \"router-default-5444994796-44mnp\" (UID: \"97952e7f-5262-40a4-8a14-0a881ce34703\") " pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.025256 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3-node-bootstrap-token\") pod \"machine-config-server-qkm7n\" (UID: \"d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3\") " pod="openshift-machine-config-operator/machine-config-server-qkm7n" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.025277 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/347022cb-d24b-4f67-900e-c2b858cc49fc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.025311 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8-apiservice-cert\") pod \"packageserver-d55dfcdfc-vqq6s\" (UID: \"f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.025330 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/899ef710-299b-4178-850d-1da30747c924-socket-dir\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.025348 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.025370 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-client-ca\") pod \"controller-manager-879f6c89f-tmkn8\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.025387 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6f1e68c3-4a75-4336-bc1c-a5167d57c28a-signing-key\") pod \"service-ca-9c57cc56f-q2szc\" (UID: \"6f1e68c3-4a75-4336-bc1c-a5167d57c28a\") " pod="openshift-service-ca/service-ca-9c57cc56f-q2szc" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.025416 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-config\") pod \"controller-manager-879f6c89f-tmkn8\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.025423 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89dqx\" (UniqueName: \"kubernetes.io/projected/fdd489b9-775f-4a9a-b3de-8ac4d8fcf8fe-kube-api-access-89dqx\") pod \"control-plane-machine-set-operator-78cbb6b69f-rlrhw\" (UID: \"fdd489b9-775f-4a9a-b3de-8ac4d8fcf8fe\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rlrhw" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.026582 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c30c3a5c-a29e-48a7-b446-b68f9cce2742-images\") pod \"machine-api-operator-5694c8668f-zkc9n\" (UID: \"c30c3a5c-a29e-48a7-b446-b68f9cce2742\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.026583 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0980a54-cd9d-4daa-a5ac-7f86e447f646-client-ca\") pod \"route-controller-manager-6576b87f9c-f2c9k\" (UID: \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.026590 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/347022cb-d24b-4f67-900e-c2b858cc49fc-registry-certificates\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.027830 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5102f73f-dc76-4e60-9ed8-cc12efc46860-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-mltn4\" (UID: \"5102f73f-dc76-4e60-9ed8-cc12efc46860\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.029525 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-config\") pod \"controller-manager-879f6c89f-tmkn8\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.030274 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-client-ca\") pod \"controller-manager-879f6c89f-tmkn8\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.030734 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55374426-4ab4-4ce6-a180-6f449961e26d-serving-cert\") pod \"openshift-config-operator-7777fb866f-6pxm5\" (UID: \"55374426-4ab4-4ce6-a180-6f449961e26d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.031047 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/347022cb-d24b-4f67-900e-c2b858cc49fc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:49 crc kubenswrapper[4766]: E1002 10:53:49.031082 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:49.531062834 +0000 UTC m=+144.473933778 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.031959 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4653f10-7eaa-450c-881b-e074e4038d2f-serving-cert\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.035912 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c4653f10-7eaa-450c-881b-e074e4038d2f-encryption-config\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.028729 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.037531 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxlz6\" (UniqueName: \"kubernetes.io/projected/89c62d15-b27b-4722-95ec-9b9a76efa5d7-kube-api-access-vxlz6\") pod \"ingress-canary-lrsk8\" (UID: \"89c62d15-b27b-4722-95ec-9b9a76efa5d7\") " pod="openshift-ingress-canary/ingress-canary-lrsk8" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.037574 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3-certs\") pod \"machine-config-server-qkm7n\" (UID: \"d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3\") " pod="openshift-machine-config-operator/machine-config-server-qkm7n" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.037616 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-oauth-serving-cert\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.038088 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/066a8b65-b3f1-42c3-a989-33409b41f8dc-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d59w5\" (UID: \"066a8b65-b3f1-42c3-a989-33409b41f8dc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.038156 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
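[Editor's note] The E1002 entry above is the only error in this burst: a TearDown for the hostpath-provisioner PVC fails because the kubevirt.io.hostpath-provisioner CSI driver has not yet registered with the kubelet, and the operation is requeued with the 500ms backoff shown. A minimal sketch of how one might confirm driver registration from outside the node follows; it assumes client-go and a KUBECONFIG for this cluster, and that the CSINode object for the node crc roughly mirrors the kubelet's registered-driver list. The snippet is illustrative, not taken from the log.

```go
// List the CSI drivers recorded on the CSINode object for node "crc" and
// check whether kubevirt.io.hostpath-provisioner has registered yet.
package main

import (
	"context"
	"fmt"
	"os"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes KUBECONFIG points at the cluster in question.
	cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	node, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, d := range node.Spec.Drivers {
		fmt.Println("registered CSI driver:", d.Name)
	}
}
```

Once the driver's node plugin registers (normally via its node-driver-registrar sidecar), the queued TearDown retry should succeed.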
\"kubernetes.io/configmap/30c9f678-ccef-4c4d-b172-c5853e15ddd4-service-ca-bundle\") pod \"authentication-operator-69f744f599-d5v2m\" (UID: \"30c9f678-ccef-4c4d-b172-c5853e15ddd4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.038650 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30c9f678-ccef-4c4d-b172-c5853e15ddd4-config\") pod \"authentication-operator-69f744f599-d5v2m\" (UID: \"30c9f678-ccef-4c4d-b172-c5853e15ddd4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.038909 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30c9f678-ccef-4c4d-b172-c5853e15ddd4-serving-cert\") pod \"authentication-operator-69f744f599-d5v2m\" (UID: \"30c9f678-ccef-4c4d-b172-c5853e15ddd4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.039100 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/066a8b65-b3f1-42c3-a989-33409b41f8dc-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d59w5\" (UID: \"066a8b65-b3f1-42c3-a989-33409b41f8dc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.040086 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/347022cb-d24b-4f67-900e-c2b858cc49fc-trusted-ca\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.049349 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8-apiservice-cert\") pod \"packageserver-d55dfcdfc-vqq6s\" (UID: \"f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.049936 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/97952e7f-5262-40a4-8a14-0a881ce34703-stats-auth\") pod \"router-default-5444994796-44mnp\" (UID: \"97952e7f-5262-40a4-8a14-0a881ce34703\") " pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.051254 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/97952e7f-5262-40a4-8a14-0a881ce34703-default-certificate\") pod \"router-default-5444994796-44mnp\" (UID: \"97952e7f-5262-40a4-8a14-0a881ce34703\") " pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.052393 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0980a54-cd9d-4daa-a5ac-7f86e447f646-config\") pod \"route-controller-manager-6576b87f9c-f2c9k\" (UID: \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.054782 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0980a54-cd9d-4daa-a5ac-7f86e447f646-serving-cert\") pod \"route-controller-manager-6576b87f9c-f2c9k\" (UID: \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.063880 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/97952e7f-5262-40a4-8a14-0a881ce34703-metrics-certs\") pod \"router-default-5444994796-44mnp\" (UID: \"97952e7f-5262-40a4-8a14-0a881ce34703\") " pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.064304 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf77d\" (UniqueName: \"kubernetes.io/projected/97952e7f-5262-40a4-8a14-0a881ce34703-kube-api-access-qf77d\") pod \"router-default-5444994796-44mnp\" (UID: \"97952e7f-5262-40a4-8a14-0a881ce34703\") " pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.074759 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn4tt\" (UniqueName: \"kubernetes.io/projected/f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8-kube-api-access-xn4tt\") pod \"packageserver-d55dfcdfc-vqq6s\" (UID: \"f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.095854 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2qfd\" (UniqueName: \"kubernetes.io/projected/55374426-4ab4-4ce6-a180-6f449961e26d-kube-api-access-x2qfd\") pod \"openshift-config-operator-7777fb866f-6pxm5\" (UID: \"55374426-4ab4-4ce6-a180-6f449961e26d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.109763 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-k2sqh"] Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.120331 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp62p\" (UniqueName: \"kubernetes.io/projected/2ef5cc8f-ef34-4fd1-8765-7f41500898e6-kube-api-access-rp62p\") pod \"migrator-59844c95c7-qprrh\" (UID: \"2ef5cc8f-ef34-4fd1-8765-7f41500898e6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qprrh" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.136126 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wskrf\" (UniqueName: \"kubernetes.io/projected/afb77d71-ded6-4158-a3fe-461336cece71-kube-api-access-wskrf\") pod \"controller-manager-879f6c89f-tmkn8\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.139917 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhhv4\" (UniqueName: \"kubernetes.io/projected/899ef710-299b-4178-850d-1da30747c924-kube-api-access-hhhv4\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " 
pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.139946 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/899ef710-299b-4178-850d-1da30747c924-csi-data-dir\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.139962 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-oauth-config\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.139982 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140004 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpg74\" (UniqueName: \"kubernetes.io/projected/a57c2a77-db59-4b73-b376-640de2af9a7e-kube-api-access-tpg74\") pod \"marketplace-operator-79b997595-vkm2v\" (UID: \"a57c2a77-db59-4b73-b376-640de2af9a7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140029 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140053 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d24e91e7-dadf-4b67-be1b-a945b1250017-audit-dir\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140069 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6rzm\" (UniqueName: \"kubernetes.io/projected/d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3-kube-api-access-m6rzm\") pod \"machine-config-server-qkm7n\" (UID: \"d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3\") " pod="openshift-machine-config-operator/machine-config-server-qkm7n" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140111 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-serving-cert\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140126 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/6f1e68c3-4a75-4336-bc1c-a5167d57c28a-signing-cabundle\") pod \"service-ca-9c57cc56f-q2szc\" (UID: \"6f1e68c3-4a75-4336-bc1c-a5167d57c28a\") " pod="openshift-service-ca/service-ca-9c57cc56f-q2szc" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140143 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140162 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/899ef710-299b-4178-850d-1da30747c924-mountpoint-dir\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140176 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/899ef710-299b-4178-850d-1da30747c924-plugins-dir\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140197 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140233 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140261 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140283 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-audit-policies\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140341 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1030ba1a-c14b-4091-8417-b2dcbd287b97-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ttsj7\" (UID: \"1030ba1a-c14b-4091-8417-b2dcbd287b97\") 
" pod="openshift-multus/multus-admission-controller-857f4d67dd-ttsj7" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140360 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140378 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-config\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140399 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-trusted-ca-bundle\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140649 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140702 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-service-ca\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140728 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a57c2a77-db59-4b73-b376-640de2af9a7e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vkm2v\" (UID: \"a57c2a77-db59-4b73-b376-640de2af9a7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140744 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z8w5\" (UniqueName: \"kubernetes.io/projected/1030ba1a-c14b-4091-8417-b2dcbd287b97-kube-api-access-4z8w5\") pod \"multus-admission-controller-857f4d67dd-ttsj7\" (UID: \"1030ba1a-c14b-4091-8417-b2dcbd287b97\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ttsj7" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140767 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140784 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-t4mpr\" (UniqueName: \"kubernetes.io/projected/6f1e68c3-4a75-4336-bc1c-a5167d57c28a-kube-api-access-t4mpr\") pod \"service-ca-9c57cc56f-q2szc\" (UID: \"6f1e68c3-4a75-4336-bc1c-a5167d57c28a\") " pod="openshift-service-ca/service-ca-9c57cc56f-q2szc" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140805 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a57c2a77-db59-4b73-b376-640de2af9a7e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vkm2v\" (UID: \"a57c2a77-db59-4b73-b376-640de2af9a7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140823 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/899ef710-299b-4178-850d-1da30747c924-registration-dir\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140837 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stmtk\" (UniqueName: \"kubernetes.io/projected/d24e91e7-dadf-4b67-be1b-a945b1250017-kube-api-access-stmtk\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140867 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8h42\" (UniqueName: \"kubernetes.io/projected/581ea4c4-072a-4bba-afc9-2f82918ac0c9-kube-api-access-f8h42\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140901 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3-node-bootstrap-token\") pod \"machine-config-server-qkm7n\" (UID: \"d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3\") " pod="openshift-machine-config-operator/machine-config-server-qkm7n" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140920 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/899ef710-299b-4178-850d-1da30747c924-socket-dir\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140934 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140949 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6f1e68c3-4a75-4336-bc1c-a5167d57c28a-signing-key\") pod \"service-ca-9c57cc56f-q2szc\" (UID: \"6f1e68c3-4a75-4336-bc1c-a5167d57c28a\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-q2szc" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.140995 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.141017 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3-certs\") pod \"machine-config-server-qkm7n\" (UID: \"d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3\") " pod="openshift-machine-config-operator/machine-config-server-qkm7n" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.141035 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-oauth-serving-cert\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.141063 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.146449 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-config\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.147319 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.147948 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d24e91e7-dadf-4b67-be1b-a945b1250017-audit-dir\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.148361 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6f1e68c3-4a75-4336-bc1c-a5167d57c28a-signing-cabundle\") pod \"service-ca-9c57cc56f-q2szc\" (UID: \"6f1e68c3-4a75-4336-bc1c-a5167d57c28a\") " pod="openshift-service-ca/service-ca-9c57cc56f-q2szc" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.148372 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-service-ca\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.148427 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/899ef710-299b-4178-850d-1da30747c924-csi-data-dir\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.148732 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/899ef710-299b-4178-850d-1da30747c924-mountpoint-dir\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.148835 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/899ef710-299b-4178-850d-1da30747c924-plugins-dir\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:49 crc kubenswrapper[4766]: E1002 10:53:49.149356 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:49.64933246 +0000 UTC m=+144.592203474 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.149398 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a57c2a77-db59-4b73-b376-640de2af9a7e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vkm2v\" (UID: \"a57c2a77-db59-4b73-b376-640de2af9a7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.149467 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/899ef710-299b-4178-850d-1da30747c924-registration-dir\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.149976 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.150059 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/899ef710-299b-4178-850d-1da30747c924-socket-dir\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.150833 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-audit-policies\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.150933 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-oauth-serving-cert\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.151001 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-trusted-ca-bundle\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.151490 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: W1002 10:53:49.152713 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod143c6751_acab_4e56_9e54_b0e4dc6ae562.slice/crio-9ff8e3825030802c04352975ebf4c486f51b157dfbdd567dbec027a287550e1e WatchSource:0}: Error finding container 9ff8e3825030802c04352975ebf4c486f51b157dfbdd567dbec027a287550e1e: Status 404 returned error can't find the container with id 9ff8e3825030802c04352975ebf4c486f51b157dfbdd567dbec027a287550e1e Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.153574 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.153965 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.155282 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.155761 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.155867 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.156398 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3-certs\") pod \"machine-config-server-qkm7n\" (UID: \"d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3\") " pod="openshift-machine-config-operator/machine-config-server-qkm7n" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.156752 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.158099 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a57c2a77-db59-4b73-b376-640de2af9a7e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vkm2v\" (UID: \"a57c2a77-db59-4b73-b376-640de2af9a7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.159377 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-serving-cert\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.159475 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.159903 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: 
\"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.159629 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3-node-bootstrap-token\") pod \"machine-config-server-qkm7n\" (UID: \"d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3\") " pod="openshift-machine-config-operator/machine-config-server-qkm7n" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.159618 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6wk6\" (UniqueName: \"kubernetes.io/projected/c30c3a5c-a29e-48a7-b446-b68f9cce2742-kube-api-access-s6wk6\") pod \"machine-api-operator-5694c8668f-zkc9n\" (UID: \"c30c3a5c-a29e-48a7-b446-b68f9cce2742\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.160294 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-oauth-config\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.161116 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6"] Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.162965 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6f1e68c3-4a75-4336-bc1c-a5167d57c28a-signing-key\") pod \"service-ca-9c57cc56f-q2szc\" (UID: \"6f1e68c3-4a75-4336-bc1c-a5167d57c28a\") " pod="openshift-service-ca/service-ca-9c57cc56f-q2szc" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.174300 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/1030ba1a-c14b-4091-8417-b2dcbd287b97-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ttsj7\" (UID: \"1030ba1a-c14b-4091-8417-b2dcbd287b97\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ttsj7" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.174654 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgh4q\" (UniqueName: \"kubernetes.io/projected/066a8b65-b3f1-42c3-a989-33409b41f8dc-kube-api-access-cgh4q\") pod \"openshift-controller-manager-operator-756b6f6bc6-d59w5\" (UID: \"066a8b65-b3f1-42c3-a989-33409b41f8dc\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.191540 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz4lb\" (UniqueName: \"kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-kube-api-access-bz4lb\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.206360 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-qhc8r"] Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.211067 4766 util.go:30] "No sandbox for pod can be found. 
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.214574 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kzhj\" (UniqueName: \"kubernetes.io/projected/d0980a54-cd9d-4daa-a5ac-7f86e447f646-kube-api-access-2kzhj\") pod \"route-controller-manager-6576b87f9c-f2c9k\" (UID: \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.223032 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qprrh"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.231131 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx7fr\" (UniqueName: \"kubernetes.io/projected/743fe4dc-299d-4f28-9448-644d12db4af7-kube-api-access-sx7fr\") pod \"downloads-7954f5f757-xqcd7\" (UID: \"743fe4dc-299d-4f28-9448-644d12db4af7\") " pod="openshift-console/downloads-7954f5f757-xqcd7"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.241856 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:49 crc kubenswrapper[4766]: E1002 10:53:49.242296 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:49.742281999 +0000 UTC m=+144.685152943 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.251471 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r767l\" (UniqueName: \"kubernetes.io/projected/5102f73f-dc76-4e60-9ed8-cc12efc46860-kube-api-access-r767l\") pod \"cluster-image-registry-operator-dc59b4c8b-mltn4\" (UID: \"5102f73f-dc76-4e60-9ed8-cc12efc46860\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.266381 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nm9mf"]
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.285406 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-44mnp"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.288961 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-bound-sa-token\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.299654 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.307742 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q88f7\" (UniqueName: \"kubernetes.io/projected/30c9f678-ccef-4c4d-b172-c5853e15ddd4-kube-api-access-q88f7\") pod \"authentication-operator-69f744f599-d5v2m\" (UID: \"30c9f678-ccef-4c4d-b172-c5853e15ddd4\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.314458 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.330007 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq5lh\" (UniqueName: \"kubernetes.io/projected/5fcce406-28bd-4526-9a8e-fe2381ce20a2-kube-api-access-fq5lh\") pod \"kube-storage-version-migrator-operator-b67b599dd-flmqc\" (UID: \"5fcce406-28bd-4526-9a8e-fe2381ce20a2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.334960 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.343214 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:49 crc kubenswrapper[4766]: E1002 10:53:49.343553 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:49.843537481 +0000 UTC m=+144.786408425 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.345616 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.350254 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5102f73f-dc76-4e60-9ed8-cc12efc46860-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-mltn4\" (UID: \"5102f73f-dc76-4e60-9ed8-cc12efc46860\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.369242 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8br68\" (UniqueName: \"kubernetes.io/projected/dcbb0f16-6000-4d64-ab71-a61c1b3b7063-kube-api-access-8br68\") pod \"service-ca-operator-777779d784-jmttf\" (UID: \"dcbb0f16-6000-4d64-ab71-a61c1b3b7063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmttf"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.374678 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.388412 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.393947 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcsc5\" (UniqueName: \"kubernetes.io/projected/ac2bdbb7-a515-46ed-90c6-5fe5d141fba8-kube-api-access-zcsc5\") pod \"openshift-apiserver-operator-796bbdcf4f-4dx82\" (UID: \"ac2bdbb7-a515-46ed-90c6-5fe5d141fba8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.406406 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.412407 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2x8\" (UniqueName: \"kubernetes.io/projected/c4653f10-7eaa-450c-881b-e074e4038d2f-kube-api-access-vh2x8\") pod \"apiserver-7bbb656c7d-7b6dx\" (UID: \"c4653f10-7eaa-450c-881b-e074e4038d2f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.417728 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.430965 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxcbp\" (UniqueName: \"kubernetes.io/projected/a7440e0b-298a-4118-afc5-3c8fcb11eed7-kube-api-access-mxcbp\") pod \"catalog-operator-68c6474976-7r26x\" (UID: \"a7440e0b-298a-4118-afc5-3c8fcb11eed7\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.431044 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rlrhw"]
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.431721 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwrx"]
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.435001 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmttf"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.438347 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.440041 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl"]
Oct 02 10:53:49 crc kubenswrapper[4766]: W1002 10:53:49.443102 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97952e7f_5262_40a4_8a14_0a881ce34703.slice/crio-ff3be479bd78556cc3700d56b222c91af13282783df03c064e3dd4f3339a6bdc WatchSource:0}: Error finding container ff3be479bd78556cc3700d56b222c91af13282783df03c064e3dd4f3339a6bdc: Status 404 returned error can't find the container with id ff3be479bd78556cc3700d56b222c91af13282783df03c064e3dd4f3339a6bdc
Oct 02 10:53:49 crc kubenswrapper[4766]: W1002 10:53:49.443653 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdd489b9_775f_4a9a_b3de_8ac4d8fcf8fe.slice/crio-b9a3216a4e862280b3d00070011e3c4bd9cf3d46cb7b715066a7a584db42f569 WatchSource:0}: Error finding container b9a3216a4e862280b3d00070011e3c4bd9cf3d46cb7b715066a7a584db42f569: Status 404 returned error can't find the container with id b9a3216a4e862280b3d00070011e3c4bd9cf3d46cb7b715066a7a584db42f569
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.443849 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:49 crc kubenswrapper[4766]: E1002 10:53:49.443971 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:49.943954408 +0000 UTC m=+144.886825352 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.444041 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:49 crc kubenswrapper[4766]: E1002 10:53:49.444316 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:49.944308058 +0000 UTC m=+144.887178992 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.451309 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xmhn\" (UniqueName: \"kubernetes.io/projected/78152191-403f-476f-90ee-0342f60ba99c-kube-api-access-5xmhn\") pod \"dns-operator-744455d44c-nkh48\" (UID: \"78152191-403f-476f-90ee-0342f60ba99c\") " pod="openshift-dns-operator/dns-operator-744455d44c-nkh48"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.461436 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-xqcd7"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.469518 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxlz6\" (UniqueName: \"kubernetes.io/projected/89c62d15-b27b-4722-95ec-9b9a76efa5d7-kube-api-access-vxlz6\") pod \"ingress-canary-lrsk8\" (UID: \"89c62d15-b27b-4722-95ec-9b9a76efa5d7\") " pod="openshift-ingress-canary/ingress-canary-lrsk8"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.475932 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nkh48"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.483919 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-lrsk8"
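The nestedpendingoperations.go errors threaded through this stretch all carry "No retries permitted until ... (durationBeforeRetry 500ms)": every failed mount or unmount of the PVC is parked with a retry window before the reconciler may try again. Note the two distinct operation keys visible in the log itself: the MountDevice failures carry podName: (empty), while the TearDown failures carry podName:8f668bae-612b-4b75-9490-919e737c6a3b, and each backs off independently. In this capture the window shown is always the initial 500ms; upstream kubelet widens the window exponentially for repeated failures of the same key, so the doubling and the cap in the sketch below are assumptions about behaviour this log does not itself demonstrate:

// A sketch of the per-operation backoff that produces
// "No retries permitted until ... (durationBeforeRetry 500ms)".
// The 500ms initial window matches the log; the doubling and the cap
// mirror upstream kubelet's exponential backoff as I understand it and
// should be treated as assumptions for this particular version.
package main

import (
	"fmt"
	"time"
)

const (
	initialDurationBeforeRetry = 500 * time.Millisecond
	maxDurationBeforeRetry     = 2*time.Minute + 2*time.Second // assumed cap
)

type exponentialBackoff struct {
	lastErrorTime       time.Time
	durationBeforeRetry time.Duration
}

// update records a failure and widens the retry window.
func (b *exponentialBackoff) update(now time.Time) {
	if b.durationBeforeRetry == 0 {
		b.durationBeforeRetry = initialDurationBeforeRetry
	} else {
		b.durationBeforeRetry *= 2
		if b.durationBeforeRetry > maxDurationBeforeRetry {
			b.durationBeforeRetry = maxDurationBeforeRetry
		}
	}
	b.lastErrorTime = now
}

// safeToRetry reports whether the retry window has elapsed.
func (b *exponentialBackoff) safeToRetry(now time.Time) bool {
	return now.After(b.lastErrorTime.Add(b.durationBeforeRetry))
}

func main() {
	var b exponentialBackoff
	for i := 1; i <= 4; i++ {
		b.update(time.Now())
		fmt.Printf("failure %d: no retries permitted for %v\n", i, b.durationBeforeRetry)
	}
}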
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.489737 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qprrh"]
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.490395 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stmtk\" (UniqueName: \"kubernetes.io/projected/d24e91e7-dadf-4b67-be1b-a945b1250017-kube-api-access-stmtk\") pod \"oauth-openshift-558db77b4-m2kx2\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.507616 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpg74\" (UniqueName: \"kubernetes.io/projected/a57c2a77-db59-4b73-b376-640de2af9a7e-kube-api-access-tpg74\") pod \"marketplace-operator-79b997595-vkm2v\" (UID: \"a57c2a77-db59-4b73-b376-640de2af9a7e\") " pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.516993 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2"
Oct 02 10:53:49 crc kubenswrapper[4766]: W1002 10:53:49.517197 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ef5cc8f_ef34_4fd1_8765_7f41500898e6.slice/crio-58481ef0f8e27f4fe8e5a95823c40a41fa5e9a448ae57c40e352aa4422fdaaed WatchSource:0}: Error finding container 58481ef0f8e27f4fe8e5a95823c40a41fa5e9a448ae57c40e352aa4422fdaaed: Status 404 returned error can't find the container with id 58481ef0f8e27f4fe8e5a95823c40a41fa5e9a448ae57c40e352aa4422fdaaed
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.528899 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6rzm\" (UniqueName: \"kubernetes.io/projected/d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3-kube-api-access-m6rzm\") pod \"machine-config-server-qkm7n\" (UID: \"d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3\") " pod="openshift-machine-config-operator/machine-config-server-qkm7n"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.548263 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:49 crc kubenswrapper[4766]: E1002 10:53:49.548707 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:50.048687105 +0000 UTC m=+144.991558059 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.548825 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.551082 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8h42\" (UniqueName: \"kubernetes.io/projected/581ea4c4-072a-4bba-afc9-2f82918ac0c9-kube-api-access-f8h42\") pod \"console-f9d7485db-8z2x2\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " pod="openshift-console/console-f9d7485db-8z2x2"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.567865 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhhv4\" (UniqueName: \"kubernetes.io/projected/899ef710-299b-4178-850d-1da30747c924-kube-api-access-hhhv4\") pod \"csi-hostpathplugin-rbh2d\" (UID: \"899ef710-299b-4178-850d-1da30747c924\") " pod="hostpath-provisioner/csi-hostpathplugin-rbh2d"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.571136 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-rbh2d"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.576056 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qkm7n"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.589903 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4mpr\" (UniqueName: \"kubernetes.io/projected/6f1e68c3-4a75-4336-bc1c-a5167d57c28a-kube-api-access-t4mpr\") pod \"service-ca-9c57cc56f-q2szc\" (UID: \"6f1e68c3-4a75-4336-bc1c-a5167d57c28a\") " pod="openshift-service-ca/service-ca-9c57cc56f-q2szc"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.607282 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.608378 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-zkc9n"]
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.609637 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z8w5\" (UniqueName: \"kubernetes.io/projected/1030ba1a-c14b-4091-8417-b2dcbd287b97-kube-api-access-4z8w5\") pod \"multus-admission-controller-857f4d67dd-ttsj7\" (UID: \"1030ba1a-c14b-4091-8417-b2dcbd287b97\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ttsj7"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.623539 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.650131 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:49 crc kubenswrapper[4766]: E1002 10:53:49.650444 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:50.150433085 +0000 UTC m=+145.093304029 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.701735 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh" event={"ID":"143c6751-acab-4e56-9e54-b0e4dc6ae562","Type":"ContainerStarted","Data":"9ff8e3825030802c04352975ebf4c486f51b157dfbdd567dbec027a287550e1e"}
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.721082 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" event={"ID":"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3","Type":"ContainerStarted","Data":"62fbaa3799778e6d460cc44e4de2550bf05297208252417143f34ae13baa10dc"}
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.727918 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x"
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.740101 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rlrhw" event={"ID":"fdd489b9-775f-4a9a-b3de-8ac4d8fcf8fe","Type":"ContainerStarted","Data":"b9a3216a4e862280b3d00070011e3c4bd9cf3d46cb7b715066a7a584db42f569"}
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.745636 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c" event={"ID":"c6a20802-d5f8-4b5f-8655-410dc9bd8aa7","Type":"ContainerStarted","Data":"e24d3664f9678789f62309c54c5b2546e6fcf74023655ac2d708292f40c45cc0"}
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.751243 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:49 crc kubenswrapper[4766]: E1002 10:53:49.751739 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:50.25171842 +0000 UTC m=+145.194589364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.759790 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tmkn8"]
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.760029 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw" event={"ID":"1ff4a13b-a07e-4031-a1fb-ba29027332e8","Type":"ContainerStarted","Data":"3ba7ac379616df09fb634de6ac783ec794b2c60f3ae62cc87c57405d03d03460"}
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.761748 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w" event={"ID":"6e1d0411-55ac-4287-b19f-d6c46444434b","Type":"ContainerStarted","Data":"6fda4cdbb7dde424579550c20281861acc04e7319d5f2cad0d2c1658fe676b06"}
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.763051 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj" event={"ID":"786673c7-6fb8-4b0d-864a-ea29fa681de6","Type":"ContainerStarted","Data":"febfe72ed3eb765bf93db7a04461134e086bce28ac35cb378f0731e5a785736d"}
Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.763830 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-44mnp" event={"ID":"97952e7f-5262-40a4-8a14-0a881ce34703","Type":"ContainerStarted","Data":"ff3be479bd78556cc3700d56b222c91af13282783df03c064e3dd4f3339a6bdc"}
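The "SyncLoop (PLEG)" burst around this point is the pod lifecycle event generator relisting the runtime and reporting one ContainerStarted event per new container or sandbox; the payload after event= is plain JSON. For mining a dump like this one, a small Go filter can pull out pod, event type, and container ID (the regular expression and the struct are inferred from these lines as an analysis aid, not a kubelet API):

// plegfilter.go: extract "SyncLoop (PLEG)" events from a journal dump on stdin.
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
	"regexp"
)

type plegEvent struct {
	ID   string `json:"ID"`   // pod UID
	Type string `json:"Type"` // e.g. ContainerStarted
	Data string `json:"Data"` // container (or sandbox) ID
}

var plegRe = regexp.MustCompile(`"SyncLoop \(PLEG\): event for pod" pod="([^"]+)" event=(\{[^}]*\})`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		m := plegRe.FindStringSubmatch(sc.Text())
		if m == nil {
			continue
		}
		var ev plegEvent
		if err := json.Unmarshal([]byte(m[2]), &ev); err != nil {
			continue
		}
		fmt.Printf("%-60s %-18s %s\n", m[1], ev.Type, ev.Data)
	}
}

Fed this section of the journal (for example, piped from journalctl), it prints one line per pod event, which makes it easy to see that the bare hex IDs in these entries match the crio-... cgroup names in the earlier "Failed to process watch event" warnings: cAdvisor's cgroup watcher simply noticed the new sandboxes before the runtime could answer for them, hence the transient 404s.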
event={"ID":"97952e7f-5262-40a4-8a14-0a881ce34703","Type":"ContainerStarted","Data":"ff3be479bd78556cc3700d56b222c91af13282783df03c064e3dd4f3339a6bdc"} Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.766596 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nm9mf" event={"ID":"641e2cb7-16f3-4339-ace8-1a5d6b921841","Type":"ContainerStarted","Data":"51bca2cd83a5d1092f2e4d2145005309fa8a27e559a9eb4e04c273c07b9cc5fe"} Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.769534 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qprrh" event={"ID":"2ef5cc8f-ef34-4fd1-8765-7f41500898e6","Type":"ContainerStarted","Data":"58481ef0f8e27f4fe8e5a95823c40a41fa5e9a448ae57c40e352aa4422fdaaed"} Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.770582 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq" event={"ID":"730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae","Type":"ContainerStarted","Data":"3f46186e7731e05b80d896142a514fa33b0bd4b259b3dab0f98841b42151f43b"} Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.779356 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5" event={"ID":"fc982890-ee1e-4482-8c17-0c5b11583ce2","Type":"ContainerStarted","Data":"be16fe1ce244df32d4bedc2a003c7daa09f094e17ced8ec314b5d55cf5983b67"} Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.779393 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5" event={"ID":"fc982890-ee1e-4482-8c17-0c5b11583ce2","Type":"ContainerStarted","Data":"f876d2c95de834c70cda08f114a7c7420bdf15239044b3b83c90a8a2f09e3677"} Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.800745 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl" event={"ID":"6d87787d-4605-4895-a5bd-a3820dd38fae","Type":"ContainerStarted","Data":"6973830b927d1c267b83bea17a7c12af9230e7ce556abeff7e3c9c049fb47061"} Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.821563 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6" event={"ID":"ff3698e3-0db7-4a46-8244-ec9486c9ed48","Type":"ContainerStarted","Data":"9686ea099397854ac06a7a04f653904c46bb2f398dbbed7555093539cd173708"} Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.826919 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zvx8j" event={"ID":"8fb9e589-a259-4ff9-9d1b-198b57fb179b","Type":"ContainerStarted","Data":"d492c7bbe225ea64a35887c06ed23ea0148d804cfa3ba965e11840e80e17ce36"} Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.829074 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ttsj7" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.829983 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns" event={"ID":"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac","Type":"ContainerStarted","Data":"b4f92a7af89f82948375a99c653fe3353833585c19922ad38b613636d7b14f50"} Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.836818 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.851735 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-q2szc" Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.852874 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:49 crc kubenswrapper[4766]: E1002 10:53:49.853351 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:50.353318504 +0000 UTC m=+145.296189448 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:49 crc kubenswrapper[4766]: W1002 10:53:49.905925 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb77d71_ded6_4158_a3fe_461336cece71.slice/crio-85b202174382d1a651104786d8ceb570005debc3028a814727e335e02cff6d6c WatchSource:0}: Error finding container 85b202174382d1a651104786d8ceb570005debc3028a814727e335e02cff6d6c: Status 404 returned error can't find the container with id 85b202174382d1a651104786d8ceb570005debc3028a814727e335e02cff6d6c Oct 02 10:53:49 crc kubenswrapper[4766]: I1002 10:53:49.953615 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:49 crc kubenswrapper[4766]: E1002 10:53:49.955378 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:50.455362293 +0000 UTC m=+145.398233237 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.017734 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5"] Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.055347 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:50 crc kubenswrapper[4766]: E1002 10:53:50.056020 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:50.556002646 +0000 UTC m=+145.498873660 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.156428 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:50 crc kubenswrapper[4766]: E1002 10:53:50.156818 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:50.656800194 +0000 UTC m=+145.599671138 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.258556 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:50 crc kubenswrapper[4766]: E1002 10:53:50.258825 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:50.758814812 +0000 UTC m=+145.701685746 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.362457 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:50 crc kubenswrapper[4766]: E1002 10:53:50.363454 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:50.863435777 +0000 UTC m=+145.806306721 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.381933 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s"] Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.384733 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5"] Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.412847 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-d5v2m"] Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.464245 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:50 crc kubenswrapper[4766]: E1002 10:53:50.464644 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:50.964623978 +0000 UTC m=+145.907494922 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.572362 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:50 crc kubenswrapper[4766]: E1002 10:53:50.572559 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:51.072531261 +0000 UTC m=+146.015402235 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.676093 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:50 crc kubenswrapper[4766]: E1002 10:53:50.676421 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:51.176410861 +0000 UTC m=+146.119281805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.739304 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4"] Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.757758 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k"] Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.777050 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:50 crc kubenswrapper[4766]: E1002 10:53:50.777374 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:51.277360135 +0000 UTC m=+146.220231069 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.842574 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nm9mf" event={"ID":"641e2cb7-16f3-4339-ace8-1a5d6b921841","Type":"ContainerStarted","Data":"772cc1ade2e8a05c2f0128c550af413ea7e79ba8648338e1b5947b0fbce15d4e"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.844651 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-nm9mf" Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.851282 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj" event={"ID":"786673c7-6fb8-4b0d-864a-ea29fa681de6","Type":"ContainerStarted","Data":"22f8ffbafd8270b5cd0cdbccdc43b9975d0a5c43487e3f52a0fee033356db63c"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.854561 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m" event={"ID":"30c9f678-ccef-4c4d-b172-c5853e15ddd4","Type":"ContainerStarted","Data":"482cce904817e97375449fc4b3c1aae87ffad491244d1b5f80397e2938c09a4a"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.871712 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rlrhw" event={"ID":"fdd489b9-775f-4a9a-b3de-8ac4d8fcf8fe","Type":"ContainerStarted","Data":"a1995407d8279f6f81ba667ad828d2b5ace7c70aae93c31ccbb390541e1466b0"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.878806 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:50 crc kubenswrapper[4766]: E1002 10:53:50.879251 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:51.379238349 +0000 UTC m=+146.322109293 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.880035 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl" event={"ID":"6d87787d-4605-4895-a5bd-a3820dd38fae","Type":"ContainerStarted","Data":"6f568c17489efe87e9ad5240908f30091ea2831bf46777632c497d5e43b743b5"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.880645 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl" Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.885984 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zvx8j" event={"ID":"8fb9e589-a259-4ff9-9d1b-198b57fb179b","Type":"ContainerStarted","Data":"d5a9fca5aebb5e775f836b50eaf3ed669719297a07a5e4b3f49d03e59ec01821"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.890222 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5" event={"ID":"55374426-4ab4-4ce6-a180-6f449961e26d","Type":"ContainerStarted","Data":"b5b8a308d422d02656895c79d0afdb1bdf7c2b89f22a46542efcf08ca7c914da"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.892712 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw" event={"ID":"1ff4a13b-a07e-4031-a1fb-ba29027332e8","Type":"ContainerStarted","Data":"6746ac41a6f32f365edf8de1b574358a4ae46fbd7a32f9ca7272d4a85843132c"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.893222 4766 patch_prober.go:28] interesting pod/console-operator-58897d9998-nm9mf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/readyz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.893265 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-nm9mf" podUID="641e2cb7-16f3-4339-ace8-1a5d6b921841" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/readyz\": dial tcp 10.217.0.27:8443: connect: connection refused" Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.894472 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qprrh" event={"ID":"2ef5cc8f-ef34-4fd1-8765-7f41500898e6","Type":"ContainerStarted","Data":"7927a6a1b7148eb6f63cee106ba8214fda6dbb84ab1fee675a535f8693defebc"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.901910 4766 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qz9xl container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body= Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.901961 4766 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl" podUID="6d87787d-4605-4895-a5bd-a3820dd38fae" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.902274 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" event={"ID":"f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8","Type":"ContainerStarted","Data":"b07991d64f403a7a1e9ccffaacd5061618bdddfa0c2294854d70a374c9f8d233"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.907110 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwrx" event={"ID":"df3eaaf0-38f3-4334-adaa-bcdc6b4409bc","Type":"ContainerStarted","Data":"06821ffada443fb897d7b873a9a4d7a5627751be86529b78e552f8c20fc92a32"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.907351 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwrx" event={"ID":"df3eaaf0-38f3-4334-adaa-bcdc6b4409bc","Type":"ContainerStarted","Data":"0115c5238b568b21fb947e75929e401a0951bbd6bad6e0879d3e82cda83fc72e"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.924110 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6" event={"ID":"ff3698e3-0db7-4a46-8244-ec9486c9ed48","Type":"ContainerStarted","Data":"2ba8b034df2c3755e168ad69d6abe34de41917dd489a1e65cb000883f3a9dda5"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.929836 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" event={"ID":"afb77d71-ded6-4158-a3fe-461336cece71","Type":"ContainerStarted","Data":"85b202174382d1a651104786d8ceb570005debc3028a814727e335e02cff6d6c"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.938025 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n" event={"ID":"c30c3a5c-a29e-48a7-b446-b68f9cce2742","Type":"ContainerStarted","Data":"908ab053d8162d484e72d0f16b7c4e6beec22d026700f8b8ec9c50f7ed28f0af"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.957816 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w" event={"ID":"6e1d0411-55ac-4287-b19f-d6c46444434b","Type":"ContainerStarted","Data":"1d9d9dd994e29db89486d5931fdb99e410b10794d687219974bc27ce541bbee4"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.962847 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-44mnp" event={"ID":"97952e7f-5262-40a4-8a14-0a881ce34703","Type":"ContainerStarted","Data":"973cc5f0a6f2c6c47d3b402fa5d3dea52c3f4b013f2f90ce6414793a15067aa5"} Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.979996 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:50 crc kubenswrapper[4766]: E1002 10:53:50.981178 4766 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:51.481159734 +0000 UTC m=+146.424030678 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.988732 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-lrsk8"] Oct 02 10:53:50 crc kubenswrapper[4766]: I1002 10:53:50.991280 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m2kx2"] Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.044552 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh" event={"ID":"143c6751-acab-4e56-9e54-b0e4dc6ae562","Type":"ContainerStarted","Data":"a4720e8908dfc098e0523d130f8233d932d19f38de941df8c08156991cff052a"} Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.083748 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.084817 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-jmttf"] Oct 02 10:53:51 crc kubenswrapper[4766]: E1002 10:53:51.085207 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:51.585187799 +0000 UTC m=+146.528058743 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.096768 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vkm2v"] Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.113204 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82"] Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.113249 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nkh48"] Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.128493 4766 generic.go:334] "Generic (PLEG): container finished" podID="e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3" containerID="c9454806e5a571a9449bfe81a0615ca27c973ab074132a259d59abadce5089a4" exitCode=0 Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.128612 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" event={"ID":"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3","Type":"ContainerDied","Data":"c9454806e5a571a9449bfe81a0615ca27c973ab074132a259d59abadce5089a4"} Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.146630 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh" event={"ID":"900aedf6-0ce4-429f-9d04-2776a8625593","Type":"ContainerStarted","Data":"f7f83959f555c16011f65547ab5b5fca3ca7cb04542c96431532dc8d6d17d235"} Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.154315 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5" event={"ID":"066a8b65-b3f1-42c3-a989-33409b41f8dc","Type":"ContainerStarted","Data":"ad652ddb2468c33ed7058d906ff6abf219fee5778b0094b573330300ca4112c5"} Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.161042 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qkm7n" event={"ID":"d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3","Type":"ContainerStarted","Data":"0f7a7d4689a1ebe0394a5b310fea51098590590467c0bb3c9eba57d9c1fe0e25"} Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.172953 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c" event={"ID":"c6a20802-d5f8-4b5f-8655-410dc9bd8aa7","Type":"ContainerStarted","Data":"07a77b00dadce5ca2ac03a4e6e33cd8e3fe8e557422f775dafe086c03f827c7a"} Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.175554 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq" event={"ID":"730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae","Type":"ContainerStarted","Data":"03d281678dacd75f9e7e5c27a393451fc9fba1371b2da066db7af047979eb644"} Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.184985 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:51 crc kubenswrapper[4766]: E1002 10:53:51.186040 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:51.686021888 +0000 UTC m=+146.628892832 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.188486 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rlrhw" podStartSLOduration=124.188469059 podStartE2EDuration="2m4.188469059s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:51.186897457 +0000 UTC m=+146.129768401" watchObservedRunningTime="2025-10-02 10:53:51.188469059 +0000 UTC m=+146.131340003" Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.225293 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc"] Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.230476 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-rbh2d"] Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.230690 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-44mnp" podStartSLOduration=124.230678967 podStartE2EDuration="2m4.230678967s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:51.214971177 +0000 UTC m=+146.157842141" watchObservedRunningTime="2025-10-02 10:53:51.230678967 +0000 UTC m=+146.173549911" Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.238774 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-nm9mf" podStartSLOduration=124.238767785 podStartE2EDuration="2m4.238767785s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:51.238447115 +0000 UTC m=+146.181318059" watchObservedRunningTime="2025-10-02 10:53:51.238767785 +0000 UTC m=+146.181638729" Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.251646 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns" podStartSLOduration=123.251631421 
podStartE2EDuration="2m3.251631421s" podCreationTimestamp="2025-10-02 10:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:51.250128771 +0000 UTC m=+146.192999715" watchObservedRunningTime="2025-10-02 10:53:51.251631421 +0000 UTC m=+146.194502365" Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.286670 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:51 crc kubenswrapper[4766]: E1002 10:53:51.288222 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:51.788210113 +0000 UTC m=+146.731081057 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.288298 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.289543 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.289587 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.299644 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ttsj7"] Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.307064 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx"] Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.349635 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl" podStartSLOduration=124.349615996 podStartE2EDuration="2m4.349615996s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:51.327275226 +0000 UTC m=+146.270146180" watchObservedRunningTime="2025-10-02 10:53:51.349615996 +0000 UTC m=+146.292486940" Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.352428 4766 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xqcd7"] Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.358177 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-q2szc"] Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.372173 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x"] Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.372388 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-k2sqh" podStartSLOduration=124.37236542 podStartE2EDuration="2m4.37236542s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:51.369010868 +0000 UTC m=+146.311881822" watchObservedRunningTime="2025-10-02 10:53:51.37236542 +0000 UTC m=+146.315236364" Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.388187 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:51 crc kubenswrapper[4766]: E1002 10:53:51.388608 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:51.888589307 +0000 UTC m=+146.831460251 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.406748 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-mm6nw" podStartSLOduration=124.406726797 podStartE2EDuration="2m4.406726797s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:51.406376036 +0000 UTC m=+146.349246980" watchObservedRunningTime="2025-10-02 10:53:51.406726797 +0000 UTC m=+146.349597741" Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.450593 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8x5jh" podStartSLOduration=124.45057469 podStartE2EDuration="2m4.45057469s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:51.448546362 +0000 UTC m=+146.391417326" watchObservedRunningTime="2025-10-02 10:53:51.45057469 +0000 UTC m=+146.393445644" Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.474239 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-8z2x2"] Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.485142 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-k8v6c" podStartSLOduration=124.485124603 podStartE2EDuration="2m4.485124603s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:51.483563902 +0000 UTC m=+146.426434866" watchObservedRunningTime="2025-10-02 10:53:51.485124603 +0000 UTC m=+146.427995547" Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.490233 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:51 crc kubenswrapper[4766]: E1002 10:53:51.490584 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:51.990571183 +0000 UTC m=+146.933442127 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.591045 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:51 crc kubenswrapper[4766]: E1002 10:53:51.591232 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:52.091206457 +0000 UTC m=+147.034077401 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.591322 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:51 crc kubenswrapper[4766]: E1002 10:53:51.591701 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:52.091687602 +0000 UTC m=+147.034558546 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.692497 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:51 crc kubenswrapper[4766]: E1002 10:53:51.692755 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:52.192715948 +0000 UTC m=+147.135586892 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.692883 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:51 crc kubenswrapper[4766]: E1002 10:53:51.693235 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:52.193222645 +0000 UTC m=+147.136093589 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:51 crc kubenswrapper[4766]: W1002 10:53:51.728640 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f1e68c3_4a75_4336_bc1c_a5167d57c28a.slice/crio-3c91dac7fe9a893015e48b96205dde5196188131a00de0e51a530b6c39010279 WatchSource:0}: Error finding container 3c91dac7fe9a893015e48b96205dde5196188131a00de0e51a530b6c39010279: Status 404 returned error can't find the container with id 3c91dac7fe9a893015e48b96205dde5196188131a00de0e51a530b6c39010279 Oct 02 10:53:51 crc kubenswrapper[4766]: W1002 10:53:51.728990 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod743fe4dc_299d_4f28_9448_644d12db4af7.slice/crio-49320c443f9559fe30eece47544ce8682d05fb4fe4ba80f2e119b9592284d790 WatchSource:0}: Error finding container 49320c443f9559fe30eece47544ce8682d05fb4fe4ba80f2e119b9592284d790: Status 404 returned error can't find the container with id 49320c443f9559fe30eece47544ce8682d05fb4fe4ba80f2e119b9592284d790 Oct 02 10:53:51 crc kubenswrapper[4766]: W1002 10:53:51.730826 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7440e0b_298a_4118_afc5_3c8fcb11eed7.slice/crio-2cc4501768e3d19d4703dce16b60d1352bc28946cd8366d4b292088afcea1f31 WatchSource:0}: Error finding container 2cc4501768e3d19d4703dce16b60d1352bc28946cd8366d4b292088afcea1f31: Status 404 returned error can't find the container with id 2cc4501768e3d19d4703dce16b60d1352bc28946cd8366d4b292088afcea1f31 Oct 02 10:53:51 crc kubenswrapper[4766]: W1002 10:53:51.737751 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod581ea4c4_072a_4bba_afc9_2f82918ac0c9.slice/crio-4b5cdf54df5f023e04a3209ab1a317a8f42af073dc3cc27fdfa7c34c59f3f98d WatchSource:0}: Error finding container 4b5cdf54df5f023e04a3209ab1a317a8f42af073dc3cc27fdfa7c34c59f3f98d: Status 404 returned error can't find the container with id 4b5cdf54df5f023e04a3209ab1a317a8f42af073dc3cc27fdfa7c34c59f3f98d Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.794973 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:51 crc kubenswrapper[4766]: E1002 10:53:51.796816 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:52.296792565 +0000 UTC m=+147.239663509 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.798841 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:51 crc kubenswrapper[4766]: E1002 10:53:51.799294 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:52.299279977 +0000 UTC m=+147.242150921 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.900099 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:51 crc kubenswrapper[4766]: E1002 10:53:51.900257 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:52.40023666 +0000 UTC m=+147.343107614 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:51 crc kubenswrapper[4766]: I1002 10:53:51.900382 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:51 crc kubenswrapper[4766]: E1002 10:53:51.900733 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:52.400714076 +0000 UTC m=+147.343585020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.001932 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:52 crc kubenswrapper[4766]: E1002 10:53:52.002103 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:52.502081344 +0000 UTC m=+147.444952288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.003858 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:52 crc kubenswrapper[4766]: E1002 10:53:52.004202 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:52.504190253 +0000 UTC m=+147.447061197 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.104832 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:52 crc kubenswrapper[4766]: E1002 10:53:52.105213 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:52.605182788 +0000 UTC m=+147.548053732 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.179675 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4" event={"ID":"5102f73f-dc76-4e60-9ed8-cc12efc46860","Type":"ContainerStarted","Data":"034acafd0d51721db21729e19cf57aba8beb29054f173f70d8f7cf29b897ccd8"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.180998 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc" event={"ID":"5fcce406-28bd-4526-9a8e-fe2381ce20a2","Type":"ContainerStarted","Data":"5c0e73d16572a31f4027f4f006b42a7ca037349df2e13d0ff81c3c15cd318365"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.181949 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" event={"ID":"c4653f10-7eaa-450c-881b-e074e4038d2f","Type":"ContainerStarted","Data":"a2850e23602c75d8ea1a5b121a9108d0c8bd9a576566c0533e8ae366439f0fc7"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.182738 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lrsk8" event={"ID":"89c62d15-b27b-4722-95ec-9b9a76efa5d7","Type":"ContainerStarted","Data":"0418e8878cab79409f4539da8a981ade954a8e455f41e38801d56fb418eaa860"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.183594 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmttf" event={"ID":"dcbb0f16-6000-4d64-ab71-a61c1b3b7063","Type":"ContainerStarted","Data":"84befadf79d0955c8faf7c3663270bb43b09cddb99ec4128a09b1b86e7a2249b"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.184472 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" event={"ID":"a57c2a77-db59-4b73-b376-640de2af9a7e","Type":"ContainerStarted","Data":"46a2a604d643b7bb74a35fe0a4452af5c93be4f6b5c4422f48d4e14e4cd5e76b"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.185801 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x" event={"ID":"a7440e0b-298a-4118-afc5-3c8fcb11eed7","Type":"ContainerStarted","Data":"2cc4501768e3d19d4703dce16b60d1352bc28946cd8366d4b292088afcea1f31"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.186842 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82" event={"ID":"ac2bdbb7-a515-46ed-90c6-5fe5d141fba8","Type":"ContainerStarted","Data":"0b489ee6da66c3d3c44492715a21e168139129e4bf6cdfa216c9598d30094534"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.187668 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8z2x2" event={"ID":"581ea4c4-072a-4bba-afc9-2f82918ac0c9","Type":"ContainerStarted","Data":"4b5cdf54df5f023e04a3209ab1a317a8f42af073dc3cc27fdfa7c34c59f3f98d"}
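The recurring "driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers" above means kubelet has not yet seen that driver announce itself over the plugin-registration socket; the csi-hostpathplugin pod whose containers start in the PLEG events below is what eventually performs that registration. A minimal sketch (Go, stdlib only) of one way to confirm which drivers have registered on the node, assuming the default kubelet root dir /var/lib/kubelet and the <driver-name>-reg.sock naming used by node-driver-registrar (both are defaults, not something this log states):

    package main

    import (
    	"fmt"
    	"log"
    	"os"
    )

    func main() {
    	// Directory kubelet's plugin watcher scans for registration sockets
    	// (assumes --root-dir was left at the default /var/lib/kubelet).
    	const regDir = "/var/lib/kubelet/plugins_registry"
    	entries, err := os.ReadDir(regDir)
    	if err != nil {
    		log.Fatalf("cannot read %s: %v", regDir, err)
    	}
    	if len(entries) == 0 {
    		fmt.Println("no CSI drivers registered yet")
    	}
    	for _, e := range entries {
    		// node-driver-registrar typically creates
    		// <driver-name>-reg.sock here, e.g.
    		// kubevirt.io.hostpath-provisioner-reg.sock
    		fmt.Println(e.Name())
    	}
    }

Until a socket for kubevirt.io.hostpath-provisioner appears there, every mount and unmount that names this driver will keep failing exactly as above.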
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.188485 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" event={"ID":"d24e91e7-dadf-4b67-be1b-a945b1250017","Type":"ContainerStarted","Data":"fa36233076b97946cc538d98600057e8767c67b07eb04e728f5de6d612c85606"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.190403 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj" event={"ID":"786673c7-6fb8-4b0d-864a-ea29fa681de6","Type":"ContainerStarted","Data":"63770a5367985b6fe0ff2f30cb4dc41904c9440e475bf1b007b919167b121756"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.193360 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nkh48" event={"ID":"78152191-403f-476f-90ee-0342f60ba99c","Type":"ContainerStarted","Data":"ee42fb8a9c91e0e34283a0e045aa404d4f2fbf64809eae7fb6ce34041f44b5e0"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.195099 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" event={"ID":"d0980a54-cd9d-4daa-a5ac-7f86e447f646","Type":"ContainerStarted","Data":"7139d003d48b67444f57978c9282ff44d9cc29b1cd4c39f222b42cd8d28acb2d"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.197625 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xqcd7" event={"ID":"743fe4dc-299d-4f28-9448-644d12db4af7","Type":"ContainerStarted","Data":"49320c443f9559fe30eece47544ce8682d05fb4fe4ba80f2e119b9592284d790"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.199818 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6" event={"ID":"ff3698e3-0db7-4a46-8244-ec9486c9ed48","Type":"ContainerStarted","Data":"806a98d49b6120fe542941681d62b5833a22f199e7ca35d4473ebf9fda722c26"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.200618 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-q2szc" event={"ID":"6f1e68c3-4a75-4336-bc1c-a5167d57c28a","Type":"ContainerStarted","Data":"3c91dac7fe9a893015e48b96205dde5196188131a00de0e51a530b6c39010279"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.201586 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" event={"ID":"899ef710-299b-4178-850d-1da30747c924","Type":"ContainerStarted","Data":"05c0cf44bdbf012a0d063bd9f78a19e203b6ae444932989864f6101d03fc37e2"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.202551 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ttsj7" event={"ID":"1030ba1a-c14b-4091-8417-b2dcbd287b97","Type":"ContainerStarted","Data":"fa4c8660e18fb69721c5233212b882ba28b036d6864f85bd6b6b45de1c9b9555"}
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.204706 4766 patch_prober.go:28] interesting pod/console-operator-58897d9998-nm9mf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/readyz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body=
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.204728 4766 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qz9xl container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body=
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.204758 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-nm9mf" podUID="641e2cb7-16f3-4339-ace8-1a5d6b921841" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/readyz\": dial tcp 10.217.0.27:8443: connect: connection refused"
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.204772 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl" podUID="6d87787d-4605-4895-a5bd-a3820dd38fae" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused"
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.206285 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:52 crc kubenswrapper[4766]: E1002 10:53:52.206553 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:52.706541745 +0000 UTC m=+147.649412689 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.287879 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.287928 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.307025 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:52 crc kubenswrapper[4766]: E1002 10:53:52.307144 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:52.807128926 +0000 UTC m=+147.749999870 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.307299 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:52 crc kubenswrapper[4766]: E1002 10:53:52.307625 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:52.807609372 +0000 UTC m=+147.750480316 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.408865 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:52 crc kubenswrapper[4766]: E1002 10:53:52.408984 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:52.908962927 +0000 UTC m=+147.851833881 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
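The cadence of these entries is the volume reconciler re-listing the pending work roughly every 100ms while each attempt fails fast and records a fresh not-before deadline 500ms out, per the printed durationBeforeRetry. A minimal sketch of that gate, modelling only what the entries themselves show (kubelet's real bookkeeping lives in nestedpendingoperations and its backoff helpers, not in this toy):

    package main

    import (
    	"fmt"
    	"time"
    )

    // retryGate records, per operation key, the earliest time another
    // attempt is allowed - the "No retries permitted until ..." behaviour.
    type retryGate map[string]time.Time

    func (g retryGate) allowed(key string, now time.Time) bool {
    	return !now.Before(g[key]) // zero time => never failed, allowed
    }

    // fail pushes the next permitted attempt out by backoff
    // (a constant 500ms here, as printed in these log entries).
    func (g retryGate) fail(key string, now time.Time, backoff time.Duration) {
    	g[key] = now.Add(backoff)
    }

    func main() {
    	g := retryGate{}
    	key := "volume pvc-657094db / pod 8f668bae" // illustrative key only
    	t0 := time.Now()
    	g.fail(key, t0, 500*time.Millisecond)
    	fmt.Println(g.allowed(key, t0.Add(100*time.Millisecond))) // false: reconciler tick inside the window
    	fmt.Println(g.allowed(key, t0.Add(600*time.Millisecond))) // true: past durationBeforeRetry
    }

That is why the same pair of errors repeats twice per second for each of the two operations (the unmount for the old pod UID and the mount for the new image-registry pod) until the driver registers.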
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.409380 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:52 crc kubenswrapper[4766]: E1002 10:53:52.409649 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:52.90963403 +0000 UTC m=+147.852504974 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.510219 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:52 crc kubenswrapper[4766]: E1002 10:53:52.510364 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:53.010344465 +0000 UTC m=+147.953215419 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.510479 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:52 crc kubenswrapper[4766]: E1002 10:53:52.510791 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:53.01078157 +0000 UTC m=+147.953652514 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.557692 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl"
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.612070 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:52 crc kubenswrapper[4766]: E1002 10:53:52.612431 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:53.112413756 +0000 UTC m=+148.055284700 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.713351 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:52 crc kubenswrapper[4766]: E1002 10:53:52.714890 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:53.214870588 +0000 UTC m=+148.157741632 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.815177 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:52 crc kubenswrapper[4766]: E1002 10:53:52.815333 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:53.315312264 +0000 UTC m=+148.258183198 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.815615 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:52 crc kubenswrapper[4766]: E1002 10:53:52.816016 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:53.316001918 +0000 UTC m=+148.258872862 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:52 crc kubenswrapper[4766]: I1002 10:53:52.916399 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:52 crc kubenswrapper[4766]: E1002 10:53:52.916616 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:53.416592849 +0000 UTC m=+148.359463793 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.019172 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.020050 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:53.520024354 +0000 UTC m=+148.462895328 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.120892 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.121001 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:53.620980247 +0000 UTC m=+148.563851181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.121194 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.121464 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:53.621454663 +0000 UTC m=+148.564325607 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.208990 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qkm7n" event={"ID":"d3f4c49b-42ce-4922-98a0-aa0a23c2a0e3","Type":"ContainerStarted","Data":"82fb9556feb0952e69f44029108d22db01f6f473b821ca6ca9b8ddcbb984479c"}
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.214336 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" event={"ID":"afb77d71-ded6-4158-a3fe-461336cece71","Type":"ContainerStarted","Data":"8fcc5113edc86ad8750b351d0d36dc42a366e8f48a8f57b7b05495500f47c9ba"}
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.217084 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n" event={"ID":"c30c3a5c-a29e-48a7-b446-b68f9cce2742","Type":"ContainerStarted","Data":"9b07c49523fbfb30ac1b1e61f4ce90d6dc82933494b22b86a5c3590710e78c6c"}
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.218541 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w" event={"ID":"6e1d0411-55ac-4287-b19f-d6c46444434b","Type":"ContainerStarted","Data":"97508ff809b13c29c6396f5eb26bd67307feae07a07d39f45c1dbbbd19b07f92"}
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.219799 4766 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qz9xl container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused" start-of-body=
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.219853 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl" podUID="6d87787d-4605-4895-a5bd-a3820dd38fae" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.24:8443/healthz\": dial tcp 10.217.0.24:8443: connect: connection refused"
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.221933 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.222139 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:53.722115937 +0000 UTC m=+148.664986891 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.222194 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.222655 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:53.722641524 +0000 UTC m=+148.665512468 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.288307 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.288373 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.323699 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.323897 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:53.823862186 +0000 UTC m=+148.766733170 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.325074 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.325451 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:53.825434708 +0000 UTC m=+148.768305682 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.426357 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.426548 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:53.926521916 +0000 UTC m=+148.869392860 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.426762 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.427053 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:53.927044173 +0000 UTC m=+148.869915117 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.528238 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.528593 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.028551915 +0000 UTC m=+148.971422909 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.629720 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.630056 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.130044015 +0000 UTC m=+149.072914959 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.730367 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.730543 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.230493272 +0000 UTC m=+149.173364216 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.730726 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.731029 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.231018199 +0000 UTC m=+149.173889143 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.832310 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.832491 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.332469489 +0000 UTC m=+149.275340433 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.832645 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.832965 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.332956916 +0000 UTC m=+149.275827850 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.933535 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.933784 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.433746393 +0000 UTC m=+149.376617358 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:53 crc kubenswrapper[4766]: I1002 10:53:53.934016 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:53 crc kubenswrapper[4766]: E1002 10:53:53.934459 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.434442957 +0000 UTC m=+149.377313941 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.034942 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.035164 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.535138191 +0000 UTC m=+149.478009135 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.035331 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.035806 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.535789752 +0000 UTC m=+149.478660696 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.136649 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.136838 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.636811758 +0000 UTC m=+149.579682702 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.137114 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.137407 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.637394447 +0000 UTC m=+149.580265391 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.224415 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5" event={"ID":"fc982890-ee1e-4482-8c17-0c5b11583ce2","Type":"ContainerStarted","Data":"5b159b573b6e18e2fbf2c8254cc220d86da61edadea9e9bb3edef0530fb51049"}
Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.238138 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.238304 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.738280858 +0000 UTC m=+149.681151802 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.238350 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.238878 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.738867898 +0000 UTC m=+149.681738842 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.292111 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 10:53:54 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld
Oct 02 10:53:54 crc kubenswrapper[4766]: [+]process-running ok
Oct 02 10:53:54 crc kubenswrapper[4766]: healthz check failed
Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.292179 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.339546 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.339774 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.839751979 +0000 UTC m=+149.782622913 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.339859 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.340270 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.840254886 +0000 UTC m=+149.783125830 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
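Unlike the earlier connection-refused failures, the router's startup probe now reaches the endpoint and gets an HTTP 500 whose body lists each named sub-check with [+]/[-] markers. A minimal sketch of an aggregated health endpoint that produces this output shape (Go, stdlib; the check names are copied from the log, but the handler below is illustrative and not the router's actual implementation):

    package main

    import (
    	"fmt"
    	"log"
    	"net/http"
    )

    // A named health check, as aggregated by the endpoint below.
    type check struct {
    	name string
    	run  func() error
    }

    // healthz runs every check, emits one [+]/[-] line per check, and
    // returns HTTP 500 if any failed - the body seen in the probe above.
    func healthz(checks []check) http.HandlerFunc {
    	return func(w http.ResponseWriter, r *http.Request) {
    		body, failed := "", false
    		for _, c := range checks {
    			if err := c.run(); err != nil {
    				failed = true
    				body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
    			} else {
    				body += fmt.Sprintf("[+]%s ok\n", c.name)
    			}
    		}
    		if failed {
    			w.WriteHeader(http.StatusInternalServerError) // probe reports "statuscode: 500"
    			body += "healthz check failed\n"
    		}
    		fmt.Fprint(w, body)
    	}
    }

    func main() {
    	checks := []check{
    		{"backend-http", func() error { return fmt.Errorf("not ready") }},
    		{"has-synced", func() error { return fmt.Errorf("not ready") }},
    		{"process-running", func() error { return nil }},
    	}
    	http.HandleFunc("/healthz/ready", healthz(checks))
    	log.Fatal(http.ListenAndServe("localhost:1936", nil))
    }

Reaching this state is actually progress: the router process is up and serving its health endpoint, and the probe will pass once the remaining sub-checks (backend-http, has-synced) go green.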
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.432380 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.432666 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.440775 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.440907 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.940892489 +0000 UTC m=+149.883763433 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.440954 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.441236 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:54.94122849 +0000 UTC m=+149.884099434 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.541636 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.541769 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.041739028 +0000 UTC m=+149.984609972 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.541839 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.542233 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.042224493 +0000 UTC m=+149.985095437 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.643487 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.643690 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.143651983 +0000 UTC m=+150.086522937 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.643903 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.644347 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.144334545 +0000 UTC m=+150.087205659 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.744862 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.745020 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.244977198 +0000 UTC m=+150.187848142 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.745213 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.745568 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.245553048 +0000 UTC m=+150.188424002 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.847979 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.848791 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.348769816 +0000 UTC m=+150.291640760 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:54 crc kubenswrapper[4766]: I1002 10:53:54.950072 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:54 crc kubenswrapper[4766]: E1002 10:53:54.950438 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.450425192 +0000 UTC m=+150.393296136 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.051190 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:55 crc kubenswrapper[4766]: E1002 10:53:55.051408 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.551384146 +0000 UTC m=+150.494255090 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.051719 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:55 crc kubenswrapper[4766]: E1002 10:53:55.052066 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.552055687 +0000 UTC m=+150.494926671 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.152703 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:55 crc kubenswrapper[4766]: E1002 10:53:55.152842 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.652821964 +0000 UTC m=+150.595692908 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.153039 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:55 crc kubenswrapper[4766]: E1002 10:53:55.153400 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.653390554 +0000 UTC m=+150.596261508 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.236725 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" event={"ID":"a57c2a77-db59-4b73-b376-640de2af9a7e","Type":"ContainerStarted","Data":"fd592f4eaa72c7cc60d59e29049bd321e3843dbfd1355aa26fb9e593fcd2862a"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.239527 4766 generic.go:334] "Generic (PLEG): container finished" podID="55374426-4ab4-4ce6-a180-6f449961e26d" containerID="0a04c506402897196bec3d1c9399ff6435b6af6a793c53e707c66b381dc55426" exitCode=0 Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.239622 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5" event={"ID":"55374426-4ab4-4ce6-a180-6f449961e26d","Type":"ContainerDied","Data":"0a04c506402897196bec3d1c9399ff6435b6af6a793c53e707c66b381dc55426"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.241450 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5" event={"ID":"066a8b65-b3f1-42c3-a989-33409b41f8dc","Type":"ContainerStarted","Data":"acc9e76cdd643f291ae5876322e551eec1d2d64d9b4760e7c65d1a68178cd17e"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.243837 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-q2szc" event={"ID":"6f1e68c3-4a75-4336-bc1c-a5167d57c28a","Type":"ContainerStarted","Data":"e8900452857b6ec56d101f8b1f9d86c95cce10759338ab78896a54d8e6264345"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.245833 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq" event={"ID":"730c9c05-47c2-4c2a-9cdc-7e047cc2d6ae","Type":"ContainerStarted","Data":"b55019e5f97f209a29f595947ed78bf34abf7d7f6032123d74d3dc24fa537269"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.249608 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" event={"ID":"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3","Type":"ContainerStarted","Data":"3fe685de2f9bc7462033f161f9d3d996286798850bfdc13d60bc901660269fb1"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.251482 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qprrh" event={"ID":"2ef5cc8f-ef34-4fd1-8765-7f41500898e6","Type":"ContainerStarted","Data":"fcac2bef4adc382ef6b78881c9e2941d7f020e5d3d27757104498a1ee903084c"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.253361 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwrx" event={"ID":"df3eaaf0-38f3-4334-adaa-bcdc6b4409bc","Type":"ContainerStarted","Data":"8acbebac555df06140b3a599c85bea556e53ce6224ea063fc1ecc08708073e07"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.253764 4766 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:55 crc kubenswrapper[4766]: E1002 10:53:55.253939 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.753911822 +0000 UTC m=+150.696782796 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.256006 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82" event={"ID":"ac2bdbb7-a515-46ed-90c6-5fe5d141fba8","Type":"ContainerStarted","Data":"2bcbd1b7d3af5cb9b0acc0a8b429239e155b0cebd9b5e994fd346632848b439b"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.257623 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" event={"ID":"f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8","Type":"ContainerStarted","Data":"960e2778975993b5f59d0ec846ef79211279810609ffdef631cde09892dfb624"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.258234 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.259859 4766 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vqq6s container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" start-of-body= Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.259924 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" podUID="f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.261160 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc" event={"ID":"5fcce406-28bd-4526-9a8e-fe2381ce20a2","Type":"ContainerStarted","Data":"0ebda71dfc3299908e1eee020fecfe079739de6dabf981e996a8958ce97707ee"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.262591 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x" event={"ID":"a7440e0b-298a-4118-afc5-3c8fcb11eed7","Type":"ContainerStarted","Data":"8f39d886ea533e0db7880087a670e8311581a9ba366f6f7e6fbaf0610308fd03"} Oct 02 10:53:55 crc 
kubenswrapper[4766]: I1002 10:53:55.264887 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8z2x2" event={"ID":"581ea4c4-072a-4bba-afc9-2f82918ac0c9","Type":"ContainerStarted","Data":"346d18e5f3d1757e06ff483e7ccf11c59e3fe37ba77d78dcbcbf3627115e5afb"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.269916 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ttsj7" event={"ID":"1030ba1a-c14b-4091-8417-b2dcbd287b97","Type":"ContainerStarted","Data":"9417095179b65f555efc7885f28bb40f115d62fe8269c96d1075dad50dcbff0b"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.272812 4766 generic.go:334] "Generic (PLEG): container finished" podID="c4653f10-7eaa-450c-881b-e074e4038d2f" containerID="f67c6433c5bafe1525c33254cfcbb565fe53e277e0fea46dfd5a8d1abf2bb5ad" exitCode=0 Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.272889 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" event={"ID":"c4653f10-7eaa-450c-881b-e074e4038d2f","Type":"ContainerDied","Data":"f67c6433c5bafe1525c33254cfcbb565fe53e277e0fea46dfd5a8d1abf2bb5ad"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.275086 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4" event={"ID":"5102f73f-dc76-4e60-9ed8-cc12efc46860","Type":"ContainerStarted","Data":"90804535d9e337c1d68894674d2551c859ac573af1b051cd49d0f99aaec386ec"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.276750 4766 generic.go:334] "Generic (PLEG): container finished" podID="67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac" containerID="b4f92a7af89f82948375a99c653fe3353833585c19922ad38b613636d7b14f50" exitCode=0 Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.276815 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns" event={"ID":"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac","Type":"ContainerDied","Data":"b4f92a7af89f82948375a99c653fe3353833585c19922ad38b613636d7b14f50"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.278215 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" event={"ID":"d24e91e7-dadf-4b67-be1b-a945b1250017","Type":"ContainerStarted","Data":"767ea522b236dfacd8fcf1666d4426e2f68e1488d2ad20a2fb8be28ddaaaed8e"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.278492 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.279531 4766 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-m2kx2 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.42:6443/healthz\": dial tcp 10.217.0.42:6443: connect: connection refused" start-of-body= Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.279563 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" podUID="d24e91e7-dadf-4b67-be1b-a945b1250017" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.42:6443/healthz\": dial tcp 10.217.0.42:6443: connect: connection refused" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.281617 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/dns-default-zvx8j" event={"ID":"8fb9e589-a259-4ff9-9d1b-198b57fb179b","Type":"ContainerStarted","Data":"932f86e542ce07730ea6fc65814dd013d6ea77f1678ebd82ee34f50a2a8c19ce"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.281693 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-zvx8j" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.284308 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m" event={"ID":"30c9f678-ccef-4c4d-b172-c5853e15ddd4","Type":"ContainerStarted","Data":"c46a0aa46b56d56c5780891a8f4ba18d80ddba6ab27ec298ef20d1714b4c181f"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.285950 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nkh48" event={"ID":"78152191-403f-476f-90ee-0342f60ba99c","Type":"ContainerStarted","Data":"fc6cbacdd747d7e2334051b05f0dff506304cac8bf455f200745c68f6b593c32"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.288064 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" podStartSLOduration=128.288054213 podStartE2EDuration="2m8.288054213s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:55.287575377 +0000 UTC m=+150.230446321" watchObservedRunningTime="2025-10-02 10:53:55.288054213 +0000 UTC m=+150.230925157" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.290263 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:53:55 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Oct 02 10:53:55 crc kubenswrapper[4766]: [+]process-running ok Oct 02 10:53:55 crc kubenswrapper[4766]: healthz check failed Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.290605 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.292610 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xqcd7" event={"ID":"743fe4dc-299d-4f28-9448-644d12db4af7","Type":"ContainerStarted","Data":"e006c52aa1d853709aa0b8c51e1c5b7cf1bfc50b2b48193cf5b7f4c872431cbd"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.292955 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-xqcd7" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.294391 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.294551 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.296484 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" event={"ID":"d0980a54-cd9d-4daa-a5ac-7f86e447f646","Type":"ContainerStarted","Data":"5890d35b518447fa849728890c19ee94775ae437b93d908795cb79e5044b0274"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.296982 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.299741 4766 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-f2c9k container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.299784 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" podUID="d0980a54-cd9d-4daa-a5ac-7f86e447f646" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.300216 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-lrsk8" event={"ID":"89c62d15-b27b-4722-95ec-9b9a76efa5d7","Type":"ContainerStarted","Data":"2a8609adef1d3d8510a5c287a7548b7fdb55dfdc14753d0f9459d6895906a50d"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.303212 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmttf" event={"ID":"dcbb0f16-6000-4d64-ab71-a61c1b3b7063","Type":"ContainerStarted","Data":"2fa1e3627d72517c20d397f896612c32e12f999b845fd327bd18e091dd2127a0"} Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.319135 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" podStartSLOduration=128.319112422 podStartE2EDuration="2m8.319112422s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:55.31694831 +0000 UTC m=+150.259819264" watchObservedRunningTime="2025-10-02 10:53:55.319112422 +0000 UTC m=+150.261983366" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.356100 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:55 crc kubenswrapper[4766]: E1002 10:53:55.358980 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.858964232 +0000 UTC m=+150.801835256 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.394720 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-d5v2m" podStartSLOduration=128.394707055 podStartE2EDuration="2m8.394707055s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:55.393765734 +0000 UTC m=+150.336636678" watchObservedRunningTime="2025-10-02 10:53:55.394707055 +0000 UTC m=+150.337577999" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.451776 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-mltn4" podStartSLOduration=128.451760175 podStartE2EDuration="2m8.451760175s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:55.451057281 +0000 UTC m=+150.393928225" watchObservedRunningTime="2025-10-02 10:53:55.451760175 +0000 UTC m=+150.394631119" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.458920 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:55 crc kubenswrapper[4766]: E1002 10:53:55.459534 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:55.959492181 +0000 UTC m=+150.902363205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.479862 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zvx8j" podStartSLOduration=9.479845425 podStartE2EDuration="9.479845425s" podCreationTimestamp="2025-10-02 10:53:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:55.476798624 +0000 UTC m=+150.419669558" watchObservedRunningTime="2025-10-02 10:53:55.479845425 +0000 UTC m=+150.422716369" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.536068 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-7tk8w" podStartSLOduration=128.536048765 podStartE2EDuration="2m8.536048765s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:55.53284823 +0000 UTC m=+150.475719164" watchObservedRunningTime="2025-10-02 10:53:55.536048765 +0000 UTC m=+150.478919709" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.536796 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-8z2x2" podStartSLOduration=128.5367903 podStartE2EDuration="2m8.5367903s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:55.509157725 +0000 UTC m=+150.452028669" watchObservedRunningTime="2025-10-02 10:53:55.5367903 +0000 UTC m=+150.479661244" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.549536 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-qkm7n" podStartSLOduration=9.549517502 podStartE2EDuration="9.549517502s" podCreationTimestamp="2025-10-02 10:53:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:55.547327629 +0000 UTC m=+150.490198573" watchObservedRunningTime="2025-10-02 10:53:55.549517502 +0000 UTC m=+150.492388446" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.561284 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:55 crc kubenswrapper[4766]: E1002 10:53:55.561620 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-02 10:53:56.061608352 +0000 UTC m=+151.004479296 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.565745 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-xqcd7" podStartSLOduration=128.565726869 podStartE2EDuration="2m8.565726869s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:55.565221992 +0000 UTC m=+150.508092956" watchObservedRunningTime="2025-10-02 10:53:55.565726869 +0000 UTC m=+150.508597813" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.592636 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" podStartSLOduration=128.592620279 podStartE2EDuration="2m8.592620279s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:55.590174618 +0000 UTC m=+150.533045572" watchObservedRunningTime="2025-10-02 10:53:55.592620279 +0000 UTC m=+150.535491223" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.612619 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" podStartSLOduration=128.612603621 podStartE2EDuration="2m8.612603621s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:55.612361403 +0000 UTC m=+150.555232347" watchObservedRunningTime="2025-10-02 10:53:55.612603621 +0000 UTC m=+150.555474565" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.662061 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:55 crc kubenswrapper[4766]: E1002 10:53:55.662329 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:56.162299857 +0000 UTC m=+151.105170811 (durationBeforeRetry 500ms). 
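[Annotation] The pod_startup_latency_tracker records above report podStartSLOduration as the gap between podCreationTimestamp and the observed running time. Taking the packageserver record as the worked example, 10:51:47 to 10:53:55.288054213 is the logged 128.288054213 seconds, which is the same value printed as 2m8.288054213s in the podStartE2EDuration form. A small check of that arithmetic:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps from the packageserver startup-latency record.
	created, _ := time.Parse(time.RFC3339, "2025-10-02T10:51:47Z")
	observed, _ := time.Parse(time.RFC3339Nano, "2025-10-02T10:53:55.288054213Z")

	slo := observed.Sub(created)
	fmt.Println(slo)           // 2m8.288054213s  (podStartE2EDuration form)
	fmt.Println(slo.Seconds()) // 128.288054213   (podStartSLOduration form)
}

The zero-valued firstStartedPulling/lastFinishedPulling timestamps (0001-01-01) indicate no image pull was observed for these pods, so the whole duration is attributed to scheduling and container start.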
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.666372 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6" podStartSLOduration=128.666354501 podStartE2EDuration="2m8.666354501s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:55.642281793 +0000 UTC m=+150.585152747" watchObservedRunningTime="2025-10-02 10:53:55.666354501 +0000 UTC m=+150.609225445" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.691211 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-g6jjj" podStartSLOduration=128.691193184 podStartE2EDuration="2m8.691193184s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:55.6687517 +0000 UTC m=+150.611622644" watchObservedRunningTime="2025-10-02 10:53:55.691193184 +0000 UTC m=+150.634064128" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.692810 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-lrsk8" podStartSLOduration=9.692800217 podStartE2EDuration="9.692800217s" podCreationTimestamp="2025-10-02 10:53:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:55.687041047 +0000 UTC m=+150.629912011" watchObservedRunningTime="2025-10-02 10:53:55.692800217 +0000 UTC m=+150.635671161" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.701204 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qklg5" podStartSLOduration=128.701182674 podStartE2EDuration="2m8.701182674s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:55.70105911 +0000 UTC m=+150.643930064" watchObservedRunningTime="2025-10-02 10:53:55.701182674 +0000 UTC m=+150.644053618" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.723060 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-jmttf" podStartSLOduration=128.723041659 podStartE2EDuration="2m8.723041659s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:55.720337229 +0000 UTC m=+150.663208173" watchObservedRunningTime="2025-10-02 10:53:55.723041659 +0000 UTC m=+150.665912603" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.764066 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:55 crc kubenswrapper[4766]: E1002 10:53:55.764407 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:56.264394068 +0000 UTC m=+151.207265012 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.865106 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.865356 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.865400 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.865469 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.865517 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:55 crc kubenswrapper[4766]: E1002 10:53:55.866052 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:56.366021634 +0000 UTC m=+151.308892598 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.870652 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.874453 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.875595 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.890270 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.899132 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 02 10:53:55 crc kubenswrapper[4766]: I1002 10:53:55.967123 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:55 crc kubenswrapper[4766]: E1002 10:53:55.967629 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:56.467611617 +0000 UTC m=+151.410482561 (durationBeforeRetry 500ms). 
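[Annotation] Note the contrast in these records: the configmap, secret, and projected volumes (nginx-conf, networking-console-plugin-cert, the kube-api-access mounts) all reach MountVolume.SetUp succeeded because those plugins are compiled into the kubelet, while the CSI volume keeps waiting on out-of-process driver registration. That registration happens over a unix socket the driver's registrar places under the kubelet's plugin-registration directory; a sketch for inspecting it on the node, assuming the default kubelet root of /var/lib/kubelet (adjust if --root-dir differs):

package main

import (
	"fmt"
	"os"
)

func main() {
	// Default kubelet plugin-registration directory; a healthy
	// hostpath-provisioner would expose a registration socket here.
	const dir = "/var/lib/kubelet/plugins_registry"

	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read registry dir:", err)
		return
	}
	if len(entries) == 0 {
		fmt.Println("no registration sockets: kubelet has no CSI drivers to register")
	}
	for _, e := range entries {
		fmt.Println("registration socket:", e.Name())
	}
}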
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.009816 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.010426 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.087099 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:56 crc kubenswrapper[4766]: E1002 10:53:56.087344 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:56.587293812 +0000 UTC m=+151.530164756 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.154214 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.155181 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.160908 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.161137 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.168694 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.213420 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:56 crc kubenswrapper[4766]: E1002 10:53:56.213854 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:56.713837081 +0000 UTC m=+151.656708025 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.294721 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:53:56 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Oct 02 10:53:56 crc kubenswrapper[4766]: [+]process-running ok Oct 02 10:53:56 crc kubenswrapper[4766]: healthz check failed Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.294792 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.315769 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:56 crc kubenswrapper[4766]: E1002 10:53:56.315963 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:56.815932723 +0000 UTC m=+151.758803677 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.316026 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29fd48d9-df17-4809-9a86-69a825a837d7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"29fd48d9-df17-4809-9a86-69a825a837d7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.316221 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29fd48d9-df17-4809-9a86-69a825a837d7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"29fd48d9-df17-4809-9a86-69a825a837d7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.316287 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:56 crc kubenswrapper[4766]: E1002 10:53:56.316701 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:56.816684388 +0000 UTC m=+151.759555332 (durationBeforeRetry 500ms). 
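Each failed attempt is parked by the kubelet's pending-operations bookkeeping: "No retries permitted until <t> (durationBeforeRetry 500ms)" means the operation may not run again before t. The kubelet applies exponential backoff to these volume operations (the delay roughly doubles per consecutive failure, capped around two minutes); in this excerpt every attempt still reports the initial 500ms. A rough sketch of that bookkeeping, with illustrative names:

```go
// Hedged sketch of the backoff behind "durationBeforeRetry": after each
// failure the operation records when it may next run, doubling the wait
// up to a cap. Field and method names are illustrative, not kubelet's.
package main

import (
	"fmt"
	"time"
)

type expBackoff struct {
	lastErrorTime time.Time
	duration      time.Duration
}

func (b *expBackoff) update(initial, max time.Duration) {
	if b.duration == 0 {
		b.duration = initial
	} else if b.duration = 2 * b.duration; b.duration > max {
		b.duration = max
	}
	b.lastErrorTime = time.Now()
}

// safeToRetry is what the executor would consult before re-admitting
// the operation.
func (b *expBackoff) safeToRetry(now time.Time) bool {
	return now.After(b.lastErrorTime.Add(b.duration))
}

func main() {
	var b expBackoff
	for i := 0; i < 5; i++ {
		b.update(500*time.Millisecond, 2*time.Minute+2*time.Second)
		fmt.Printf("attempt %d failed; no retries permitted until %s (durationBeforeRetry %s)\n",
			i+1, b.lastErrorTime.Add(b.duration).Format(time.RFC3339Nano), b.duration)
	}
}
```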
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.330808 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"96771ed929ac4490412aa8cdc56bb9dc8565f204cee14962b70098532a5cdb5c"} Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.343035 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nkh48" event={"ID":"78152191-403f-476f-90ee-0342f60ba99c","Type":"ContainerStarted","Data":"8f86de83e7e70ae72e64d46633d2c7cd78736d38cd605e20966520c00ed7d8d3"} Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.355511 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ttsj7" event={"ID":"1030ba1a-c14b-4091-8417-b2dcbd287b97","Type":"ContainerStarted","Data":"364bcf3df103c5a678c42c4c23e5a30b3aa3a920bda66e479100ff9f5730ae51"} Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.368920 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n" event={"ID":"c30c3a5c-a29e-48a7-b446-b68f9cce2742","Type":"ContainerStarted","Data":"ed07891640b87a78aadd588e711e1a8e56e44750c0e85d3edf08250d1272a390"} Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.375452 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-nkh48" podStartSLOduration=129.375437184 podStartE2EDuration="2m9.375437184s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:56.374069978 +0000 UTC m=+151.316940922" watchObservedRunningTime="2025-10-02 10:53:56.375437184 +0000 UTC m=+151.318308128" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.388768 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" event={"ID":"e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3","Type":"ContainerStarted","Data":"cafa4ee72d108b04ea577c390e5cf15f67bf10ed89e130f78029785861bba100"} Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.389932 4766 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-vqq6s container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" start-of-body= Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.389977 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s" podUID="f3f9a4cc-a2e5-4ae7-a426-38580b56b8a8" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.391329 4766 patch_prober.go:28] 
interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.391364 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.391735 4766 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-f2c9k container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.391767 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" podUID="d0980a54-cd9d-4daa-a5ac-7f86e447f646" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.408625 4766 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-m2kx2 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.42:6443/healthz\": dial tcp 10.217.0.42:6443: connect: connection refused" start-of-body= Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.408669 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" podUID="d24e91e7-dadf-4b67-be1b-a945b1250017" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.42:6443/healthz\": dial tcp 10.217.0.42:6443: connect: connection refused" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.411819 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-zkc9n" podStartSLOduration=129.411801408 podStartE2EDuration="2m9.411801408s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:56.408566801 +0000 UTC m=+151.351437775" watchObservedRunningTime="2025-10-02 10:53:56.411801408 +0000 UTC m=+151.354672352" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.417713 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.417960 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29fd48d9-df17-4809-9a86-69a825a837d7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"29fd48d9-df17-4809-9a86-69a825a837d7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:53:56 crc 
kubenswrapper[4766]: I1002 10:53:56.418016 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29fd48d9-df17-4809-9a86-69a825a837d7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"29fd48d9-df17-4809-9a86-69a825a837d7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:53:56 crc kubenswrapper[4766]: E1002 10:53:56.418358 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:56.918342674 +0000 UTC m=+151.861213608 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.419618 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29fd48d9-df17-4809-9a86-69a825a837d7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"29fd48d9-df17-4809-9a86-69a825a837d7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.432560 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-ttsj7" podStartSLOduration=129.432541294 podStartE2EDuration="2m9.432541294s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:56.428489991 +0000 UTC m=+151.371360955" watchObservedRunningTime="2025-10-02 10:53:56.432541294 +0000 UTC m=+151.375412238" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.463003 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29fd48d9-df17-4809-9a86-69a825a837d7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"29fd48d9-df17-4809-9a86-69a825a837d7\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.464597 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" podStartSLOduration=129.464573786 podStartE2EDuration="2m9.464573786s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:56.46352423 +0000 UTC m=+151.406395174" watchObservedRunningTime="2025-10-02 10:53:56.464573786 +0000 UTC m=+151.407444730" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.495177 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-flmqc" podStartSLOduration=129.495158208 podStartE2EDuration="2m9.495158208s" podCreationTimestamp="2025-10-02 10:51:47 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:56.493268546 +0000 UTC m=+151.436139490" watchObservedRunningTime="2025-10-02 10:53:56.495158208 +0000 UTC m=+151.438029152" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.496596 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:53:56 crc kubenswrapper[4766]: W1002 10:53:56.516641 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-78f3b73d6e7e43e93f5f1613f5173d99929b58d65c4c4d41c29e188addeb9e2b WatchSource:0}: Error finding container 78f3b73d6e7e43e93f5f1613f5173d99929b58d65c4c4d41c29e188addeb9e2b: Status 404 returned error can't find the container with id 78f3b73d6e7e43e93f5f1613f5173d99929b58d65c4c4d41c29e188addeb9e2b Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.517407 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4dx82" podStartSLOduration=129.517382214 podStartE2EDuration="2m9.517382214s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:56.514385675 +0000 UTC m=+151.457256639" watchObservedRunningTime="2025-10-02 10:53:56.517382214 +0000 UTC m=+151.460253158" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.519717 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:56 crc kubenswrapper[4766]: E1002 10:53:56.524112 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:57.024096277 +0000 UTC m=+151.966967221 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.551039 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-9hwrx" podStartSLOduration=129.551011848 podStartE2EDuration="2m9.551011848s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:56.550604294 +0000 UTC m=+151.493475258" watchObservedRunningTime="2025-10-02 10:53:56.551011848 +0000 UTC m=+151.493882802" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.586995 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-q2szc" podStartSLOduration=129.586974419 podStartE2EDuration="2m9.586974419s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:56.585765648 +0000 UTC m=+151.528636612" watchObservedRunningTime="2025-10-02 10:53:56.586974419 +0000 UTC m=+151.529845363" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.621688 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:56 crc kubenswrapper[4766]: E1002 10:53:56.637049 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:57.137017926 +0000 UTC m=+152.079888870 (durationBeforeRetry 500ms). 
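The pod_startup_latency_tracker entries record the startup-SLO arithmetic: podStartSLOduration is the time from podCreationTimestamp to the observed running time, minus time spent pulling images; the pull timestamps here are the zero value (the images were already present), so the SLO duration equals the end-to-end duration. The dns-operator numbers from above check out:

```go
// Reproducing podStartSLOduration for dns-operator-744455d44c-nkh48:
// watchObservedRunningTime minus podCreationTimestamp, with no image
// pull time to subtract.
package main

import (
	"fmt"
	"time"
)

func main() {
	created, _ := time.Parse(time.RFC3339, "2025-10-02T10:51:47Z")
	running, _ := time.Parse(time.RFC3339Nano, "2025-10-02T10:53:56.375437184Z")
	fmt.Println(running.Sub(created)) // 2m9.375437184s, i.e. 129.375437184s
}
```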
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.637404 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:56 crc kubenswrapper[4766]: E1002 10:53:56.639244 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:57.139234709 +0000 UTC m=+152.082105653 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.640927 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-cpxbq" podStartSLOduration=129.640892635 podStartE2EDuration="2m9.640892635s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:56.608396048 +0000 UTC m=+151.551266992" watchObservedRunningTime="2025-10-02 10:53:56.640892635 +0000 UTC m=+151.583763579" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.725711 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x" podStartSLOduration=129.725689743 podStartE2EDuration="2m9.725689743s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:56.71806747 +0000 UTC m=+151.660938424" watchObservedRunningTime="2025-10-02 10:53:56.725689743 +0000 UTC m=+151.668560687" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.739198 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:56 crc kubenswrapper[4766]: E1002 10:53:56.740850 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:57.240818913 +0000 UTC m=+152.183689857 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.779012 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" podStartSLOduration=129.778990637 podStartE2EDuration="2m9.778990637s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:56.755594433 +0000 UTC m=+151.698465367" watchObservedRunningTime="2025-10-02 10:53:56.778990637 +0000 UTC m=+151.721861581" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.800189 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qprrh" podStartSLOduration=129.800168539 podStartE2EDuration="2m9.800168539s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:56.799773746 +0000 UTC m=+151.742644690" watchObservedRunningTime="2025-10-02 10:53:56.800168539 +0000 UTC m=+151.743039473" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.801094 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d59w5" podStartSLOduration=129.80108531 podStartE2EDuration="2m9.80108531s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:56.77636403 +0000 UTC m=+151.719234984" watchObservedRunningTime="2025-10-02 10:53:56.80108531 +0000 UTC m=+151.743956254" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.820979 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.845146 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:56 crc kubenswrapper[4766]: E1002 10:53:56.845495 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:57.34548063 +0000 UTC m=+152.288351574 (durationBeforeRetry 500ms). 
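"No sandbox for pod can be found. Need to start a new one" (util.go:30) and the "No ready sandbox" variant (util.go:48) are the two outcomes of the same check during pod sync: the kubelet inspects the pod's existing sandboxes and schedules a fresh one when there is none, or none is ready. An illustrative stand-in for the real CRI-backed logic:

```go
// Illustrative only; these types stand in for the real CRI structures.
package main

import "fmt"

type sandbox struct {
	id    string
	ready bool
}

// needsNewSandbox mirrors the decision: create a sandbox when the pod
// has none at all, or when none of the existing ones is ready.
func needsNewSandbox(existing []sandbox) bool {
	for _, s := range existing {
		if s.ready {
			return false
		}
	}
	return true
}

func main() {
	fmt.Println(needsNewSandbox(nil))                            // "No sandbox for pod can be found"
	fmt.Println(needsNewSandbox([]sandbox{{"209d8066", false}})) // "No ready sandbox for pod can be found"
	fmt.Println(needsNewSandbox([]sandbox{{"78f3b73d", true}}))  // reuse the running sandbox
}
```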
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.928916 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.947450 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.947806 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5zcf\" (UniqueName: \"kubernetes.io/projected/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-kube-api-access-n5zcf\") pod \"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac\" (UID: \"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac\") " Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.947884 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-secret-volume\") pod \"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac\" (UID: \"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac\") " Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.947939 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-config-volume\") pod \"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac\" (UID: \"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac\") " Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.948877 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-config-volume" (OuterVolumeSpecName: "config-volume") pod "67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac" (UID: "67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:53:56 crc kubenswrapper[4766]: E1002 10:53:56.949123 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:57.449097311 +0000 UTC m=+152.391968255 (durationBeforeRetry 500ms). 
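The alternating "MountVolume started" / "UnmountVolume started" pairs for the same PVC come from the volume manager's reconciler (the reconciler_common.go frames throughout this log): on each pass it mounts volumes the desired state wants but the actual state lacks, and unmounts volumes the actual state still holds for pods that are gone. Here the PVC must be torn down for the deleted pod 8f668bae-... and set up for the new image-registry pod 347022cb-..., and both legs keep failing on the unregistered driver. A toy version of that diff, with stand-in types:

```go
// Toy reconcile pass over desired vs. actual volume state; in the
// kubelet this runs roughly every 100ms. Types are illustrative only.
package main

import "fmt"

type volume struct{ name, pod string }

func reconcile(desired, actual map[string]volume) {
	for key, v := range desired {
		if _, mounted := actual[key]; !mounted {
			fmt.Printf("MountVolume started for volume %q pod %q\n", v.name, v.pod)
		}
	}
	for key, v := range actual {
		if _, wanted := desired[key]; !wanted {
			fmt.Printf("UnmountVolume started for volume %q pod %q\n", v.name, v.pod)
		}
	}
}

func main() {
	pvc := "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8"
	desired := map[string]volume{
		pvc + "/347022cb": {pvc, "image-registry-697d97f7c8-knkk5"},
	}
	actual := map[string]volume{
		pvc + "/8f668bae": {pvc, "8f668bae-612b-4b75-9490-919e737c6a3b"},
	}
	reconcile(desired, actual) // one mount and one unmount per pass
}
```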
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.954245 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac" (UID: "67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:53:56 crc kubenswrapper[4766]: I1002 10:53:56.954874 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-kube-api-access-n5zcf" (OuterVolumeSpecName: "kube-api-access-n5zcf") pod "67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac" (UID: "67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac"). InnerVolumeSpecName "kube-api-access-n5zcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:53:56 crc kubenswrapper[4766]: W1002 10:53:56.973454 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod29fd48d9_df17_4809_9a86_69a825a837d7.slice/crio-3f7803d8c9b9215c70b897bddfa1bb4b8be76c37436f7703e60ce3475b48c460 WatchSource:0}: Error finding container 3f7803d8c9b9215c70b897bddfa1bb4b8be76c37436f7703e60ce3475b48c460: Status 404 returned error can't find the container with id 3f7803d8c9b9215c70b897bddfa1bb4b8be76c37436f7703e60ce3475b48c460 Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.032121 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 10:53:57 crc kubenswrapper[4766]: E1002 10:53:57.032370 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac" containerName="collect-profiles" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.032393 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac" containerName="collect-profiles" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.032538 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac" containerName="collect-profiles" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.032972 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.040171 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.041125 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.043955 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.049956 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.050098 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.050114 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.050127 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5zcf\" (UniqueName: \"kubernetes.io/projected/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac-kube-api-access-n5zcf\") on node \"crc\" DevicePath \"\"" Oct 02 10:53:57 crc kubenswrapper[4766]: E1002 10:53:57.051228 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:57.551211413 +0000 UTC m=+152.494082367 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.150661 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:57 crc kubenswrapper[4766]: E1002 10:53:57.150885 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:57.650852683 +0000 UTC m=+152.593723627 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.150952 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5699dbba-f019-4274-9521-7c77f73897bb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5699dbba-f019-4274-9521-7c77f73897bb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.151028 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:57 crc kubenswrapper[4766]: E1002 10:53:57.151276 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:57.651265116 +0000 UTC m=+152.594136060 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.151308 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5699dbba-f019-4274-9521-7c77f73897bb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5699dbba-f019-4274-9521-7c77f73897bb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.251829 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:57 crc kubenswrapper[4766]: E1002 10:53:57.252018 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:57.751972462 +0000 UTC m=+152.694843416 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.252278 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5699dbba-f019-4274-9521-7c77f73897bb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5699dbba-f019-4274-9521-7c77f73897bb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.252314 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5699dbba-f019-4274-9521-7c77f73897bb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5699dbba-f019-4274-9521-7c77f73897bb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.252340 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.252419 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5699dbba-f019-4274-9521-7c77f73897bb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5699dbba-f019-4274-9521-7c77f73897bb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:53:57 crc kubenswrapper[4766]: E1002 10:53:57.252612 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:57.752600472 +0000 UTC m=+152.695471406 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.290564 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:53:57 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Oct 02 10:53:57 crc kubenswrapper[4766]: [+]process-running ok Oct 02 10:53:57 crc kubenswrapper[4766]: healthz check failed Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.290651 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.305139 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5699dbba-f019-4274-9521-7c77f73897bb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5699dbba-f019-4274-9521-7c77f73897bb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.354113 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:57 crc kubenswrapper[4766]: E1002 10:53:57.354318 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:57.854285929 +0000 UTC m=+152.797156883 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.354443 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:53:57 crc kubenswrapper[4766]: E1002 10:53:57.354811 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:57.854799217 +0000 UTC m=+152.797670221 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.394460 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" event={"ID":"899ef710-299b-4178-850d-1da30747c924","Type":"ContainerStarted","Data":"d0e1a6dabac7aa2d619497d0edb6c9bdedaa064d6a1d2ff3012babc7a5a613c7"} Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.394494 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.395921 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4484ce8cec5b064e15ce9db1f7e740474b76530d2ceb1ecd4231e8638f85631f"} Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.395950 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"78f3b73d6e7e43e93f5f1613f5173d99929b58d65c4c4d41c29e188addeb9e2b"} Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.399457 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" event={"ID":"c4653f10-7eaa-450c-881b-e074e4038d2f","Type":"ContainerStarted","Data":"2a926e484d1c6389d07a49306435f790a31e491954354aee8acf5d4185d8057f"} Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.400937 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d3515518af9e64aad63f015f01dfa1f87c2fe2ea2d4fe995895f732a74fe35e5"} Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.400981 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"efd71a0182c719c406775d2ccbc757f92757b954d00bdca75a34c784a465fefe"} Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.401151 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.403223 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns" event={"ID":"67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac","Type":"ContainerDied","Data":"209d8066f862f184c1745c95f5ad6e9fc936dacc79685e924f5cd49f38d0f554"} Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.403253 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.403272 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="209d8066f862f184c1745c95f5ad6e9fc936dacc79685e924f5cd49f38d0f554" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.405011 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5" event={"ID":"55374426-4ab4-4ce6-a180-6f449961e26d","Type":"ContainerStarted","Data":"0a15992a7aeb9a67024d2cc6b87b2528d1833a16656a79e8df3e1f0a9bb907f2"} Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.405226 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5" Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.406413 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"42d6b5392c954801d8bb4e4cc390f5561c09c079ed1a675fca5f93bf05e548c1"} Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.408100 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"29fd48d9-df17-4809-9a86-69a825a837d7","Type":"ContainerStarted","Data":"3f7803d8c9b9215c70b897bddfa1bb4b8be76c37436f7703e60ce3475b48c460"} Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.457783 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:53:57 crc kubenswrapper[4766]: E1002 10:53:57.458145 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:57.958127908 +0000 UTC m=+152.900998852 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.502988 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx" podStartSLOduration=130.502975424 podStartE2EDuration="2m10.502975424s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:57.501649879 +0000 UTC m=+152.444520833" watchObservedRunningTime="2025-10-02 10:53:57.502975424 +0000 UTC m=+152.445846358"
Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.513779 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k"
Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.530759 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5" podStartSLOduration=130.530739343 podStartE2EDuration="2m10.530739343s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:57.52821615 +0000 UTC m=+152.471087094" watchObservedRunningTime="2025-10-02 10:53:57.530739343 +0000 UTC m=+152.473610287"
Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.561344 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:57 crc kubenswrapper[4766]: E1002 10:53:57.565232 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:58.065209025 +0000 UTC m=+153.008079969 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.665138 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:57 crc kubenswrapper[4766]: E1002 10:53:57.665623 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:58.165605589 +0000 UTC m=+153.108476523 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.767395 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:57 crc kubenswrapper[4766]: E1002 10:53:57.768206 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:58.268192766 +0000 UTC m=+153.211063710 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.804697 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.869009 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:57 crc kubenswrapper[4766]: E1002 10:53:57.869731 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:58.369711859 +0000 UTC m=+153.312582823 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:57 crc kubenswrapper[4766]: I1002 10:53:57.972764 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:57 crc kubenswrapper[4766]: E1002 10:53:57.973115 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:58.473098622 +0000 UTC m=+153.415969576 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.074081 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:58 crc kubenswrapper[4766]: E1002 10:53:58.074235 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:58.574211511 +0000 UTC m=+153.517082455 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.074284 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:58 crc kubenswrapper[4766]: E1002 10:53:58.074714 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:58.574701147 +0000 UTC m=+153.517572091 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.175486 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:58 crc kubenswrapper[4766]: E1002 10:53:58.175626 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:58.675601779 +0000 UTC m=+153.618472713 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.176115 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:58 crc kubenswrapper[4766]: E1002 10:53:58.176497 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:58.676481567 +0000 UTC m=+153.619352511 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.277585 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:58 crc kubenswrapper[4766]: E1002 10:53:58.277928 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:58.777909617 +0000 UTC m=+153.720780561 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.290865 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 10:53:58 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld
Oct 02 10:53:58 crc kubenswrapper[4766]: [+]process-running ok
Oct 02 10:53:58 crc kubenswrapper[4766]: healthz check failed
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.290917 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.379040 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:58 crc kubenswrapper[4766]: E1002 10:53:58.379377 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:58.879362927 +0000 UTC m=+153.822233861 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.415228 4766 generic.go:334] "Generic (PLEG): container finished" podID="29fd48d9-df17-4809-9a86-69a825a837d7" containerID="9a180c6c5e4ff9042f8b1495fddfd0dc4c133e7462891640f024aa2f9fb3c331" exitCode=0
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.415561 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"29fd48d9-df17-4809-9a86-69a825a837d7","Type":"ContainerDied","Data":"9a180c6c5e4ff9042f8b1495fddfd0dc4c133e7462891640f024aa2f9fb3c331"}
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.417255 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5699dbba-f019-4274-9521-7c77f73897bb","Type":"ContainerStarted","Data":"f127673b2d6eafed362cf19bcc809bc2e8039f443d1dab5eb45074bb63cb020c"}
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.417280 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5699dbba-f019-4274-9521-7c77f73897bb","Type":"ContainerStarted","Data":"687f3af32040086a629ed34dd060105cf8e9a23cd32e48a74bf6eeb556160f79"}
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.451313 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.451298429 podStartE2EDuration="2.451298429s" podCreationTimestamp="2025-10-02 10:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:58.448764055 +0000 UTC m=+153.391634999" watchObservedRunningTime="2025-10-02 10:53:58.451298429 +0000 UTC m=+153.394169373"
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.480211 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:58 crc kubenswrapper[4766]: E1002 10:53:58.480551 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:58.980528997 +0000 UTC m=+153.923399941 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.582013 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:58 crc kubenswrapper[4766]: E1002 10:53:58.584075 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:59.084054415 +0000 UTC m=+154.026925359 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.683845 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:58 crc kubenswrapper[4766]: E1002 10:53:58.684064 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:59.184020826 +0000 UTC m=+154.126891780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.684114 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:58 crc kubenswrapper[4766]: E1002 10:53:58.684757 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:59.18473552 +0000 UTC m=+154.127606524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.785073 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:58 crc kubenswrapper[4766]: E1002 10:53:58.785239 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:59.285214867 +0000 UTC m=+154.228085811 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.785324 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:58 crc kubenswrapper[4766]: E1002 10:53:58.785653 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:59.285645381 +0000 UTC m=+154.228516325 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.865221 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.865626 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-qhc8r"
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.865727 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6"
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.870457 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qz9xl"
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.886495 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:58 crc kubenswrapper[4766]: E1002 10:53:58.886963 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:59.386934616 +0000 UTC m=+154.329805570 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.921677 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-nm9mf"
Oct 02 10:53:58 crc kubenswrapper[4766]: I1002 10:53:58.988286 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:58 crc kubenswrapper[4766]: E1002 10:53:58.990312 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:59.490300599 +0000 UTC m=+154.433171543 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.089228 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:59 crc kubenswrapper[4766]: E1002 10:53:59.090178 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:59.590161036 +0000 UTC m=+154.533031980 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.137728 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m8788"]
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.143489 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m8788"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.148229 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.167102 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m8788"]
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.190613 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:59 crc kubenswrapper[4766]: E1002 10:53:59.190951 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:59.690938543 +0000 UTC m=+154.633809487 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.286864 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-44mnp"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.291065 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 10:53:59 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld
Oct 02 10:53:59 crc kubenswrapper[4766]: [+]process-running ok
Oct 02 10:53:59 crc kubenswrapper[4766]: healthz check failed
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.291533 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.291574 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:59 crc kubenswrapper[4766]: E1002 10:53:59.291678 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:59.791655358 +0000 UTC m=+154.734526312 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.291743 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29lcx\" (UniqueName: \"kubernetes.io/projected/d00b5247-9e12-4202-ae31-20d454dfa183-kube-api-access-29lcx\") pod \"community-operators-m8788\" (UID: \"d00b5247-9e12-4202-ae31-20d454dfa183\") " pod="openshift-marketplace/community-operators-m8788"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.291848 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d00b5247-9e12-4202-ae31-20d454dfa183-utilities\") pod \"community-operators-m8788\" (UID: \"d00b5247-9e12-4202-ae31-20d454dfa183\") " pod="openshift-marketplace/community-operators-m8788"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.291873 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d00b5247-9e12-4202-ae31-20d454dfa183-catalog-content\") pod \"community-operators-m8788\" (UID: \"d00b5247-9e12-4202-ae31-20d454dfa183\") " pod="openshift-marketplace/community-operators-m8788"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.291963 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:59 crc kubenswrapper[4766]: E1002 10:53:59.292263 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:59.792248478 +0000 UTC m=+154.735119422 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.300451 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.310340 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.326530 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5mmm7"]
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.327794 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5mmm7"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.329490 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.354751 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-vqq6s"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.359048 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5mmm7"]
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.392987 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:59 crc kubenswrapper[4766]: E1002 10:53:59.393149 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:59.893113389 +0000 UTC m=+154.835984343 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.393386 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d00b5247-9e12-4202-ae31-20d454dfa183-utilities\") pod \"community-operators-m8788\" (UID: \"d00b5247-9e12-4202-ae31-20d454dfa183\") " pod="openshift-marketplace/community-operators-m8788"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.393424 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d00b5247-9e12-4202-ae31-20d454dfa183-catalog-content\") pod \"community-operators-m8788\" (UID: \"d00b5247-9e12-4202-ae31-20d454dfa183\") " pod="openshift-marketplace/community-operators-m8788"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.393462 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.393526 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29lcx\" (UniqueName: \"kubernetes.io/projected/d00b5247-9e12-4202-ae31-20d454dfa183-kube-api-access-29lcx\") pod \"community-operators-m8788\" (UID: \"d00b5247-9e12-4202-ae31-20d454dfa183\") " pod="openshift-marketplace/community-operators-m8788"
Oct 02 10:53:59 crc kubenswrapper[4766]: E1002 10:53:59.394402 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:59.89438684 +0000 UTC m=+154.837257854 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.395191 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d00b5247-9e12-4202-ae31-20d454dfa183-utilities\") pod \"community-operators-m8788\" (UID: \"d00b5247-9e12-4202-ae31-20d454dfa183\") " pod="openshift-marketplace/community-operators-m8788"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.395233 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d00b5247-9e12-4202-ae31-20d454dfa183-catalog-content\") pod \"community-operators-m8788\" (UID: \"d00b5247-9e12-4202-ae31-20d454dfa183\") " pod="openshift-marketplace/community-operators-m8788"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.425581 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" event={"ID":"899ef710-299b-4178-850d-1da30747c924","Type":"ContainerStarted","Data":"ede0042ca80dfc8477362f0c764f3bcd1c19568e2cc7a5ff688c4acd5f02c215"}
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.425631 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" event={"ID":"899ef710-299b-4178-850d-1da30747c924","Type":"ContainerStarted","Data":"f5bd76303ad81e422edf47d5383f2b80965d7daac1c9e8724276dbc96995d589"}
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.425644 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" event={"ID":"899ef710-299b-4178-850d-1da30747c924","Type":"ContainerStarted","Data":"5ac8afe3de2ab81de23d8d87c2bafc86e050caa28e78fe92b1364ce857f24342"}
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.433004 4766 generic.go:334] "Generic (PLEG): container finished" podID="5699dbba-f019-4274-9521-7c77f73897bb" containerID="f127673b2d6eafed362cf19bcc809bc2e8039f443d1dab5eb45074bb63cb020c" exitCode=0
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.433244 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5699dbba-f019-4274-9521-7c77f73897bb","Type":"ContainerDied","Data":"f127673b2d6eafed362cf19bcc809bc2e8039f443d1dab5eb45074bb63cb020c"}
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.443602 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29lcx\" (UniqueName: \"kubernetes.io/projected/d00b5247-9e12-4202-ae31-20d454dfa183-kube-api-access-29lcx\") pod \"community-operators-m8788\" (UID: \"d00b5247-9e12-4202-ae31-20d454dfa183\") " pod="openshift-marketplace/community-operators-m8788"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.466132 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.466181 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.466395 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.466410 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.473776 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m8788"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.494469 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:59 crc kubenswrapper[4766]: E1002 10:53:59.494607 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:53:59.994588058 +0000 UTC m=+154.937459002 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.494857 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-utilities\") pod \"certified-operators-5mmm7\" (UID: \"9a491a4e-eefa-4908-8e7b-1d5c3e67274c\") " pod="openshift-marketplace/certified-operators-5mmm7"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.494988 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dvrj\" (UniqueName: \"kubernetes.io/projected/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-kube-api-access-5dvrj\") pod \"certified-operators-5mmm7\" (UID: \"9a491a4e-eefa-4908-8e7b-1d5c3e67274c\") " pod="openshift-marketplace/certified-operators-5mmm7"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.495014 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-catalog-content\") pod \"certified-operators-5mmm7\" (UID: \"9a491a4e-eefa-4908-8e7b-1d5c3e67274c\") " pod="openshift-marketplace/certified-operators-5mmm7"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.495045 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:59 crc kubenswrapper[4766]: E1002 10:53:59.495345 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:53:59.995334113 +0000 UTC m=+154.938205057 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.522230 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-rbh2d" podStartSLOduration=13.522214974 podStartE2EDuration="13.522214974s" podCreationTimestamp="2025-10-02 10:53:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:53:59.519867276 +0000 UTC m=+154.462738220" watchObservedRunningTime="2025-10-02 10:53:59.522214974 +0000 UTC m=+154.465085918"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.555881 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.556566 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-87d2m"]
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.557410 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-87d2m"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.562945 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.570008 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-87d2m"]
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.597888 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.598239 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.598421 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dvrj\" (UniqueName: \"kubernetes.io/projected/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-kube-api-access-5dvrj\") pod \"certified-operators-5mmm7\" (UID: \"9a491a4e-eefa-4908-8e7b-1d5c3e67274c\") " pod="openshift-marketplace/certified-operators-5mmm7"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.598442 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-catalog-content\") pod \"certified-operators-5mmm7\" (UID: \"9a491a4e-eefa-4908-8e7b-1d5c3e67274c\") " pod="openshift-marketplace/certified-operators-5mmm7"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.598525 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-utilities\") pod \"certified-operators-5mmm7\" (UID: \"9a491a4e-eefa-4908-8e7b-1d5c3e67274c\") " pod="openshift-marketplace/certified-operators-5mmm7"
Oct 02 10:53:59 crc kubenswrapper[4766]: E1002 10:53:59.599578 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:54:00.099564085 +0000 UTC m=+155.042435029 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.600289 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-catalog-content\") pod \"certified-operators-5mmm7\" (UID: \"9a491a4e-eefa-4908-8e7b-1d5c3e67274c\") " pod="openshift-marketplace/certified-operators-5mmm7"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.601429 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-utilities\") pod \"certified-operators-5mmm7\" (UID: \"9a491a4e-eefa-4908-8e7b-1d5c3e67274c\") " pod="openshift-marketplace/certified-operators-5mmm7"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.609319 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.609352 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.639638 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dvrj\" (UniqueName: \"kubernetes.io/projected/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-kube-api-access-5dvrj\") pod \"certified-operators-5mmm7\" (UID: \"9a491a4e-eefa-4908-8e7b-1d5c3e67274c\") " pod="openshift-marketplace/certified-operators-5mmm7"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.644736 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.647860 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5mmm7"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.700552 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v48nk\" (UniqueName: \"kubernetes.io/projected/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-kube-api-access-v48nk\") pod \"community-operators-87d2m\" (UID: \"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f\") " pod="openshift-marketplace/community-operators-87d2m"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.700596 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-utilities\") pod \"community-operators-87d2m\" (UID: \"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f\") " pod="openshift-marketplace/community-operators-87d2m"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.700614 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-catalog-content\") pod \"community-operators-87d2m\" (UID: \"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f\") " pod="openshift-marketplace/community-operators-87d2m"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.700644 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:59 crc kubenswrapper[4766]: E1002 10:53:59.705169 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:54:00.205151752 +0000 UTC m=+155.148022776 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.731587 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rdxbh"]
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.732586 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rdxbh"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.733883 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.740986 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-7r26x"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.801570 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rdxbh"]
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.801892 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.802208 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwlcx\" (UniqueName: \"kubernetes.io/projected/47362ad1-8f17-4633-bf4f-6aff2ebda031-kube-api-access-vwlcx\") pod \"certified-operators-rdxbh\" (UID: \"47362ad1-8f17-4633-bf4f-6aff2ebda031\") " pod="openshift-marketplace/certified-operators-rdxbh"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.802252 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v48nk\" (UniqueName: \"kubernetes.io/projected/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-kube-api-access-v48nk\") pod \"community-operators-87d2m\" (UID: \"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f\") " pod="openshift-marketplace/community-operators-87d2m"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.802271 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-utilities\") pod \"community-operators-87d2m\" (UID: \"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f\") " pod="openshift-marketplace/community-operators-87d2m"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.802297 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-catalog-content\") pod \"community-operators-87d2m\" (UID: \"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f\") " pod="openshift-marketplace/community-operators-87d2m"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.802339 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47362ad1-8f17-4633-bf4f-6aff2ebda031-catalog-content\") pod \"certified-operators-rdxbh\" (UID: \"47362ad1-8f17-4633-bf4f-6aff2ebda031\") " pod="openshift-marketplace/certified-operators-rdxbh"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.802365 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47362ad1-8f17-4633-bf4f-6aff2ebda031-utilities\") pod \"certified-operators-rdxbh\" (UID: \"47362ad1-8f17-4633-bf4f-6aff2ebda031\") " pod="openshift-marketplace/certified-operators-rdxbh"
Oct 02 10:53:59 crc kubenswrapper[4766]: E1002 10:53:59.802489 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:54:00.302475144 +0000 UTC m=+155.245346088 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.803079 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-utilities\") pod \"community-operators-87d2m\" (UID: \"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f\") " pod="openshift-marketplace/community-operators-87d2m"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.803706 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-catalog-content\") pod \"community-operators-87d2m\" (UID: \"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f\") " pod="openshift-marketplace/community-operators-87d2m"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.847882 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-8z2x2"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.847919 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-8z2x2"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.850745 4766 patch_prober.go:28] interesting pod/console-f9d7485db-8z2x2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.40:8443/health\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body=
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.850804 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8z2x2" podUID="581ea4c4-072a-4bba-afc9-2f82918ac0c9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.40:8443/health\": dial tcp 10.217.0.40:8443: connect: connection refused"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.856845 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v48nk\" (UniqueName: \"kubernetes.io/projected/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-kube-api-access-v48nk\") pod \"community-operators-87d2m\" (UID: \"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f\") " pod="openshift-marketplace/community-operators-87d2m"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.905030 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.905088 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47362ad1-8f17-4633-bf4f-6aff2ebda031-catalog-content\") pod \"certified-operators-rdxbh\" (UID: \"47362ad1-8f17-4633-bf4f-6aff2ebda031\") " pod="openshift-marketplace/certified-operators-rdxbh"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.905117 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47362ad1-8f17-4633-bf4f-6aff2ebda031-utilities\") pod \"certified-operators-rdxbh\" (UID: \"47362ad1-8f17-4633-bf4f-6aff2ebda031\") " pod="openshift-marketplace/certified-operators-rdxbh"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.905170 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwlcx\" (UniqueName: \"kubernetes.io/projected/47362ad1-8f17-4633-bf4f-6aff2ebda031-kube-api-access-vwlcx\") pod \"certified-operators-rdxbh\" (UID: \"47362ad1-8f17-4633-bf4f-6aff2ebda031\") " pod="openshift-marketplace/certified-operators-rdxbh"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.906699 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47362ad1-8f17-4633-bf4f-6aff2ebda031-catalog-content\") pod \"certified-operators-rdxbh\" (UID: \"47362ad1-8f17-4633-bf4f-6aff2ebda031\") " pod="openshift-marketplace/certified-operators-rdxbh"
Oct 02 10:53:59 crc kubenswrapper[4766]: E1002 10:53:59.907011 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:54:00.406997566 +0000 UTC m=+155.349868500 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.907996 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47362ad1-8f17-4633-bf4f-6aff2ebda031-utilities\") pod \"certified-operators-rdxbh\" (UID: \"47362ad1-8f17-4633-bf4f-6aff2ebda031\") " pod="openshift-marketplace/certified-operators-rdxbh"
Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.910559 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-87d2m" Oct 02 10:53:59 crc kubenswrapper[4766]: I1002 10:53:59.946963 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwlcx\" (UniqueName: \"kubernetes.io/projected/47362ad1-8f17-4633-bf4f-6aff2ebda031-kube-api-access-vwlcx\") pod \"certified-operators-rdxbh\" (UID: \"47362ad1-8f17-4633-bf4f-6aff2ebda031\") " pod="openshift-marketplace/certified-operators-rdxbh" Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.018206 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:54:00 crc kubenswrapper[4766]: E1002 10:54:00.018763 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:54:00.518737977 +0000 UTC m=+155.461608921 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.019058 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:54:00 crc kubenswrapper[4766]: E1002 10:54:00.019339 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:54:00.519332836 +0000 UTC m=+155.462203780 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.109952 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.118865 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rdxbh" Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.119791 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:54:00 crc kubenswrapper[4766]: E1002 10:54:00.120241 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:54:00.620226428 +0000 UTC m=+155.563097372 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.221953 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29fd48d9-df17-4809-9a86-69a825a837d7-kube-api-access\") pod \"29fd48d9-df17-4809-9a86-69a825a837d7\" (UID: \"29fd48d9-df17-4809-9a86-69a825a837d7\") " Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.222133 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29fd48d9-df17-4809-9a86-69a825a837d7-kubelet-dir\") pod \"29fd48d9-df17-4809-9a86-69a825a837d7\" (UID: \"29fd48d9-df17-4809-9a86-69a825a837d7\") " Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.222325 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:54:00 crc kubenswrapper[4766]: E1002 10:54:00.222720 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:54:00.722708761 +0000 UTC m=+155.665579695 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.222880 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29fd48d9-df17-4809-9a86-69a825a837d7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "29fd48d9-df17-4809-9a86-69a825a837d7" (UID: "29fd48d9-df17-4809-9a86-69a825a837d7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.223717 4766 patch_prober.go:28] interesting pod/apiserver-76f77b778f-qhc8r container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 02 10:54:00 crc kubenswrapper[4766]: [+]log ok Oct 02 10:54:00 crc kubenswrapper[4766]: [+]etcd ok Oct 02 10:54:00 crc kubenswrapper[4766]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 02 10:54:00 crc kubenswrapper[4766]: [+]poststarthook/generic-apiserver-start-informers ok Oct 02 10:54:00 crc kubenswrapper[4766]: [+]poststarthook/max-in-flight-filter ok Oct 02 10:54:00 crc kubenswrapper[4766]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 02 10:54:00 crc kubenswrapper[4766]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 02 10:54:00 crc kubenswrapper[4766]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 02 10:54:00 crc kubenswrapper[4766]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 02 10:54:00 crc kubenswrapper[4766]: [+]poststarthook/project.openshift.io-projectcache ok Oct 02 10:54:00 crc kubenswrapper[4766]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 02 10:54:00 crc kubenswrapper[4766]: [+]poststarthook/openshift.io-startinformers ok Oct 02 10:54:00 crc kubenswrapper[4766]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 02 10:54:00 crc kubenswrapper[4766]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 02 10:54:00 crc kubenswrapper[4766]: livez check failed Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.223800 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" podUID="e28ea3b2-f91d-4f3a-89a3-c1d009b5caf3" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.232583 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29fd48d9-df17-4809-9a86-69a825a837d7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "29fd48d9-df17-4809-9a86-69a825a837d7" (UID: "29fd48d9-df17-4809-9a86-69a825a837d7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.298943 4766 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.310347 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:54:00 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Oct 02 10:54:00 crc kubenswrapper[4766]: [+]process-running ok Oct 02 10:54:00 crc kubenswrapper[4766]: healthz check failed Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.310415 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.327004 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.327284 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29fd48d9-df17-4809-9a86-69a825a837d7-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.327300 4766 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29fd48d9-df17-4809-9a86-69a825a837d7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 02 10:54:00 crc kubenswrapper[4766]: E1002 10:54:00.327364 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:54:00.827348667 +0000 UTC m=+155.770219611 (durationBeforeRetry 500ms). 
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.355789 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m8788"]
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.407752 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5mmm7"]
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.429361 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:54:00 crc kubenswrapper[4766]: E1002 10:54:00.429804 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:54:00.929789139 +0000 UTC m=+155.872660093 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.440847 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8788" event={"ID":"d00b5247-9e12-4202-ae31-20d454dfa183","Type":"ContainerStarted","Data":"52b1a5d7e063a6062e070a624f19129537cc73f4be3d001ea4433a82a61bae31"}
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.448181 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-87d2m"]
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.459145 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.460722 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"29fd48d9-df17-4809-9a86-69a825a837d7","Type":"ContainerDied","Data":"3f7803d8c9b9215c70b897bddfa1bb4b8be76c37436f7703e60ce3475b48c460"}
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.460763 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f7803d8c9b9215c70b897bddfa1bb4b8be76c37436f7703e60ce3475b48c460"
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.490206 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-7b6dx"
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.531262 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:54:00 crc kubenswrapper[4766]: E1002 10:54:00.531678 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:54:01.031663453 +0000 UTC m=+155.974534397 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.632538 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:54:00 crc kubenswrapper[4766]: E1002 10:54:00.635762 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:54:01.13574475 +0000 UTC m=+156.078615794 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.722116 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rdxbh"]
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.733454 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:54:00 crc kubenswrapper[4766]: E1002 10:54:00.733946 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:54:01.233931162 +0000 UTC m=+156.176802106 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.789058 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zvx8j"
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.837259 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:54:00 crc kubenswrapper[4766]: E1002 10:54:00.837698 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:54:01.337686158 +0000 UTC m=+156.280557102 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.872950 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 10:54:00 crc kubenswrapper[4766]: I1002 10:54:00.938337 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:54:00 crc kubenswrapper[4766]: E1002 10:54:00.939802 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-02 10:54:01.439785368 +0000 UTC m=+156.382656312 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.039845 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5699dbba-f019-4274-9521-7c77f73897bb-kubelet-dir\") pod \"5699dbba-f019-4274-9521-7c77f73897bb\" (UID: \"5699dbba-f019-4274-9521-7c77f73897bb\") "
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.039904 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5699dbba-f019-4274-9521-7c77f73897bb-kube-api-access\") pod \"5699dbba-f019-4274-9521-7c77f73897bb\" (UID: \"5699dbba-f019-4274-9521-7c77f73897bb\") "
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.040069 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5699dbba-f019-4274-9521-7c77f73897bb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5699dbba-f019-4274-9521-7c77f73897bb" (UID: "5699dbba-f019-4274-9521-7c77f73897bb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.040300 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.040340 4766 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5699dbba-f019-4274-9521-7c77f73897bb-kubelet-dir\") on node \"crc\" DevicePath \"\""
Oct 02 10:54:01 crc kubenswrapper[4766]: E1002 10:54:01.040620 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-02 10:54:01.540604747 +0000 UTC m=+156.483475731 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-knkk5" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.045636 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5699dbba-f019-4274-9521-7c77f73897bb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5699dbba-f019-4274-9521-7c77f73897bb" (UID: "5699dbba-f019-4274-9521-7c77f73897bb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.070284 4766 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-02T10:54:00.298977217Z","Handler":null,"Name":""}
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.083143 4766 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.083196 4766 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.140934 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.141176 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5699dbba-f019-4274-9521-7c77f73897bb-kube-api-access\") on node \"crc\" DevicePath \"\""
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.144882 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.244655 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.248828 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
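
Registration at 10:54:01.083196 unblocks both pending operations within roughly one reconciler pass: the TearDown for pod 8f668bae-612b-4b75-9490-919e737c6a3b finally succeeds at 10:54:01.144882, and the image-registry mount proceeds. The csi_attacher message just above is expected rather than an error: this driver does not advertise the STAGE_UNSTAGE_VOLUME capability, so the kubelet skips the device-staging step and goes straight to SetUp. A sketch, hand-written against the CSI spec's Go bindings, of the capability query behind that decision; only the socket path comes from the log, and running it requires access to the node's filesystem, so treat it as illustrative:

    package main

    import (
    	"context"
    	"fmt"
    	"time"

    	"github.com/container-storage-interface/spec/lib/go/csi"
    	"google.golang.org/grpc"
    )

    func main() {
    	// Socket path taken from the registration lines above (assumption:
    	// this is run on the node itself, with permission to open the socket).
    	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
    	defer cancel()
    	conn, err := grpc.DialContext(ctx,
    		"unix:///var/lib/kubelet/plugins/csi-hostpath/csi.sock",
    		grpc.WithInsecure(), grpc.WithBlock())
    	if err != nil {
    		panic(err)
    	}
    	defer conn.Close()

    	// NodeGetCapabilities reports which optional RPCs the plugin
    	// implements. When STAGE_UNSTAGE_VOLUME is absent, the kubelet's
    	// MountDevice step is a no-op, which is what the
    	// "Skipping MountDevice..." record above reflects.
    	resp, err := csi.NewNodeClient(conn).NodeGetCapabilities(ctx, &csi.NodeGetCapabilitiesRequest{})
    	if err != nil {
    		panic(err)
    	}
    	supportsStaging := false
    	for _, c := range resp.GetCapabilities() {
    		if c.GetRpc().GetType() == csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME {
    			supportsStaging = true
    		}
    	}
    	fmt.Println("STAGE_UNSTAGE_VOLUME supported:", supportsStaging)
    }
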
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.248868 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.268040 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-knkk5\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.289463 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 10:54:01 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld
Oct 02 10:54:01 crc kubenswrapper[4766]: [+]process-running ok
Oct 02 10:54:01 crc kubenswrapper[4766]: healthz check failed
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.289535 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.312895 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rgj4n"]
Oct 02 10:54:01 crc kubenswrapper[4766]: E1002 10:54:01.313117 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29fd48d9-df17-4809-9a86-69a825a837d7" containerName="pruner"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.313132 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="29fd48d9-df17-4809-9a86-69a825a837d7" containerName="pruner"
Oct 02 10:54:01 crc kubenswrapper[4766]: E1002 10:54:01.313142 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5699dbba-f019-4274-9521-7c77f73897bb" containerName="pruner"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.313149 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5699dbba-f019-4274-9521-7c77f73897bb" containerName="pruner"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.313457 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="29fd48d9-df17-4809-9a86-69a825a837d7" containerName="pruner"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.313472 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5699dbba-f019-4274-9521-7c77f73897bb" containerName="pruner"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.314275 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgj4n"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.316301 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.319448 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-6pxm5"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.326681 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgj4n"]
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.447708 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-utilities\") pod \"redhat-marketplace-rgj4n\" (UID: \"8e1a892b-5e21-4f62-9859-d4e4a8ef9623\") " pod="openshift-marketplace/redhat-marketplace-rgj4n"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.447751 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-catalog-content\") pod \"redhat-marketplace-rgj4n\" (UID: \"8e1a892b-5e21-4f62-9859-d4e4a8ef9623\") " pod="openshift-marketplace/redhat-marketplace-rgj4n"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.447776 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf7mr\" (UniqueName: \"kubernetes.io/projected/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-kube-api-access-lf7mr\") pod \"redhat-marketplace-rgj4n\" (UID: \"8e1a892b-5e21-4f62-9859-d4e4a8ef9623\") " pod="openshift-marketplace/redhat-marketplace-rgj4n"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.463991 4766 generic.go:334] "Generic (PLEG): container finished" podID="d00b5247-9e12-4202-ae31-20d454dfa183" containerID="436fbc50f5d0ba2b82408d9d0ec39e1bc35b58c695c138de51e0b4ac7ebfce97" exitCode=0
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.464065 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8788" event={"ID":"d00b5247-9e12-4202-ae31-20d454dfa183","Type":"ContainerDied","Data":"436fbc50f5d0ba2b82408d9d0ec39e1bc35b58c695c138de51e0b4ac7ebfce97"}
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.465054 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5699dbba-f019-4274-9521-7c77f73897bb","Type":"ContainerDied","Data":"687f3af32040086a629ed34dd060105cf8e9a23cd32e48a74bf6eeb556160f79"}
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.465084 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="687f3af32040086a629ed34dd060105cf8e9a23cd32e48a74bf6eeb556160f79"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.465101 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.465570 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.473592 4766 generic.go:334] "Generic (PLEG): container finished" podID="c6f0281f-0dee-4b1b-a4c0-35294ed0a88f" containerID="f37fb9acc461d9e5ee883e55a7c6d3545199f29015bab7e738e132c1d490134a" exitCode=0
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.473698 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87d2m" event={"ID":"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f","Type":"ContainerDied","Data":"f37fb9acc461d9e5ee883e55a7c6d3545199f29015bab7e738e132c1d490134a"}
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.473768 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87d2m" event={"ID":"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f","Type":"ContainerStarted","Data":"f243f1845022e6420a6dd625bbf9d9b9ec55dfbf40a96e2391582b12ee97cb09"}
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.475177 4766 generic.go:334] "Generic (PLEG): container finished" podID="47362ad1-8f17-4633-bf4f-6aff2ebda031" containerID="d4b2e7c962b0b17667a4e59b46020ec75ccdc337b5e18740df9dfd79b16f0b4e" exitCode=0
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.475234 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdxbh" event={"ID":"47362ad1-8f17-4633-bf4f-6aff2ebda031","Type":"ContainerDied","Data":"d4b2e7c962b0b17667a4e59b46020ec75ccdc337b5e18740df9dfd79b16f0b4e"}
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.475256 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdxbh" event={"ID":"47362ad1-8f17-4633-bf4f-6aff2ebda031","Type":"ContainerStarted","Data":"6796d79a97ef828db3032972c180b5c2fcce0cf9c0f0c5430e88fa07f292c999"}
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.477620 4766 generic.go:334] "Generic (PLEG): container finished" podID="9a491a4e-eefa-4908-8e7b-1d5c3e67274c" containerID="439e8cff8c1b745a07a992c762939d486cbf7f8d70fb2e55cb9de9b1c50b904d" exitCode=0
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.478059 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5mmm7" event={"ID":"9a491a4e-eefa-4908-8e7b-1d5c3e67274c","Type":"ContainerDied","Data":"439e8cff8c1b745a07a992c762939d486cbf7f8d70fb2e55cb9de9b1c50b904d"}
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.478091 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5mmm7" event={"ID":"9a491a4e-eefa-4908-8e7b-1d5c3e67274c","Type":"ContainerStarted","Data":"5ad6f96e85e84b91c8203a2743e9d76cefd71b6212d34488ac81f5857cd82f26"}
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.496541 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.548945 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-catalog-content\") pod \"redhat-marketplace-rgj4n\" (UID: \"8e1a892b-5e21-4f62-9859-d4e4a8ef9623\") " pod="openshift-marketplace/redhat-marketplace-rgj4n"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.548990 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf7mr\" (UniqueName: \"kubernetes.io/projected/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-kube-api-access-lf7mr\") pod \"redhat-marketplace-rgj4n\" (UID: \"8e1a892b-5e21-4f62-9859-d4e4a8ef9623\") " pod="openshift-marketplace/redhat-marketplace-rgj4n"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.549093 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-utilities\") pod \"redhat-marketplace-rgj4n\" (UID: \"8e1a892b-5e21-4f62-9859-d4e4a8ef9623\") " pod="openshift-marketplace/redhat-marketplace-rgj4n"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.549544 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-catalog-content\") pod \"redhat-marketplace-rgj4n\" (UID: \"8e1a892b-5e21-4f62-9859-d4e4a8ef9623\") " pod="openshift-marketplace/redhat-marketplace-rgj4n"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.550697 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-utilities\") pod \"redhat-marketplace-rgj4n\" (UID: \"8e1a892b-5e21-4f62-9859-d4e4a8ef9623\") " pod="openshift-marketplace/redhat-marketplace-rgj4n"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.577612 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf7mr\" (UniqueName: \"kubernetes.io/projected/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-kube-api-access-lf7mr\") pod \"redhat-marketplace-rgj4n\" (UID: \"8e1a892b-5e21-4f62-9859-d4e4a8ef9623\") " pod="openshift-marketplace/redhat-marketplace-rgj4n"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.678334 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-knkk5"]
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.709611 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vgvgz"]
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.711101 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.725087 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgvgz"]
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.770936 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgj4n"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.853349 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-catalog-content\") pod \"redhat-marketplace-vgvgz\" (UID: \"d03ba5d0-89cc-4be3-b1f5-38fe6f006332\") " pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.853403 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx88s\" (UniqueName: \"kubernetes.io/projected/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-kube-api-access-qx88s\") pod \"redhat-marketplace-vgvgz\" (UID: \"d03ba5d0-89cc-4be3-b1f5-38fe6f006332\") " pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.853423 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-utilities\") pod \"redhat-marketplace-vgvgz\" (UID: \"d03ba5d0-89cc-4be3-b1f5-38fe6f006332\") " pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.888988 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.933947 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgj4n"]
Oct 02 10:54:01 crc kubenswrapper[4766]: W1002 10:54:01.942498 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e1a892b_5e21_4f62_9859_d4e4a8ef9623.slice/crio-156d4bb23f641500df18203279e69b1581d53696c455e4c4ea8bdfebed4708db WatchSource:0}: Error finding container 156d4bb23f641500df18203279e69b1581d53696c455e4c4ea8bdfebed4708db: Status 404 returned error can't find the container with id 156d4bb23f641500df18203279e69b1581d53696c455e4c4ea8bdfebed4708db
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.953926 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx88s\" (UniqueName: \"kubernetes.io/projected/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-kube-api-access-qx88s\") pod \"redhat-marketplace-vgvgz\" (UID: \"d03ba5d0-89cc-4be3-b1f5-38fe6f006332\") " pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.953960 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-utilities\") pod \"redhat-marketplace-vgvgz\" (UID: \"d03ba5d0-89cc-4be3-b1f5-38fe6f006332\") " pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.954024 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-catalog-content\") pod \"redhat-marketplace-vgvgz\" (UID: \"d03ba5d0-89cc-4be3-b1f5-38fe6f006332\") " pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.954386 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-catalog-content\") pod \"redhat-marketplace-vgvgz\" (UID: \"d03ba5d0-89cc-4be3-b1f5-38fe6f006332\") " pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.954579 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-utilities\") pod \"redhat-marketplace-vgvgz\" (UID: \"d03ba5d0-89cc-4be3-b1f5-38fe6f006332\") " pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:54:01 crc kubenswrapper[4766]: I1002 10:54:01.973467 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx88s\" (UniqueName: \"kubernetes.io/projected/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-kube-api-access-qx88s\") pod \"redhat-marketplace-vgvgz\" (UID: \"d03ba5d0-89cc-4be3-b1f5-38fe6f006332\") " pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.038928 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.229304 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgvgz"]
Oct 02 10:54:02 crc kubenswrapper[4766]: W1002 10:54:02.239898 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd03ba5d0_89cc_4be3_b1f5_38fe6f006332.slice/crio-0848384d9552e695f730092ff71c49bb205f425448622f4ef922ca44031b9771 WatchSource:0}: Error finding container 0848384d9552e695f730092ff71c49bb205f425448622f4ef922ca44031b9771: Status 404 returned error can't find the container with id 0848384d9552e695f730092ff71c49bb205f425448622f4ef922ca44031b9771
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.292033 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Oct 02 10:54:02 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld
Oct 02 10:54:02 crc kubenswrapper[4766]: [+]process-running ok
Oct 02 10:54:02 crc kubenswrapper[4766]: healthz check failed
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.292089 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.313229 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-npvrp"]
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.314548 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-npvrp"
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.316490 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.321766 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-npvrp"]
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.461613 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4fgz\" (UniqueName: \"kubernetes.io/projected/bb5d66be-21fe-4237-b616-a8c4f41f5f14-kube-api-access-w4fgz\") pod \"redhat-operators-npvrp\" (UID: \"bb5d66be-21fe-4237-b616-a8c4f41f5f14\") " pod="openshift-marketplace/redhat-operators-npvrp"
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.461891 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb5d66be-21fe-4237-b616-a8c4f41f5f14-catalog-content\") pod \"redhat-operators-npvrp\" (UID: \"bb5d66be-21fe-4237-b616-a8c4f41f5f14\") " pod="openshift-marketplace/redhat-operators-npvrp"
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.461946 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb5d66be-21fe-4237-b616-a8c4f41f5f14-utilities\") pod \"redhat-operators-npvrp\" (UID: \"bb5d66be-21fe-4237-b616-a8c4f41f5f14\") " pod="openshift-marketplace/redhat-operators-npvrp"
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.487398 4766 generic.go:334] "Generic (PLEG): container finished" podID="8e1a892b-5e21-4f62-9859-d4e4a8ef9623" containerID="d56a833cb57739632b9f84c95b0b320e869c7253cf0bc96d26ffd8fa4d8647f9" exitCode=0
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.487477 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgj4n" event={"ID":"8e1a892b-5e21-4f62-9859-d4e4a8ef9623","Type":"ContainerDied","Data":"d56a833cb57739632b9f84c95b0b320e869c7253cf0bc96d26ffd8fa4d8647f9"}
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.487518 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgj4n" event={"ID":"8e1a892b-5e21-4f62-9859-d4e4a8ef9623","Type":"ContainerStarted","Data":"156d4bb23f641500df18203279e69b1581d53696c455e4c4ea8bdfebed4708db"}
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.490904 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" event={"ID":"347022cb-d24b-4f67-900e-c2b858cc49fc","Type":"ContainerStarted","Data":"172700260a37b1c421d758df5e909ea4af6599d94b9fea4c53fc2194ff417154"}
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.490949 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" event={"ID":"347022cb-d24b-4f67-900e-c2b858cc49fc","Type":"ContainerStarted","Data":"bf6be94adb900ecf84345c9e790c9c20168dac4044871e877ee7fe3fe3949cdb"}
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.491149 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-knkk5"
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.495076 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgvgz" event={"ID":"d03ba5d0-89cc-4be3-b1f5-38fe6f006332","Type":"ContainerStarted","Data":"0848384d9552e695f730092ff71c49bb205f425448622f4ef922ca44031b9771"}
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.518647 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" podStartSLOduration=135.518629694 podStartE2EDuration="2m15.518629694s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:54:02.518058995 +0000 UTC m=+157.460929939" watchObservedRunningTime="2025-10-02 10:54:02.518629694 +0000 UTC m=+157.461500638"
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.563697 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb5d66be-21fe-4237-b616-a8c4f41f5f14-catalog-content\") pod \"redhat-operators-npvrp\" (UID: \"bb5d66be-21fe-4237-b616-a8c4f41f5f14\") " pod="openshift-marketplace/redhat-operators-npvrp"
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.563747 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb5d66be-21fe-4237-b616-a8c4f41f5f14-utilities\") pod \"redhat-operators-npvrp\" (UID: \"bb5d66be-21fe-4237-b616-a8c4f41f5f14\") " pod="openshift-marketplace/redhat-operators-npvrp"
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.563780 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4fgz\" (UniqueName: \"kubernetes.io/projected/bb5d66be-21fe-4237-b616-a8c4f41f5f14-kube-api-access-w4fgz\") pod \"redhat-operators-npvrp\" (UID: \"bb5d66be-21fe-4237-b616-a8c4f41f5f14\") " pod="openshift-marketplace/redhat-operators-npvrp"
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.564149 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb5d66be-21fe-4237-b616-a8c4f41f5f14-catalog-content\") pod \"redhat-operators-npvrp\" (UID: \"bb5d66be-21fe-4237-b616-a8c4f41f5f14\") " pod="openshift-marketplace/redhat-operators-npvrp"
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.564219 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb5d66be-21fe-4237-b616-a8c4f41f5f14-utilities\") pod \"redhat-operators-npvrp\" (UID: \"bb5d66be-21fe-4237-b616-a8c4f41f5f14\") " pod="openshift-marketplace/redhat-operators-npvrp"
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.583058 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4fgz\" (UniqueName: \"kubernetes.io/projected/bb5d66be-21fe-4237-b616-a8c4f41f5f14-kube-api-access-w4fgz\") pod \"redhat-operators-npvrp\" (UID: \"bb5d66be-21fe-4237-b616-a8c4f41f5f14\") " pod="openshift-marketplace/redhat-operators-npvrp"
Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.690590 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-npvrp"
Need to start a new one" pod="openshift-marketplace/redhat-operators-npvrp" Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.713358 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-plr97"] Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.714676 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-plr97" Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.720117 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-plr97"] Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.867370 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bbf587f-0557-445b-ac38-0bf602f222a4-catalog-content\") pod \"redhat-operators-plr97\" (UID: \"9bbf587f-0557-445b-ac38-0bf602f222a4\") " pod="openshift-marketplace/redhat-operators-plr97" Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.868379 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bbf587f-0557-445b-ac38-0bf602f222a4-utilities\") pod \"redhat-operators-plr97\" (UID: \"9bbf587f-0557-445b-ac38-0bf602f222a4\") " pod="openshift-marketplace/redhat-operators-plr97" Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.868675 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mxrk\" (UniqueName: \"kubernetes.io/projected/9bbf587f-0557-445b-ac38-0bf602f222a4-kube-api-access-4mxrk\") pod \"redhat-operators-plr97\" (UID: \"9bbf587f-0557-445b-ac38-0bf602f222a4\") " pod="openshift-marketplace/redhat-operators-plr97" Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.969867 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mxrk\" (UniqueName: \"kubernetes.io/projected/9bbf587f-0557-445b-ac38-0bf602f222a4-kube-api-access-4mxrk\") pod \"redhat-operators-plr97\" (UID: \"9bbf587f-0557-445b-ac38-0bf602f222a4\") " pod="openshift-marketplace/redhat-operators-plr97" Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.972160 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bbf587f-0557-445b-ac38-0bf602f222a4-catalog-content\") pod \"redhat-operators-plr97\" (UID: \"9bbf587f-0557-445b-ac38-0bf602f222a4\") " pod="openshift-marketplace/redhat-operators-plr97" Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.972210 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bbf587f-0557-445b-ac38-0bf602f222a4-utilities\") pod \"redhat-operators-plr97\" (UID: \"9bbf587f-0557-445b-ac38-0bf602f222a4\") " pod="openshift-marketplace/redhat-operators-plr97" Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.973149 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bbf587f-0557-445b-ac38-0bf602f222a4-utilities\") pod \"redhat-operators-plr97\" (UID: \"9bbf587f-0557-445b-ac38-0bf602f222a4\") " pod="openshift-marketplace/redhat-operators-plr97" Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.973199 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9bbf587f-0557-445b-ac38-0bf602f222a4-catalog-content\") pod \"redhat-operators-plr97\" (UID: \"9bbf587f-0557-445b-ac38-0bf602f222a4\") " pod="openshift-marketplace/redhat-operators-plr97" Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.975959 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-npvrp"] Oct 02 10:54:02 crc kubenswrapper[4766]: I1002 10:54:02.989091 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mxrk\" (UniqueName: \"kubernetes.io/projected/9bbf587f-0557-445b-ac38-0bf602f222a4-kube-api-access-4mxrk\") pod \"redhat-operators-plr97\" (UID: \"9bbf587f-0557-445b-ac38-0bf602f222a4\") " pod="openshift-marketplace/redhat-operators-plr97" Oct 02 10:54:03 crc kubenswrapper[4766]: W1002 10:54:03.013131 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb5d66be_21fe_4237_b616_a8c4f41f5f14.slice/crio-8861784ca2964f106940eea6b527a24879741be6c29f5754e0d2f23724222b82 WatchSource:0}: Error finding container 8861784ca2964f106940eea6b527a24879741be6c29f5754e0d2f23724222b82: Status 404 returned error can't find the container with id 8861784ca2964f106940eea6b527a24879741be6c29f5754e0d2f23724222b82 Oct 02 10:54:03 crc kubenswrapper[4766]: I1002 10:54:03.048623 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-plr97" Oct 02 10:54:03 crc kubenswrapper[4766]: I1002 10:54:03.254369 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-plr97"] Oct 02 10:54:03 crc kubenswrapper[4766]: I1002 10:54:03.290047 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:54:03 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Oct 02 10:54:03 crc kubenswrapper[4766]: [+]process-running ok Oct 02 10:54:03 crc kubenswrapper[4766]: healthz check failed Oct 02 10:54:03 crc kubenswrapper[4766]: I1002 10:54:03.290112 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:54:03 crc kubenswrapper[4766]: W1002 10:54:03.299288 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bbf587f_0557_445b_ac38_0bf602f222a4.slice/crio-3d47c5f588ed00f17e6d5ca6c5705a59bfbc8ed8b2b50f60124080da5e1a6505 WatchSource:0}: Error finding container 3d47c5f588ed00f17e6d5ca6c5705a59bfbc8ed8b2b50f60124080da5e1a6505: Status 404 returned error can't find the container with id 3d47c5f588ed00f17e6d5ca6c5705a59bfbc8ed8b2b50f60124080da5e1a6505 Oct 02 10:54:03 crc kubenswrapper[4766]: I1002 10:54:03.502647 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plr97" event={"ID":"9bbf587f-0557-445b-ac38-0bf602f222a4","Type":"ContainerStarted","Data":"155679f7cb7fec90088d36a8def3561c8cd2dd13d044251ee13b97d2a39b69f0"} Oct 02 10:54:03 crc kubenswrapper[4766]: I1002 10:54:03.502685 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plr97" 
event={"ID":"9bbf587f-0557-445b-ac38-0bf602f222a4","Type":"ContainerStarted","Data":"3d47c5f588ed00f17e6d5ca6c5705a59bfbc8ed8b2b50f60124080da5e1a6505"} Oct 02 10:54:03 crc kubenswrapper[4766]: I1002 10:54:03.505629 4766 generic.go:334] "Generic (PLEG): container finished" podID="bb5d66be-21fe-4237-b616-a8c4f41f5f14" containerID="3f5b69a4b7de54aa4a8ace06d81018fa1e2eceb1155b3a97a7194dc4a9e583c1" exitCode=0 Oct 02 10:54:03 crc kubenswrapper[4766]: I1002 10:54:03.505722 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npvrp" event={"ID":"bb5d66be-21fe-4237-b616-a8c4f41f5f14","Type":"ContainerDied","Data":"3f5b69a4b7de54aa4a8ace06d81018fa1e2eceb1155b3a97a7194dc4a9e583c1"} Oct 02 10:54:03 crc kubenswrapper[4766]: I1002 10:54:03.505747 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npvrp" event={"ID":"bb5d66be-21fe-4237-b616-a8c4f41f5f14","Type":"ContainerStarted","Data":"8861784ca2964f106940eea6b527a24879741be6c29f5754e0d2f23724222b82"} Oct 02 10:54:03 crc kubenswrapper[4766]: I1002 10:54:03.509866 4766 generic.go:334] "Generic (PLEG): container finished" podID="d03ba5d0-89cc-4be3-b1f5-38fe6f006332" containerID="028e7e72b40d1a7994c8c1be2e91243e2ca4dab24b6004206c97e91a08ce5eb7" exitCode=0 Oct 02 10:54:03 crc kubenswrapper[4766]: I1002 10:54:03.510647 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgvgz" event={"ID":"d03ba5d0-89cc-4be3-b1f5-38fe6f006332","Type":"ContainerDied","Data":"028e7e72b40d1a7994c8c1be2e91243e2ca4dab24b6004206c97e91a08ce5eb7"} Oct 02 10:54:03 crc kubenswrapper[4766]: I1002 10:54:03.869804 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:54:03 crc kubenswrapper[4766]: I1002 10:54:03.910881 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-qhc8r" Oct 02 10:54:04 crc kubenswrapper[4766]: I1002 10:54:04.289136 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:54:04 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Oct 02 10:54:04 crc kubenswrapper[4766]: [+]process-running ok Oct 02 10:54:04 crc kubenswrapper[4766]: healthz check failed Oct 02 10:54:04 crc kubenswrapper[4766]: I1002 10:54:04.289219 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:54:04 crc kubenswrapper[4766]: I1002 10:54:04.520589 4766 generic.go:334] "Generic (PLEG): container finished" podID="9bbf587f-0557-445b-ac38-0bf602f222a4" containerID="155679f7cb7fec90088d36a8def3561c8cd2dd13d044251ee13b97d2a39b69f0" exitCode=0 Oct 02 10:54:04 crc kubenswrapper[4766]: I1002 10:54:04.520673 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plr97" event={"ID":"9bbf587f-0557-445b-ac38-0bf602f222a4","Type":"ContainerDied","Data":"155679f7cb7fec90088d36a8def3561c8cd2dd13d044251ee13b97d2a39b69f0"} Oct 02 10:54:05 crc kubenswrapper[4766]: I1002 10:54:05.289235 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:54:05 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Oct 02 10:54:05 crc kubenswrapper[4766]: [+]process-running ok Oct 02 10:54:05 crc kubenswrapper[4766]: healthz check failed Oct 02 10:54:05 crc kubenswrapper[4766]: I1002 10:54:05.289376 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:54:06 crc kubenswrapper[4766]: I1002 10:54:06.288431 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:54:06 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Oct 02 10:54:06 crc kubenswrapper[4766]: [+]process-running ok Oct 02 10:54:06 crc kubenswrapper[4766]: healthz check failed Oct 02 10:54:06 crc kubenswrapper[4766]: I1002 10:54:06.288522 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:54:07 crc kubenswrapper[4766]: I1002 10:54:07.300915 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:54:07 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Oct 02 10:54:07 crc kubenswrapper[4766]: [+]process-running ok Oct 02 10:54:07 crc kubenswrapper[4766]: healthz check failed Oct 02 10:54:07 crc kubenswrapper[4766]: I1002 10:54:07.300988 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:54:08 crc kubenswrapper[4766]: I1002 10:54:08.288682 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:54:08 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Oct 02 10:54:08 crc kubenswrapper[4766]: [+]process-running ok Oct 02 10:54:08 crc kubenswrapper[4766]: healthz check failed Oct 02 10:54:08 crc kubenswrapper[4766]: I1002 10:54:08.288744 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:54:09 crc kubenswrapper[4766]: I1002 10:54:09.286915 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs\") pod \"network-metrics-daemon-klg2z\" (UID: \"6d68573a-5250-4407-8631-2199a3de7e9e\") " 
pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:54:09 crc kubenswrapper[4766]: I1002 10:54:09.289818 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:54:09 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Oct 02 10:54:09 crc kubenswrapper[4766]: [+]process-running ok Oct 02 10:54:09 crc kubenswrapper[4766]: healthz check failed Oct 02 10:54:09 crc kubenswrapper[4766]: I1002 10:54:09.289869 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:54:09 crc kubenswrapper[4766]: I1002 10:54:09.293092 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d68573a-5250-4407-8631-2199a3de7e9e-metrics-certs\") pod \"network-metrics-daemon-klg2z\" (UID: \"6d68573a-5250-4407-8631-2199a3de7e9e\") " pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:54:09 crc kubenswrapper[4766]: I1002 10:54:09.462966 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Oct 02 10:54:09 crc kubenswrapper[4766]: I1002 10:54:09.463042 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Oct 02 10:54:09 crc kubenswrapper[4766]: I1002 10:54:09.463373 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Oct 02 10:54:09 crc kubenswrapper[4766]: I1002 10:54:09.463316 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Oct 02 10:54:09 crc kubenswrapper[4766]: I1002 10:54:09.494293 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-klg2z" Oct 02 10:54:09 crc kubenswrapper[4766]: I1002 10:54:09.837824 4766 patch_prober.go:28] interesting pod/console-f9d7485db-8z2x2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.40:8443/health\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Oct 02 10:54:09 crc kubenswrapper[4766]: I1002 10:54:09.837901 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-8z2x2" podUID="581ea4c4-072a-4bba-afc9-2f82918ac0c9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.40:8443/health\": dial tcp 10.217.0.40:8443: connect: connection refused" Oct 02 10:54:09 crc kubenswrapper[4766]: I1002 10:54:09.879842 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-klg2z"] Oct 02 10:54:10 crc kubenswrapper[4766]: I1002 10:54:10.289575 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:54:10 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Oct 02 10:54:10 crc kubenswrapper[4766]: [+]process-running ok Oct 02 10:54:10 crc kubenswrapper[4766]: healthz check failed Oct 02 10:54:10 crc kubenswrapper[4766]: I1002 10:54:10.289930 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:54:11 crc kubenswrapper[4766]: I1002 10:54:11.291911 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:54:11 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Oct 02 10:54:11 crc kubenswrapper[4766]: [+]process-running ok Oct 02 10:54:11 crc kubenswrapper[4766]: healthz check failed Oct 02 10:54:11 crc kubenswrapper[4766]: I1002 10:54:11.291996 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:54:12 crc kubenswrapper[4766]: I1002 10:54:12.288330 4766 patch_prober.go:28] interesting pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:54:12 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Oct 02 10:54:12 crc kubenswrapper[4766]: [+]process-running ok Oct 02 10:54:12 crc kubenswrapper[4766]: healthz check failed Oct 02 10:54:12 crc kubenswrapper[4766]: I1002 10:54:12.288400 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:54:13 crc kubenswrapper[4766]: I1002 10:54:13.291261 4766 patch_prober.go:28] interesting 
pod/router-default-5444994796-44mnp container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 02 10:54:13 crc kubenswrapper[4766]: [-]has-synced failed: reason withheld Oct 02 10:54:13 crc kubenswrapper[4766]: [+]process-running ok Oct 02 10:54:13 crc kubenswrapper[4766]: healthz check failed Oct 02 10:54:13 crc kubenswrapper[4766]: I1002 10:54:13.291595 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-44mnp" podUID="97952e7f-5262-40a4-8a14-0a881ce34703" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 10:54:14 crc kubenswrapper[4766]: I1002 10:54:14.289468 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:54:14 crc kubenswrapper[4766]: I1002 10:54:14.291765 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-44mnp" Oct 02 10:54:19 crc kubenswrapper[4766]: I1002 10:54:19.462870 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Oct 02 10:54:19 crc kubenswrapper[4766]: I1002 10:54:19.463220 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Oct 02 10:54:19 crc kubenswrapper[4766]: I1002 10:54:19.463265 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-xqcd7" Oct 02 10:54:19 crc kubenswrapper[4766]: I1002 10:54:19.462964 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Oct 02 10:54:19 crc kubenswrapper[4766]: I1002 10:54:19.463434 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Oct 02 10:54:19 crc kubenswrapper[4766]: I1002 10:54:19.463815 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Oct 02 10:54:19 crc kubenswrapper[4766]: I1002 10:54:19.463876 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Oct 02 10:54:19 crc kubenswrapper[4766]: I1002 10:54:19.463874 4766 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="download-server" containerStatusID={"Type":"cri-o","ID":"e006c52aa1d853709aa0b8c51e1c5b7cf1bfc50b2b48193cf5b7f4c872431cbd"} pod="openshift-console/downloads-7954f5f757-xqcd7" containerMessage="Container download-server failed liveness probe, will be restarted" Oct 02 10:54:19 crc kubenswrapper[4766]: I1002 10:54:19.463959 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" containerID="cri-o://e006c52aa1d853709aa0b8c51e1c5b7cf1bfc50b2b48193cf5b7f4c872431cbd" gracePeriod=2 Oct 02 10:54:19 crc kubenswrapper[4766]: I1002 10:54:19.845821 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:54:19 crc kubenswrapper[4766]: I1002 10:54:19.854388 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 10:54:20 crc kubenswrapper[4766]: I1002 10:54:20.655524 4766 generic.go:334] "Generic (PLEG): container finished" podID="743fe4dc-299d-4f28-9448-644d12db4af7" containerID="e006c52aa1d853709aa0b8c51e1c5b7cf1bfc50b2b48193cf5b7f4c872431cbd" exitCode=0 Oct 02 10:54:20 crc kubenswrapper[4766]: I1002 10:54:20.655611 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xqcd7" event={"ID":"743fe4dc-299d-4f28-9448-644d12db4af7","Type":"ContainerDied","Data":"e006c52aa1d853709aa0b8c51e1c5b7cf1bfc50b2b48193cf5b7f4c872431cbd"} Oct 02 10:54:21 crc kubenswrapper[4766]: I1002 10:54:21.503676 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:54:24 crc kubenswrapper[4766]: I1002 10:54:24.432605 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 10:54:24 crc kubenswrapper[4766]: I1002 10:54:24.433034 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 10:54:25 crc kubenswrapper[4766]: W1002 10:54:25.975880 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d68573a_5250_4407_8631_2199a3de7e9e.slice/crio-5947ea554daa46048377023786b468a610bf1251b3467b22970a7b6eb55bccd8 WatchSource:0}: Error finding container 5947ea554daa46048377023786b468a610bf1251b3467b22970a7b6eb55bccd8: Status 404 returned error can't find the container with id 5947ea554daa46048377023786b468a610bf1251b3467b22970a7b6eb55bccd8 Oct 02 10:54:26 crc kubenswrapper[4766]: I1002 10:54:26.018345 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 02 10:54:26 crc kubenswrapper[4766]: I1002 10:54:26.684781 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-klg2z" 
event={"ID":"6d68573a-5250-4407-8631-2199a3de7e9e","Type":"ContainerStarted","Data":"5947ea554daa46048377023786b468a610bf1251b3467b22970a7b6eb55bccd8"} Oct 02 10:54:28 crc kubenswrapper[4766]: I1002 10:54:28.867321 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-z5bg6" Oct 02 10:54:29 crc kubenswrapper[4766]: I1002 10:54:29.463013 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Oct 02 10:54:29 crc kubenswrapper[4766]: I1002 10:54:29.463059 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Oct 02 10:54:38 crc kubenswrapper[4766]: E1002 10:54:38.884745 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06\": context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 02 10:54:38 crc kubenswrapper[4766]: E1002 10:54:38.885414 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4fgz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-npvrp_openshift-marketplace(bb5d66be-21fe-4237-b616-a8c4f41f5f14): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06: Get 
\"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06\": context canceled" logger="UnhandledError" Oct 02 10:54:38 crc kubenswrapper[4766]: E1002 10:54:38.887406 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06: Get \\\"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06\\\": context canceled\"" pod="openshift-marketplace/redhat-operators-npvrp" podUID="bb5d66be-21fe-4237-b616-a8c4f41f5f14" Oct 02 10:54:38 crc kubenswrapper[4766]: E1002 10:54:38.897090 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage857072933/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 02 10:54:38 crc kubenswrapper[4766]: E1002 10:54:38.897287 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lf7mr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rgj4n_openshift-marketplace(8e1a892b-5e21-4f62-9859-d4e4a8ef9623): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage857072933/2\": happened during read: context canceled" logger="UnhandledError" Oct 02 10:54:38 crc kubenswrapper[4766]: E1002 10:54:38.898717 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file 
\\\"/var/tmp/container_images_storage857072933/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rgj4n" podUID="8e1a892b-5e21-4f62-9859-d4e4a8ef9623" Oct 02 10:54:39 crc kubenswrapper[4766]: I1002 10:54:39.464216 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Oct 02 10:54:39 crc kubenswrapper[4766]: I1002 10:54:39.464280 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Oct 02 10:54:40 crc kubenswrapper[4766]: E1002 10:54:40.176792 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 02 10:54:40 crc kubenswrapper[4766]: E1002 10:54:40.177107 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vwlcx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-rdxbh_openshift-marketplace(47362ad1-8f17-4633-bf4f-6aff2ebda031): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 10:54:40 crc kubenswrapper[4766]: E1002 10:54:40.178371 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-rdxbh" podUID="47362ad1-8f17-4633-bf4f-6aff2ebda031" Oct 02 10:54:42 crc 
kubenswrapper[4766]: E1002 10:54:42.034321 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-npvrp" podUID="bb5d66be-21fe-4237-b616-a8c4f41f5f14" Oct 02 10:54:42 crc kubenswrapper[4766]: E1002 10:54:42.034402 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rgj4n" podUID="8e1a892b-5e21-4f62-9859-d4e4a8ef9623" Oct 02 10:54:42 crc kubenswrapper[4766]: E1002 10:54:42.034685 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-rdxbh" podUID="47362ad1-8f17-4633-bf4f-6aff2ebda031" Oct 02 10:54:42 crc kubenswrapper[4766]: E1002 10:54:42.740588 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3304064129/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 02 10:54:42 crc kubenswrapper[4766]: E1002 10:54:42.740769 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4mxrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-plr97_openshift-marketplace(9bbf587f-0557-445b-ac38-0bf602f222a4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3304064129/2\": happened during read: context canceled" logger="UnhandledError" Oct 02 
10:54:42 crc kubenswrapper[4766]: E1002 10:54:42.742857 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage3304064129/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-operators-plr97" podUID="9bbf587f-0557-445b-ac38-0bf602f222a4" Oct 02 10:54:45 crc kubenswrapper[4766]: E1002 10:54:45.444306 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 02 10:54:45 crc kubenswrapper[4766]: E1002 10:54:45.444452 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5dvrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5mmm7_openshift-marketplace(9a491a4e-eefa-4908-8e7b-1d5c3e67274c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 10:54:45 crc kubenswrapper[4766]: E1002 10:54:45.445623 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5mmm7" podUID="9a491a4e-eefa-4908-8e7b-1d5c3e67274c" Oct 02 10:54:46 crc kubenswrapper[4766]: E1002 10:54:46.858757 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-plr97" podUID="9bbf587f-0557-445b-ac38-0bf602f222a4" Oct 02 10:54:46 crc kubenswrapper[4766]: 
E1002 10:54:46.859487 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5mmm7" podUID="9a491a4e-eefa-4908-8e7b-1d5c3e67274c" Oct 02 10:54:49 crc kubenswrapper[4766]: I1002 10:54:49.465730 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Oct 02 10:54:49 crc kubenswrapper[4766]: I1002 10:54:49.466099 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Oct 02 10:54:52 crc kubenswrapper[4766]: E1002 10:54:52.788768 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 02 10:54:52 crc kubenswrapper[4766]: E1002 10:54:52.789263 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v48nk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-87d2m_openshift-marketplace(c6f0281f-0dee-4b1b-a4c0-35294ed0a88f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 02 10:54:52 crc kubenswrapper[4766]: E1002 10:54:52.790533 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled\"" pod="openshift-marketplace/community-operators-87d2m" podUID="c6f0281f-0dee-4b1b-a4c0-35294ed0a88f"
Oct 02 10:54:54 crc kubenswrapper[4766]: I1002 10:54:54.432317 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 10:54:54 crc kubenswrapper[4766]: I1002 10:54:54.432383 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 10:54:54 crc kubenswrapper[4766]: I1002 10:54:54.432431 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx"
Oct 02 10:54:54 crc kubenswrapper[4766]: I1002 10:54:54.432970 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 10:54:54 crc kubenswrapper[4766]: I1002 10:54:54.433023 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65" gracePeriod=600
Oct 02 10:54:59 crc kubenswrapper[4766]: I1002 10:54:59.462583 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Oct 02 10:54:59 crc kubenswrapper[4766]: I1002 10:54:59.463079 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Oct 02 10:55:02 crc kubenswrapper[4766]: E1002 10:55:02.555610 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-87d2m" podUID="c6f0281f-0dee-4b1b-a4c0-35294ed0a88f"
Oct 02 10:55:02 crc kubenswrapper[4766]: I1002 10:55:02.887317 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65" exitCode=0
Oct 02 10:55:02 crc kubenswrapper[4766]: I1002 10:55:02.887408 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65"}
Oct 02 10:55:03 crc kubenswrapper[4766]: E1002 10:55:03.306846 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Oct 02 10:55:03 crc kubenswrapper[4766]: E1002 10:55:03.306994 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-29lcx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-m8788_openshift-marketplace(d00b5247-9e12-4202-ae31-20d454dfa183): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 02 10:55:03 crc kubenswrapper[4766]: E1002 10:55:03.308191 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-m8788" podUID="d00b5247-9e12-4202-ae31-20d454dfa183"
Oct 02 10:55:03 crc kubenswrapper[4766]: E1002 10:55:03.898443 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-m8788" podUID="d00b5247-9e12-4202-ae31-20d454dfa183"
Oct 02 10:55:04 crc kubenswrapper[4766]: E1002 10:55:04.228463 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Oct 02 10:55:04 crc kubenswrapper[4766]: E1002 10:55:04.229009 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qx88s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-vgvgz_openshift-marketplace(d03ba5d0-89cc-4be3-b1f5-38fe6f006332): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 02 10:55:04 crc kubenswrapper[4766]: E1002 10:55:04.230378 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-vgvgz" podUID="d03ba5d0-89cc-4be3-b1f5-38fe6f006332"
Oct 02 10:55:04 crc kubenswrapper[4766]: I1002 10:55:04.900988 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"769721c28661e517737abb6064d68fa4fa2746b13fc804a3158b4c035b7b61c8"}
Oct 02 10:55:04 crc kubenswrapper[4766]: I1002 10:55:04.904824 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xqcd7" event={"ID":"743fe4dc-299d-4f28-9448-644d12db4af7","Type":"ContainerStarted","Data":"d04ef9766254ae63ec977d6471d67e6773ddfe53ffea2bd840648923f1dd6e2c"}
Oct 02 10:55:04 crc kubenswrapper[4766]: I1002 10:55:04.905066 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-xqcd7"
Oct 02 10:55:04 crc kubenswrapper[4766]: I1002 10:55:04.905532 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Oct 02 10:55:04 crc kubenswrapper[4766]: I1002 10:55:04.905601 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Oct 02 10:55:04 crc kubenswrapper[4766]: I1002 10:55:04.906617 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-klg2z" event={"ID":"6d68573a-5250-4407-8631-2199a3de7e9e","Type":"ContainerStarted","Data":"e7ce0f58fd3672a315cd1555f3ddcb70a86af7f8f75cd0d5cb0d576f699c8783"}
Oct 02 10:55:04 crc kubenswrapper[4766]: I1002 10:55:04.906660 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-klg2z" event={"ID":"6d68573a-5250-4407-8631-2199a3de7e9e","Type":"ContainerStarted","Data":"2cb83242617287d6aa80b0c919c8186b0a431aa5f8bf230105b9ca0a8daf9598"}
Oct 02 10:55:04 crc kubenswrapper[4766]: I1002 10:55:04.935685 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-klg2z" podStartSLOduration=197.935664198 podStartE2EDuration="3m17.935664198s" podCreationTimestamp="2025-10-02 10:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:55:04.934117888 +0000 UTC m=+219.876988822" watchObservedRunningTime="2025-10-02 10:55:04.935664198 +0000 UTC m=+219.878535142"
Oct 02 10:55:05 crc kubenswrapper[4766]: E1002 10:55:05.277769 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-vgvgz" podUID="d03ba5d0-89cc-4be3-b1f5-38fe6f006332"
Oct 02 10:55:05 crc kubenswrapper[4766]: I1002 10:55:05.920401 4766 generic.go:334] "Generic (PLEG): container finished" podID="8e1a892b-5e21-4f62-9859-d4e4a8ef9623" containerID="142e0abee00e3d9ca8450600da3d8910177502f16c547d91c78efe9ee729e135" exitCode=0
Oct 02 10:55:05 crc kubenswrapper[4766]: I1002 10:55:05.920471 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgj4n" event={"ID":"8e1a892b-5e21-4f62-9859-d4e4a8ef9623","Type":"ContainerDied","Data":"142e0abee00e3d9ca8450600da3d8910177502f16c547d91c78efe9ee729e135"}
Oct 02 10:55:05 crc kubenswrapper[4766]: I1002 10:55:05.922612 4766 generic.go:334] "Generic (PLEG): container finished" podID="9a491a4e-eefa-4908-8e7b-1d5c3e67274c" containerID="3d2044096080bb4a32fd798df403dd5075aa5a145e243d00cd2ab9e540191d24" exitCode=0
Oct 02 10:55:05 crc kubenswrapper[4766]: I1002 10:55:05.922653 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5mmm7" event={"ID":"9a491a4e-eefa-4908-8e7b-1d5c3e67274c","Type":"ContainerDied","Data":"3d2044096080bb4a32fd798df403dd5075aa5a145e243d00cd2ab9e540191d24"}
Oct 02 10:55:05 crc kubenswrapper[4766]: I1002 10:55:05.926288 4766 generic.go:334] "Generic (PLEG): container finished" podID="47362ad1-8f17-4633-bf4f-6aff2ebda031" containerID="2345ec16e94e72ef17cbedf6135f96e4f9b99a11a236261716c64a21ad9ae6c9" exitCode=0
Oct 02 10:55:05 crc kubenswrapper[4766]: I1002 10:55:05.926397 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdxbh" event={"ID":"47362ad1-8f17-4633-bf4f-6aff2ebda031","Type":"ContainerDied","Data":"2345ec16e94e72ef17cbedf6135f96e4f9b99a11a236261716c64a21ad9ae6c9"}
Oct 02 10:55:05 crc kubenswrapper[4766]: I1002 10:55:05.928766 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Oct 02 10:55:05 crc kubenswrapper[4766]: I1002 10:55:05.928832 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Oct 02 10:55:06 crc kubenswrapper[4766]: I1002 10:55:06.935175 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdxbh" event={"ID":"47362ad1-8f17-4633-bf4f-6aff2ebda031","Type":"ContainerStarted","Data":"2640b57cff2f341df0bff427016688d951b2431d329d53c37f56b0a894edcfcf"}
Oct 02 10:55:06 crc kubenswrapper[4766]: I1002 10:55:06.938030 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgj4n" event={"ID":"8e1a892b-5e21-4f62-9859-d4e4a8ef9623","Type":"ContainerStarted","Data":"9b6a44c0fd03ae35e84e43af15c18cd0d0ec872ed1aac6bf92c020fe135f9051"}
Oct 02 10:55:06 crc kubenswrapper[4766]: I1002 10:55:06.941064 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5mmm7" event={"ID":"9a491a4e-eefa-4908-8e7b-1d5c3e67274c","Type":"ContainerStarted","Data":"9ccb90a5e883e26604eab9b292fe43ba1b9b0c8e93748b0179c798cf35c8b7e0"}
Oct 02 10:55:06 crc kubenswrapper[4766]: I1002 10:55:06.952202 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rdxbh" podStartSLOduration=2.765160607 podStartE2EDuration="1m7.95218313s" podCreationTimestamp="2025-10-02 10:53:59 +0000 UTC" firstStartedPulling="2025-10-02 10:54:01.476776701 +0000 UTC m=+156.419647645" lastFinishedPulling="2025-10-02 10:55:06.663799224 +0000 UTC m=+221.606670168" observedRunningTime="2025-10-02 10:55:06.952069556 +0000 UTC m=+221.894940520" watchObservedRunningTime="2025-10-02 10:55:06.95218313 +0000 UTC m=+221.895054074"
Oct 02 10:55:06 crc kubenswrapper[4766]: I1002 10:55:06.974075 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5mmm7" podStartSLOduration=2.92761741 podStartE2EDuration="1m7.974056548s" podCreationTimestamp="2025-10-02 10:53:59 +0000 UTC" firstStartedPulling="2025-10-02 10:54:01.479158791 +0000 UTC m=+156.422029735" lastFinishedPulling="2025-10-02 10:55:06.525597929 +0000 UTC m=+221.468468873" observedRunningTime="2025-10-02 10:55:06.973240101 +0000 UTC m=+221.916111045" watchObservedRunningTime="2025-10-02 10:55:06.974056548 +0000 UTC m=+221.916927492"
Oct 02 10:55:06 crc kubenswrapper[4766]: I1002 10:55:06.994196 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rgj4n" podStartSLOduration=2.113603445 podStartE2EDuration="1m5.994177399s" podCreationTimestamp="2025-10-02 10:54:01 +0000 UTC" firstStartedPulling="2025-10-02 10:54:02.491569668 +0000 UTC m=+157.434440602" lastFinishedPulling="2025-10-02 10:55:06.372143612 +0000 UTC m=+221.315014556" observedRunningTime="2025-10-02 10:55:06.99202655 +0000 UTC m=+221.934897504" watchObservedRunningTime="2025-10-02 10:55:06.994177399 +0000 UTC m=+221.937048343"
Oct 02 10:55:09 crc kubenswrapper[4766]: I1002 10:55:09.462622 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Oct 02 10:55:09 crc kubenswrapper[4766]: I1002 10:55:09.463151 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Oct 02 10:55:09 crc kubenswrapper[4766]: I1002 10:55:09.462711 4766 patch_prober.go:28] interesting pod/downloads-7954f5f757-xqcd7 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Oct 02 10:55:09 crc kubenswrapper[4766]: I1002 10:55:09.463444 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xqcd7" podUID="743fe4dc-299d-4f28-9448-644d12db4af7" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Oct 02 10:55:09 crc kubenswrapper[4766]: I1002 10:55:09.649175 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5mmm7"
Oct 02 10:55:09 crc kubenswrapper[4766]: I1002 10:55:09.649222 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5mmm7"
Oct 02 10:55:10 crc kubenswrapper[4766]: I1002 10:55:10.119568 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rdxbh"
Oct 02 10:55:10 crc kubenswrapper[4766]: I1002 10:55:10.119691 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rdxbh"
Oct 02 10:55:10 crc kubenswrapper[4766]: I1002 10:55:10.276330 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rdxbh"
Oct 02 10:55:10 crc kubenswrapper[4766]: I1002 10:55:10.278685 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5mmm7"
Oct 02 10:55:11 crc kubenswrapper[4766]: I1002 10:55:11.771063 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rgj4n"
Oct 02 10:55:11 crc kubenswrapper[4766]: I1002 10:55:11.771471 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rgj4n"
Oct 02 10:55:11 crc kubenswrapper[4766]: I1002 10:55:11.822951 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rgj4n"
Oct 02 10:55:12 crc kubenswrapper[4766]: I1002 10:55:12.012655 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rgj4n"
Oct 02 10:55:19 crc kubenswrapper[4766]: I1002 10:55:19.512427 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-xqcd7"
Oct 02 10:55:19 crc kubenswrapper[4766]: I1002 10:55:19.685611 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5mmm7"
Oct 02 10:55:19 crc kubenswrapper[4766]: E1002 10:55:19.849924 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Oct 02 10:55:19 crc kubenswrapper[4766]: E1002 10:55:19.850085 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4mxrk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-plr97_openshift-marketplace(9bbf587f-0557-445b-ac38-0bf602f222a4): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 02 10:55:19 crc kubenswrapper[4766]: E1002 10:55:19.851275 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-plr97" podUID="9bbf587f-0557-445b-ac38-0bf602f222a4"
Oct 02 10:55:20 crc kubenswrapper[4766]: I1002 10:55:20.156073 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rdxbh"
Oct 02 10:55:20 crc kubenswrapper[4766]: E1002 10:55:20.343788 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Oct 02 10:55:20 crc kubenswrapper[4766]: E1002 10:55:20.343934 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4fgz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-npvrp_openshift-marketplace(bb5d66be-21fe-4237-b616-a8c4f41f5f14): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Oct 02 10:55:20 crc kubenswrapper[4766]: E1002 10:55:20.345756 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-npvrp" podUID="bb5d66be-21fe-4237-b616-a8c4f41f5f14"
Oct 02 10:55:21 crc kubenswrapper[4766]: I1002 10:55:21.312186 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rdxbh"]
Oct 02 10:55:21 crc kubenswrapper[4766]: I1002 10:55:21.312391 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rdxbh" podUID="47362ad1-8f17-4633-bf4f-6aff2ebda031" containerName="registry-server" containerID="cri-o://2640b57cff2f341df0bff427016688d951b2431d329d53c37f56b0a894edcfcf" gracePeriod=2
Oct 02 10:55:22 crc kubenswrapper[4766]: I1002 10:55:22.024087 4766 generic.go:334] "Generic (PLEG): container finished" podID="47362ad1-8f17-4633-bf4f-6aff2ebda031" containerID="2640b57cff2f341df0bff427016688d951b2431d329d53c37f56b0a894edcfcf" exitCode=0
Oct 02 10:55:22 crc kubenswrapper[4766]: I1002 10:55:22.024163 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdxbh" event={"ID":"47362ad1-8f17-4633-bf4f-6aff2ebda031","Type":"ContainerDied","Data":"2640b57cff2f341df0bff427016688d951b2431d329d53c37f56b0a894edcfcf"}
Oct 02 10:55:30 crc kubenswrapper[4766]: E1002 10:55:30.120152 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2640b57cff2f341df0bff427016688d951b2431d329d53c37f56b0a894edcfcf is running failed: container process not found" containerID="2640b57cff2f341df0bff427016688d951b2431d329d53c37f56b0a894edcfcf" cmd=["grpc_health_probe","-addr=:50051"]
Oct 02 10:55:30 crc kubenswrapper[4766]: E1002 10:55:30.120983 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2640b57cff2f341df0bff427016688d951b2431d329d53c37f56b0a894edcfcf is running failed: container process not found" containerID="2640b57cff2f341df0bff427016688d951b2431d329d53c37f56b0a894edcfcf" cmd=["grpc_health_probe","-addr=:50051"]
Oct 02 10:55:30 crc kubenswrapper[4766]: E1002 10:55:30.121354 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2640b57cff2f341df0bff427016688d951b2431d329d53c37f56b0a894edcfcf is running failed: container process not found" containerID="2640b57cff2f341df0bff427016688d951b2431d329d53c37f56b0a894edcfcf" cmd=["grpc_health_probe","-addr=:50051"]
Oct 02 10:55:30 crc kubenswrapper[4766]: E1002 10:55:30.121389 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2640b57cff2f341df0bff427016688d951b2431d329d53c37f56b0a894edcfcf is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-rdxbh" podUID="47362ad1-8f17-4633-bf4f-6aff2ebda031" containerName="registry-server"
Oct 02 10:55:35 crc kubenswrapper[4766]: E1002 10:55:35.258087 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-npvrp" podUID="bb5d66be-21fe-4237-b616-a8c4f41f5f14"
Oct 02 10:55:35 crc kubenswrapper[4766]: E1002 10:55:35.258178 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-plr97" podUID="9bbf587f-0557-445b-ac38-0bf602f222a4"
Oct 02 10:55:35 crc kubenswrapper[4766]: I1002 10:55:35.321987 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rdxbh"
Oct 02 10:55:35 crc kubenswrapper[4766]: I1002 10:55:35.432558 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47362ad1-8f17-4633-bf4f-6aff2ebda031-catalog-content\") pod \"47362ad1-8f17-4633-bf4f-6aff2ebda031\" (UID: \"47362ad1-8f17-4633-bf4f-6aff2ebda031\") "
Oct 02 10:55:35 crc kubenswrapper[4766]: I1002 10:55:35.432629 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47362ad1-8f17-4633-bf4f-6aff2ebda031-utilities\") pod \"47362ad1-8f17-4633-bf4f-6aff2ebda031\" (UID: \"47362ad1-8f17-4633-bf4f-6aff2ebda031\") "
Oct 02 10:55:35 crc kubenswrapper[4766]: I1002 10:55:35.432719 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwlcx\" (UniqueName: \"kubernetes.io/projected/47362ad1-8f17-4633-bf4f-6aff2ebda031-kube-api-access-vwlcx\") pod \"47362ad1-8f17-4633-bf4f-6aff2ebda031\" (UID: \"47362ad1-8f17-4633-bf4f-6aff2ebda031\") "
Oct 02 10:55:35 crc kubenswrapper[4766]: I1002 10:55:35.433622 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47362ad1-8f17-4633-bf4f-6aff2ebda031-utilities" (OuterVolumeSpecName: "utilities") pod "47362ad1-8f17-4633-bf4f-6aff2ebda031" (UID: "47362ad1-8f17-4633-bf4f-6aff2ebda031"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 10:55:35 crc kubenswrapper[4766]: I1002 10:55:35.438201 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47362ad1-8f17-4633-bf4f-6aff2ebda031-kube-api-access-vwlcx" (OuterVolumeSpecName: "kube-api-access-vwlcx") pod "47362ad1-8f17-4633-bf4f-6aff2ebda031" (UID: "47362ad1-8f17-4633-bf4f-6aff2ebda031"). InnerVolumeSpecName "kube-api-access-vwlcx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 10:55:35 crc kubenswrapper[4766]: I1002 10:55:35.470112 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47362ad1-8f17-4633-bf4f-6aff2ebda031-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47362ad1-8f17-4633-bf4f-6aff2ebda031" (UID: "47362ad1-8f17-4633-bf4f-6aff2ebda031"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 10:55:35 crc kubenswrapper[4766]: I1002 10:55:35.534352 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47362ad1-8f17-4633-bf4f-6aff2ebda031-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 10:55:35 crc kubenswrapper[4766]: I1002 10:55:35.534398 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47362ad1-8f17-4633-bf4f-6aff2ebda031-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 10:55:35 crc kubenswrapper[4766]: I1002 10:55:35.534416 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwlcx\" (UniqueName: \"kubernetes.io/projected/47362ad1-8f17-4633-bf4f-6aff2ebda031-kube-api-access-vwlcx\") on node \"crc\" DevicePath \"\""
Oct 02 10:55:36 crc kubenswrapper[4766]: I1002 10:55:36.108260 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdxbh" event={"ID":"47362ad1-8f17-4633-bf4f-6aff2ebda031","Type":"ContainerDied","Data":"6796d79a97ef828db3032972c180b5c2fcce0cf9c0f0c5430e88fa07f292c999"}
Oct 02 10:55:36 crc kubenswrapper[4766]: I1002 10:55:36.108317 4766 scope.go:117] "RemoveContainer" containerID="2640b57cff2f341df0bff427016688d951b2431d329d53c37f56b0a894edcfcf"
Oct 02 10:55:36 crc kubenswrapper[4766]: I1002 10:55:36.108437 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rdxbh"
Oct 02 10:55:36 crc kubenswrapper[4766]: I1002 10:55:36.134918 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rdxbh"]
Oct 02 10:55:36 crc kubenswrapper[4766]: I1002 10:55:36.139191 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rdxbh"]
Oct 02 10:55:37 crc kubenswrapper[4766]: I1002 10:55:37.890321 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47362ad1-8f17-4633-bf4f-6aff2ebda031" path="/var/lib/kubelet/pods/47362ad1-8f17-4633-bf4f-6aff2ebda031/volumes"
Oct 02 10:55:40 crc kubenswrapper[4766]: I1002 10:55:40.777464 4766 scope.go:117] "RemoveContainer" containerID="2345ec16e94e72ef17cbedf6135f96e4f9b99a11a236261716c64a21ad9ae6c9"
Oct 02 10:55:43 crc kubenswrapper[4766]: I1002 10:55:43.791347 4766 scope.go:117] "RemoveContainer" containerID="d4b2e7c962b0b17667a4e59b46020ec75ccdc337b5e18740df9dfd79b16f0b4e"
Oct 02 10:55:51 crc kubenswrapper[4766]: I1002 10:55:51.226941 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plr97" event={"ID":"9bbf587f-0557-445b-ac38-0bf602f222a4","Type":"ContainerStarted","Data":"ba4dbfc05707ef9a2f2571d55d0dbfd589962266bd2bf058541e4b4cc5261579"}
Oct 02 10:55:51 crc kubenswrapper[4766]: I1002 10:55:51.228182 4766 generic.go:334] "Generic (PLEG): container finished" podID="c6f0281f-0dee-4b1b-a4c0-35294ed0a88f" containerID="fe6c1258dd13fd66ce0596f7c508cf96a3ae1c84e90b668a6f901f9a0004fcd0" exitCode=0
Oct 02 10:55:51 crc kubenswrapper[4766]: I1002 10:55:51.228202 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87d2m" event={"ID":"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f","Type":"ContainerDied","Data":"fe6c1258dd13fd66ce0596f7c508cf96a3ae1c84e90b668a6f901f9a0004fcd0"}
Oct 02 10:55:51 crc kubenswrapper[4766]: I1002 10:55:51.230318 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npvrp" event={"ID":"bb5d66be-21fe-4237-b616-a8c4f41f5f14","Type":"ContainerStarted","Data":"bba5c2714ea0c09d4f94df2927e72d04c7b825f87186ce2896be3c54af73a983"}
Oct 02 10:55:51 crc kubenswrapper[4766]: I1002 10:55:51.234242 4766 generic.go:334] "Generic (PLEG): container finished" podID="d00b5247-9e12-4202-ae31-20d454dfa183" containerID="4429e69ebb39117a0a033ab82ef3eb817316a84166139debc63cfc685ebdb9d5" exitCode=0
Oct 02 10:55:51 crc kubenswrapper[4766]: I1002 10:55:51.234369 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8788" event={"ID":"d00b5247-9e12-4202-ae31-20d454dfa183","Type":"ContainerDied","Data":"4429e69ebb39117a0a033ab82ef3eb817316a84166139debc63cfc685ebdb9d5"}
Oct 02 10:55:51 crc kubenswrapper[4766]: I1002 10:55:51.236804 4766 generic.go:334] "Generic (PLEG): container finished" podID="d03ba5d0-89cc-4be3-b1f5-38fe6f006332" containerID="c003685b97187dbab1141f6909c3a3ce46d5548ae694b5f4c52921c8921ff704" exitCode=0
Oct 02 10:55:51 crc kubenswrapper[4766]: I1002 10:55:51.236855 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgvgz" event={"ID":"d03ba5d0-89cc-4be3-b1f5-38fe6f006332","Type":"ContainerDied","Data":"c003685b97187dbab1141f6909c3a3ce46d5548ae694b5f4c52921c8921ff704"}
Oct 02 10:55:52 crc kubenswrapper[4766]: I1002 10:55:52.243454 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8788" event={"ID":"d00b5247-9e12-4202-ae31-20d454dfa183","Type":"ContainerStarted","Data":"f30c9b5668ede69240d70ad01c553dd1d7f1bab01e6df14524ff6b38d9879631"}
Oct 02 10:55:52 crc kubenswrapper[4766]: I1002 10:55:52.248686 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgvgz" event={"ID":"d03ba5d0-89cc-4be3-b1f5-38fe6f006332","Type":"ContainerStarted","Data":"940f44f4a8f4eea2114540f9102bbf99a49f7b748fb696444c2830127a23bc18"}
Oct 02 10:55:52 crc kubenswrapper[4766]: I1002 10:55:52.251681 4766 generic.go:334] "Generic (PLEG): container finished" podID="9bbf587f-0557-445b-ac38-0bf602f222a4" containerID="ba4dbfc05707ef9a2f2571d55d0dbfd589962266bd2bf058541e4b4cc5261579" exitCode=0
Oct 02 10:55:52 crc kubenswrapper[4766]: I1002 10:55:52.251823 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plr97" event={"ID":"9bbf587f-0557-445b-ac38-0bf602f222a4","Type":"ContainerDied","Data":"ba4dbfc05707ef9a2f2571d55d0dbfd589962266bd2bf058541e4b4cc5261579"}
Oct 02 10:55:52 crc kubenswrapper[4766]: I1002 10:55:52.254224 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87d2m" event={"ID":"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f","Type":"ContainerStarted","Data":"ba7f3a91d849a30472ed7ee86a4b1a4b5274d9be3e6b45abe14fa2b9be476f99"}
Oct 02 10:55:52 crc kubenswrapper[4766]: I1002 10:55:52.257779 4766 generic.go:334] "Generic (PLEG): container finished" podID="bb5d66be-21fe-4237-b616-a8c4f41f5f14" containerID="bba5c2714ea0c09d4f94df2927e72d04c7b825f87186ce2896be3c54af73a983" exitCode=0
Oct 02 10:55:52 crc kubenswrapper[4766]: I1002 10:55:52.257827 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npvrp" event={"ID":"bb5d66be-21fe-4237-b616-a8c4f41f5f14","Type":"ContainerDied","Data":"bba5c2714ea0c09d4f94df2927e72d04c7b825f87186ce2896be3c54af73a983"}
Oct 02 10:55:52 crc kubenswrapper[4766]: I1002 10:55:52.291168 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m8788" podStartSLOduration=3.012346887 podStartE2EDuration="1m53.291144812s" podCreationTimestamp="2025-10-02 10:53:59 +0000 UTC" firstStartedPulling="2025-10-02 10:54:01.465265271 +0000 UTC m=+156.408136215" lastFinishedPulling="2025-10-02 10:55:51.744063196 +0000 UTC m=+266.686934140" observedRunningTime="2025-10-02 10:55:52.273210318 +0000 UTC m=+267.216081272" watchObservedRunningTime="2025-10-02 10:55:52.291144812 +0000 UTC m=+267.234015756"
Oct 02 10:55:52 crc kubenswrapper[4766]: I1002 10:55:52.309050 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vgvgz" podStartSLOduration=3.101072211 podStartE2EDuration="1m51.309036173s" podCreationTimestamp="2025-10-02 10:54:01 +0000 UTC" firstStartedPulling="2025-10-02 10:54:03.512164826 +0000 UTC m=+158.455035770" lastFinishedPulling="2025-10-02 10:55:51.720128788 +0000 UTC m=+266.662999732" observedRunningTime="2025-10-02 10:55:52.30676435 +0000 UTC m=+267.249635294" watchObservedRunningTime="2025-10-02 10:55:52.309036173 +0000 UTC m=+267.251907117"
Oct 02 10:55:52 crc kubenswrapper[4766]: I1002 10:55:52.342368 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-87d2m" podStartSLOduration=3.049064358 podStartE2EDuration="1m53.342350368s" podCreationTimestamp="2025-10-02 10:53:59 +0000 UTC" firstStartedPulling="2025-10-02 10:54:01.475560452 +0000 UTC m=+156.418431396" lastFinishedPulling="2025-10-02 10:55:51.768846462 +0000 UTC m=+266.711717406" observedRunningTime="2025-10-02 10:55:52.341641025 +0000 UTC m=+267.284511979" watchObservedRunningTime="2025-10-02 10:55:52.342350368 +0000 UTC m=+267.285221302"
Oct 02 10:55:53 crc kubenswrapper[4766]: I1002 10:55:53.264360 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plr97" event={"ID":"9bbf587f-0557-445b-ac38-0bf602f222a4","Type":"ContainerStarted","Data":"306f49595c6bd211d1894524cff9868ebdd5d8cfbd2636df72981ced298eaa37"}
Oct 02 10:55:53 crc kubenswrapper[4766]: I1002 10:55:53.267289 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npvrp" event={"ID":"bb5d66be-21fe-4237-b616-a8c4f41f5f14","Type":"ContainerStarted","Data":"e064e7f4deae82a6364cce579ee77fcc5fec7af8a8af134197876d757637f9e2"}
Oct 02 10:55:53 crc kubenswrapper[4766]: I1002 10:55:53.289209 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-plr97" podStartSLOduration=2.020534839 podStartE2EDuration="1m51.289192777s" podCreationTimestamp="2025-10-02 10:54:02 +0000 UTC" firstStartedPulling="2025-10-02 10:54:03.503921414 +0000 UTC m=+158.446792358" lastFinishedPulling="2025-10-02 10:55:52.772579352 +0000 UTC m=+267.715450296" observedRunningTime="2025-10-02 10:55:53.284872077 +0000 UTC m=+268.227743031" watchObservedRunningTime="2025-10-02 10:55:53.289192777 +0000 UTC m=+268.232063721"
Oct 02 10:55:53 crc kubenswrapper[4766]: I1002 10:55:53.305052 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-npvrp" podStartSLOduration=2.177089285 podStartE2EDuration="1m51.305034033s" podCreationTimestamp="2025-10-02 10:54:02 +0000 UTC" firstStartedPulling="2025-10-02 10:54:03.509811689 +0000 UTC m=+158.452682633" lastFinishedPulling="2025-10-02 10:55:52.637756437 +0000 UTC m=+267.580627381" observedRunningTime="2025-10-02 10:55:53.30096902 +0000 UTC m=+268.243839974" watchObservedRunningTime="2025-10-02 10:55:53.305034033 +0000 UTC m=+268.247904987"
Oct 02 10:55:59 crc kubenswrapper[4766]: I1002 10:55:59.362013 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m2kx2"]
Oct 02 10:55:59 crc kubenswrapper[4766]: I1002 10:55:59.474919 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m8788"
Oct 02 10:55:59 crc kubenswrapper[4766]: I1002 10:55:59.474973 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m8788"
Oct 02 10:55:59 crc kubenswrapper[4766]: I1002 10:55:59.531141 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m8788"
Oct 02 10:55:59 crc kubenswrapper[4766]: I1002 10:55:59.912158 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-87d2m"
Oct 02 10:55:59 crc kubenswrapper[4766]: I1002 10:55:59.912672 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-87d2m"
Oct 02 10:55:59 crc kubenswrapper[4766]: I1002 10:55:59.947978 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-87d2m"
Oct 02 10:56:00 crc kubenswrapper[4766]: I1002 10:56:00.338956 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-87d2m"
Oct 02 10:56:00 crc kubenswrapper[4766]: I1002 10:56:00.343096 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m8788"
Oct 02 10:56:01 crc kubenswrapper[4766]: I1002 10:56:01.462538 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-87d2m"]
Oct 02 10:56:02 crc kubenswrapper[4766]: I1002 10:56:02.039761 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:56:02 crc kubenswrapper[4766]: I1002 10:56:02.040108 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:56:02 crc kubenswrapper[4766]: I1002 10:56:02.077017 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:56:02 crc kubenswrapper[4766]: I1002 10:56:02.349862 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:56:02 crc kubenswrapper[4766]: I1002 10:56:02.693453 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-npvrp"
Oct 02 10:56:02 crc kubenswrapper[4766]: I1002 10:56:02.693575 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-npvrp"
Oct 02 10:56:02 crc kubenswrapper[4766]: I1002 10:56:02.749901 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-npvrp"
Oct 02 10:56:03 crc kubenswrapper[4766]: I1002 10:56:03.049252 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-plr97"
Oct 02 10:56:03 crc kubenswrapper[4766]: I1002 10:56:03.049518 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-plr97"
Oct 02 10:56:03 crc kubenswrapper[4766]: I1002 10:56:03.094996 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-plr97"
Oct 02 10:56:03 crc kubenswrapper[4766]: I1002 10:56:03.318229 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-87d2m" podUID="c6f0281f-0dee-4b1b-a4c0-35294ed0a88f" containerName="registry-server" containerID="cri-o://ba7f3a91d849a30472ed7ee86a4b1a4b5274d9be3e6b45abe14fa2b9be476f99" gracePeriod=2
Oct 02 10:56:03 crc kubenswrapper[4766]: I1002 10:56:03.357867 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-plr97"
Oct 02 10:56:03 crc kubenswrapper[4766]: I1002 10:56:03.362084 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-npvrp"
Oct 02 10:56:04 crc kubenswrapper[4766]: I1002 10:56:04.460659 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgvgz"]
Oct 02 10:56:04 crc kubenswrapper[4766]: I1002 10:56:04.461166 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vgvgz" podUID="d03ba5d0-89cc-4be3-b1f5-38fe6f006332" containerName="registry-server" containerID="cri-o://940f44f4a8f4eea2114540f9102bbf99a49f7b748fb696444c2830127a23bc18" gracePeriod=2
Oct 02 10:56:04 crc kubenswrapper[4766]: I1002 10:56:04.892195 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:56:04 crc kubenswrapper[4766]: I1002 10:56:04.945242 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-utilities\") pod \"d03ba5d0-89cc-4be3-b1f5-38fe6f006332\" (UID: \"d03ba5d0-89cc-4be3-b1f5-38fe6f006332\") "
Oct 02 10:56:04 crc kubenswrapper[4766]: I1002 10:56:04.945289 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-catalog-content\") pod \"d03ba5d0-89cc-4be3-b1f5-38fe6f006332\" (UID: \"d03ba5d0-89cc-4be3-b1f5-38fe6f006332\") "
Oct 02 10:56:04 crc kubenswrapper[4766]: I1002 10:56:04.945348 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx88s\" (UniqueName: \"kubernetes.io/projected/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-kube-api-access-qx88s\") pod \"d03ba5d0-89cc-4be3-b1f5-38fe6f006332\" (UID: \"d03ba5d0-89cc-4be3-b1f5-38fe6f006332\") "
Oct 02 10:56:04 crc kubenswrapper[4766]: I1002 10:56:04.946517 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-utilities" (OuterVolumeSpecName: "utilities") pod "d03ba5d0-89cc-4be3-b1f5-38fe6f006332" (UID: "d03ba5d0-89cc-4be3-b1f5-38fe6f006332"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 10:56:04 crc kubenswrapper[4766]: I1002 10:56:04.952042 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-kube-api-access-qx88s" (OuterVolumeSpecName: "kube-api-access-qx88s") pod "d03ba5d0-89cc-4be3-b1f5-38fe6f006332" (UID: "d03ba5d0-89cc-4be3-b1f5-38fe6f006332"). InnerVolumeSpecName "kube-api-access-qx88s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 10:56:04 crc kubenswrapper[4766]: I1002 10:56:04.960396 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d03ba5d0-89cc-4be3-b1f5-38fe6f006332" (UID: "d03ba5d0-89cc-4be3-b1f5-38fe6f006332"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.047090 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.047422 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.047434 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx88s\" (UniqueName: \"kubernetes.io/projected/d03ba5d0-89cc-4be3-b1f5-38fe6f006332-kube-api-access-qx88s\") on node \"crc\" DevicePath \"\""
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.330092 4766 generic.go:334] "Generic (PLEG): container finished" podID="c6f0281f-0dee-4b1b-a4c0-35294ed0a88f" containerID="ba7f3a91d849a30472ed7ee86a4b1a4b5274d9be3e6b45abe14fa2b9be476f99" exitCode=0
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.330154 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87d2m" event={"ID":"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f","Type":"ContainerDied","Data":"ba7f3a91d849a30472ed7ee86a4b1a4b5274d9be3e6b45abe14fa2b9be476f99"}
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.333925 4766 generic.go:334] "Generic (PLEG): container finished" podID="d03ba5d0-89cc-4be3-b1f5-38fe6f006332" containerID="940f44f4a8f4eea2114540f9102bbf99a49f7b748fb696444c2830127a23bc18" exitCode=0
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.334585 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vgvgz"
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.335721 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgvgz" event={"ID":"d03ba5d0-89cc-4be3-b1f5-38fe6f006332","Type":"ContainerDied","Data":"940f44f4a8f4eea2114540f9102bbf99a49f7b748fb696444c2830127a23bc18"}
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.335804 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vgvgz" event={"ID":"d03ba5d0-89cc-4be3-b1f5-38fe6f006332","Type":"ContainerDied","Data":"0848384d9552e695f730092ff71c49bb205f425448622f4ef922ca44031b9771"}
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.335830 4766 scope.go:117] "RemoveContainer" containerID="940f44f4a8f4eea2114540f9102bbf99a49f7b748fb696444c2830127a23bc18"
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.355472 4766 scope.go:117] "RemoveContainer" containerID="c003685b97187dbab1141f6909c3a3ce46d5548ae694b5f4c52921c8921ff704"
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.362860 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgvgz"]
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.366414 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vgvgz"]
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.375216 4766 scope.go:117] "RemoveContainer" containerID="028e7e72b40d1a7994c8c1be2e91243e2ca4dab24b6004206c97e91a08ce5eb7"
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.393264 4766 scope.go:117] "RemoveContainer" containerID="940f44f4a8f4eea2114540f9102bbf99a49f7b748fb696444c2830127a23bc18"
Oct 02 10:56:05 crc kubenswrapper[4766]: E1002 10:56:05.393650 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"940f44f4a8f4eea2114540f9102bbf99a49f7b748fb696444c2830127a23bc18\": container with ID starting with 940f44f4a8f4eea2114540f9102bbf99a49f7b748fb696444c2830127a23bc18 not found: ID does not exist" containerID="940f44f4a8f4eea2114540f9102bbf99a49f7b748fb696444c2830127a23bc18"
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.393679 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"940f44f4a8f4eea2114540f9102bbf99a49f7b748fb696444c2830127a23bc18"} err="failed to get container status \"940f44f4a8f4eea2114540f9102bbf99a49f7b748fb696444c2830127a23bc18\": rpc error: code = NotFound desc = could not find container \"940f44f4a8f4eea2114540f9102bbf99a49f7b748fb696444c2830127a23bc18\": container with ID starting with 940f44f4a8f4eea2114540f9102bbf99a49f7b748fb696444c2830127a23bc18 not found: ID does not exist"
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.393701 4766 scope.go:117] "RemoveContainer" containerID="c003685b97187dbab1141f6909c3a3ce46d5548ae694b5f4c52921c8921ff704"
Oct 02 10:56:05 crc kubenswrapper[4766]: E1002 10:56:05.394039 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c003685b97187dbab1141f6909c3a3ce46d5548ae694b5f4c52921c8921ff704\": container with ID starting with c003685b97187dbab1141f6909c3a3ce46d5548ae694b5f4c52921c8921ff704 not found: ID does not exist" containerID="c003685b97187dbab1141f6909c3a3ce46d5548ae694b5f4c52921c8921ff704"
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.394091 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c003685b97187dbab1141f6909c3a3ce46d5548ae694b5f4c52921c8921ff704"} err="failed to get container status \"c003685b97187dbab1141f6909c3a3ce46d5548ae694b5f4c52921c8921ff704\": rpc error: code = NotFound desc = could not find container \"c003685b97187dbab1141f6909c3a3ce46d5548ae694b5f4c52921c8921ff704\": container with ID starting with c003685b97187dbab1141f6909c3a3ce46d5548ae694b5f4c52921c8921ff704 not found: ID does not exist"
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.394123 4766 scope.go:117] "RemoveContainer" containerID="028e7e72b40d1a7994c8c1be2e91243e2ca4dab24b6004206c97e91a08ce5eb7"
Oct 02 10:56:05 crc kubenswrapper[4766]: E1002 10:56:05.394412 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"028e7e72b40d1a7994c8c1be2e91243e2ca4dab24b6004206c97e91a08ce5eb7\": container with ID starting with 028e7e72b40d1a7994c8c1be2e91243e2ca4dab24b6004206c97e91a08ce5eb7 not found: ID does not exist" containerID="028e7e72b40d1a7994c8c1be2e91243e2ca4dab24b6004206c97e91a08ce5eb7"
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.394439 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"028e7e72b40d1a7994c8c1be2e91243e2ca4dab24b6004206c97e91a08ce5eb7"} err="failed to get container status \"028e7e72b40d1a7994c8c1be2e91243e2ca4dab24b6004206c97e91a08ce5eb7\": rpc error: code = NotFound desc = could not find container \"028e7e72b40d1a7994c8c1be2e91243e2ca4dab24b6004206c97e91a08ce5eb7\": container with ID starting with 028e7e72b40d1a7994c8c1be2e91243e2ca4dab24b6004206c97e91a08ce5eb7 not found: ID does not exist"
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.466339 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-plr97"]
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.837425 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-87d2m"
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.907981 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d03ba5d0-89cc-4be3-b1f5-38fe6f006332" path="/var/lib/kubelet/pods/d03ba5d0-89cc-4be3-b1f5-38fe6f006332/volumes"
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.959769 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-catalog-content\") pod \"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f\" (UID: \"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f\") "
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.959848 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v48nk\" (UniqueName: \"kubernetes.io/projected/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-kube-api-access-v48nk\") pod \"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f\" (UID: \"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f\") "
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.959870 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-utilities\") pod \"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f\" (UID: \"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f\") "
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.960838 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-utilities" (OuterVolumeSpecName: "utilities") pod "c6f0281f-0dee-4b1b-a4c0-35294ed0a88f" (UID: "c6f0281f-0dee-4b1b-a4c0-35294ed0a88f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 10:56:05 crc kubenswrapper[4766]: I1002 10:56:05.966711 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-kube-api-access-v48nk" (OuterVolumeSpecName: "kube-api-access-v48nk") pod "c6f0281f-0dee-4b1b-a4c0-35294ed0a88f" (UID: "c6f0281f-0dee-4b1b-a4c0-35294ed0a88f"). InnerVolumeSpecName "kube-api-access-v48nk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 10:56:06 crc kubenswrapper[4766]: I1002 10:56:06.003610 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6f0281f-0dee-4b1b-a4c0-35294ed0a88f" (UID: "c6f0281f-0dee-4b1b-a4c0-35294ed0a88f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 10:56:06 crc kubenswrapper[4766]: I1002 10:56:06.068176 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 10:56:06 crc kubenswrapper[4766]: I1002 10:56:06.068209 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v48nk\" (UniqueName: \"kubernetes.io/projected/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-kube-api-access-v48nk\") on node \"crc\" DevicePath \"\""
Oct 02 10:56:06 crc kubenswrapper[4766]: I1002 10:56:06.068222 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 10:56:06 crc kubenswrapper[4766]: I1002 10:56:06.341116 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-87d2m"
Oct 02 10:56:06 crc kubenswrapper[4766]: I1002 10:56:06.341109 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87d2m" event={"ID":"c6f0281f-0dee-4b1b-a4c0-35294ed0a88f","Type":"ContainerDied","Data":"f243f1845022e6420a6dd625bbf9d9b9ec55dfbf40a96e2391582b12ee97cb09"}
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:06.341202 4766 scope.go:117] "RemoveContainer" containerID="ba7f3a91d849a30472ed7ee86a4b1a4b5274d9be3e6b45abe14fa2b9be476f99"
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:06.341277 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-plr97" podUID="9bbf587f-0557-445b-ac38-0bf602f222a4" containerName="registry-server" containerID="cri-o://306f49595c6bd211d1894524cff9868ebdd5d8cfbd2636df72981ced298eaa37" gracePeriod=2
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:06.358093 4766 scope.go:117] "RemoveContainer" containerID="fe6c1258dd13fd66ce0596f7c508cf96a3ae1c84e90b668a6f901f9a0004fcd0"
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:06.366621 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-87d2m"]
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:06.370046 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-87d2m"]
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:06.396118 4766 scope.go:117] "RemoveContainer" containerID="f37fb9acc461d9e5ee883e55a7c6d3545199f29015bab7e738e132c1d490134a"
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:07.347326 4766 generic.go:334] "Generic (PLEG): container finished" podID="9bbf587f-0557-445b-ac38-0bf602f222a4" containerID="306f49595c6bd211d1894524cff9868ebdd5d8cfbd2636df72981ced298eaa37" exitCode=0
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:07.347407 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plr97" event={"ID":"9bbf587f-0557-445b-ac38-0bf602f222a4","Type":"ContainerDied","Data":"306f49595c6bd211d1894524cff9868ebdd5d8cfbd2636df72981ced298eaa37"}
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:07.778156 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-plr97"
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:07.887401 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6f0281f-0dee-4b1b-a4c0-35294ed0a88f" path="/var/lib/kubelet/pods/c6f0281f-0dee-4b1b-a4c0-35294ed0a88f/volumes"
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:07.890164 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bbf587f-0557-445b-ac38-0bf602f222a4-utilities\") pod \"9bbf587f-0557-445b-ac38-0bf602f222a4\" (UID: \"9bbf587f-0557-445b-ac38-0bf602f222a4\") "
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:07.890258 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bbf587f-0557-445b-ac38-0bf602f222a4-catalog-content\") pod \"9bbf587f-0557-445b-ac38-0bf602f222a4\" (UID: \"9bbf587f-0557-445b-ac38-0bf602f222a4\") "
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:07.890319 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mxrk\" (UniqueName: \"kubernetes.io/projected/9bbf587f-0557-445b-ac38-0bf602f222a4-kube-api-access-4mxrk\") pod \"9bbf587f-0557-445b-ac38-0bf602f222a4\" (UID: \"9bbf587f-0557-445b-ac38-0bf602f222a4\") "
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:07.890941 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bbf587f-0557-445b-ac38-0bf602f222a4-utilities" (OuterVolumeSpecName: "utilities") pod "9bbf587f-0557-445b-ac38-0bf602f222a4" (UID: "9bbf587f-0557-445b-ac38-0bf602f222a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:07.893942 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bbf587f-0557-445b-ac38-0bf602f222a4-kube-api-access-4mxrk" (OuterVolumeSpecName: "kube-api-access-4mxrk") pod "9bbf587f-0557-445b-ac38-0bf602f222a4" (UID: "9bbf587f-0557-445b-ac38-0bf602f222a4"). InnerVolumeSpecName "kube-api-access-4mxrk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:07.973380 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bbf587f-0557-445b-ac38-0bf602f222a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9bbf587f-0557-445b-ac38-0bf602f222a4" (UID: "9bbf587f-0557-445b-ac38-0bf602f222a4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:07.991897 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9bbf587f-0557-445b-ac38-0bf602f222a4-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:07.991924 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mxrk\" (UniqueName: \"kubernetes.io/projected/9bbf587f-0557-445b-ac38-0bf602f222a4-kube-api-access-4mxrk\") on node \"crc\" DevicePath \"\""
Oct 02 10:56:07 crc kubenswrapper[4766]: I1002 10:56:07.991935 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9bbf587f-0557-445b-ac38-0bf602f222a4-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 10:56:08 crc kubenswrapper[4766]: I1002 10:56:08.355200 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-plr97" event={"ID":"9bbf587f-0557-445b-ac38-0bf602f222a4","Type":"ContainerDied","Data":"3d47c5f588ed00f17e6d5ca6c5705a59bfbc8ed8b2b50f60124080da5e1a6505"}
Oct 02 10:56:08 crc kubenswrapper[4766]: I1002 10:56:08.355261 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-plr97"
Oct 02 10:56:08 crc kubenswrapper[4766]: I1002 10:56:08.355264 4766 scope.go:117] "RemoveContainer" containerID="306f49595c6bd211d1894524cff9868ebdd5d8cfbd2636df72981ced298eaa37"
Oct 02 10:56:08 crc kubenswrapper[4766]: I1002 10:56:08.370693 4766 scope.go:117] "RemoveContainer" containerID="ba4dbfc05707ef9a2f2571d55d0dbfd589962266bd2bf058541e4b4cc5261579"
Oct 02 10:56:08 crc kubenswrapper[4766]: I1002 10:56:08.383532 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-plr97"]
Oct 02 10:56:08 crc kubenswrapper[4766]: I1002 10:56:08.385933 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-plr97"]
Oct 02 10:56:08 crc kubenswrapper[4766]: I1002 10:56:08.406449 4766 scope.go:117] "RemoveContainer" containerID="155679f7cb7fec90088d36a8def3561c8cd2dd13d044251ee13b97d2a39b69f0"
Oct 02 10:56:09 crc kubenswrapper[4766]: I1002 10:56:09.887337 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bbf587f-0557-445b-ac38-0bf602f222a4" path="/var/lib/kubelet/pods/9bbf587f-0557-445b-ac38-0bf602f222a4/volumes"
Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.388075 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" podUID="d24e91e7-dadf-4b67-be1b-a945b1250017" containerName="oauth-openshift" containerID="cri-o://767ea522b236dfacd8fcf1666d4426e2f68e1488d2ad20a2fb8be28ddaaaed8e" gracePeriod=15
Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.702770 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.735552 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-66c58d94fc-fdxmt"] Oct 02 10:56:24 crc kubenswrapper[4766]: E1002 10:56:24.735745 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f0281f-0dee-4b1b-a4c0-35294ed0a88f" containerName="extract-content" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.735755 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f0281f-0dee-4b1b-a4c0-35294ed0a88f" containerName="extract-content" Oct 02 10:56:24 crc kubenswrapper[4766]: E1002 10:56:24.735765 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bbf587f-0557-445b-ac38-0bf602f222a4" containerName="registry-server" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.735771 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bbf587f-0557-445b-ac38-0bf602f222a4" containerName="registry-server" Oct 02 10:56:24 crc kubenswrapper[4766]: E1002 10:56:24.735780 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d24e91e7-dadf-4b67-be1b-a945b1250017" containerName="oauth-openshift" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.735786 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d24e91e7-dadf-4b67-be1b-a945b1250017" containerName="oauth-openshift" Oct 02 10:56:24 crc kubenswrapper[4766]: E1002 10:56:24.735794 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d03ba5d0-89cc-4be3-b1f5-38fe6f006332" containerName="extract-content" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.735800 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d03ba5d0-89cc-4be3-b1f5-38fe6f006332" containerName="extract-content" Oct 02 10:56:24 crc kubenswrapper[4766]: E1002 10:56:24.735806 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f0281f-0dee-4b1b-a4c0-35294ed0a88f" containerName="registry-server" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.735813 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f0281f-0dee-4b1b-a4c0-35294ed0a88f" containerName="registry-server" Oct 02 10:56:24 crc kubenswrapper[4766]: E1002 10:56:24.735822 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bbf587f-0557-445b-ac38-0bf602f222a4" containerName="extract-content" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.735828 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bbf587f-0557-445b-ac38-0bf602f222a4" containerName="extract-content" Oct 02 10:56:24 crc kubenswrapper[4766]: E1002 10:56:24.735837 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f0281f-0dee-4b1b-a4c0-35294ed0a88f" containerName="extract-utilities" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.735842 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f0281f-0dee-4b1b-a4c0-35294ed0a88f" containerName="extract-utilities" Oct 02 10:56:24 crc kubenswrapper[4766]: E1002 10:56:24.735850 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d03ba5d0-89cc-4be3-b1f5-38fe6f006332" containerName="registry-server" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.735855 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d03ba5d0-89cc-4be3-b1f5-38fe6f006332" containerName="registry-server" Oct 02 10:56:24 crc kubenswrapper[4766]: E1002 10:56:24.735866 4766 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="d03ba5d0-89cc-4be3-b1f5-38fe6f006332" containerName="extract-utilities" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.735872 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d03ba5d0-89cc-4be3-b1f5-38fe6f006332" containerName="extract-utilities" Oct 02 10:56:24 crc kubenswrapper[4766]: E1002 10:56:24.735882 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bbf587f-0557-445b-ac38-0bf602f222a4" containerName="extract-utilities" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.735888 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bbf587f-0557-445b-ac38-0bf602f222a4" containerName="extract-utilities" Oct 02 10:56:24 crc kubenswrapper[4766]: E1002 10:56:24.735895 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47362ad1-8f17-4633-bf4f-6aff2ebda031" containerName="extract-content" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.735900 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="47362ad1-8f17-4633-bf4f-6aff2ebda031" containerName="extract-content" Oct 02 10:56:24 crc kubenswrapper[4766]: E1002 10:56:24.735908 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47362ad1-8f17-4633-bf4f-6aff2ebda031" containerName="registry-server" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.735914 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="47362ad1-8f17-4633-bf4f-6aff2ebda031" containerName="registry-server" Oct 02 10:56:24 crc kubenswrapper[4766]: E1002 10:56:24.735923 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47362ad1-8f17-4633-bf4f-6aff2ebda031" containerName="extract-utilities" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.735929 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="47362ad1-8f17-4633-bf4f-6aff2ebda031" containerName="extract-utilities" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.736011 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bbf587f-0557-445b-ac38-0bf602f222a4" containerName="registry-server" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.736021 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="47362ad1-8f17-4633-bf4f-6aff2ebda031" containerName="registry-server" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.736031 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6f0281f-0dee-4b1b-a4c0-35294ed0a88f" containerName="registry-server" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.736039 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d03ba5d0-89cc-4be3-b1f5-38fe6f006332" containerName="registry-server" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.736047 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d24e91e7-dadf-4b67-be1b-a945b1250017" containerName="oauth-openshift" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.736404 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.750443 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66c58d94fc-fdxmt"] Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.811407 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d24e91e7-dadf-4b67-be1b-a945b1250017-audit-dir\") pod \"d24e91e7-dadf-4b67-be1b-a945b1250017\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.811704 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-cliconfig\") pod \"d24e91e7-dadf-4b67-be1b-a945b1250017\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.811819 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-idp-0-file-data\") pod \"d24e91e7-dadf-4b67-be1b-a945b1250017\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.811916 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-audit-policies\") pod \"d24e91e7-dadf-4b67-be1b-a945b1250017\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.812036 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-service-ca\") pod \"d24e91e7-dadf-4b67-be1b-a945b1250017\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.812141 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-provider-selection\") pod \"d24e91e7-dadf-4b67-be1b-a945b1250017\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.812218 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-session\") pod \"d24e91e7-dadf-4b67-be1b-a945b1250017\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.812319 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-router-certs\") pod \"d24e91e7-dadf-4b67-be1b-a945b1250017\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.812409 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-login\") pod \"d24e91e7-dadf-4b67-be1b-a945b1250017\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.812483 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-error\") pod \"d24e91e7-dadf-4b67-be1b-a945b1250017\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.812611 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stmtk\" (UniqueName: \"kubernetes.io/projected/d24e91e7-dadf-4b67-be1b-a945b1250017-kube-api-access-stmtk\") pod \"d24e91e7-dadf-4b67-be1b-a945b1250017\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.812812 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-serving-cert\") pod \"d24e91e7-dadf-4b67-be1b-a945b1250017\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.812935 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-trusted-ca-bundle\") pod \"d24e91e7-dadf-4b67-be1b-a945b1250017\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.813059 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-ocp-branding-template\") pod \"d24e91e7-dadf-4b67-be1b-a945b1250017\" (UID: \"d24e91e7-dadf-4b67-be1b-a945b1250017\") " Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.811544 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d24e91e7-dadf-4b67-be1b-a945b1250017-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d24e91e7-dadf-4b67-be1b-a945b1250017" (UID: "d24e91e7-dadf-4b67-be1b-a945b1250017"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.812781 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d24e91e7-dadf-4b67-be1b-a945b1250017" (UID: "d24e91e7-dadf-4b67-be1b-a945b1250017"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.812801 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d24e91e7-dadf-4b67-be1b-a945b1250017" (UID: "d24e91e7-dadf-4b67-be1b-a945b1250017"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.812846 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d24e91e7-dadf-4b67-be1b-a945b1250017" (UID: "d24e91e7-dadf-4b67-be1b-a945b1250017"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.813884 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d24e91e7-dadf-4b67-be1b-a945b1250017" (UID: "d24e91e7-dadf-4b67-be1b-a945b1250017"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.814151 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.814173 4766 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d24e91e7-dadf-4b67-be1b-a945b1250017-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.814186 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.814197 4766 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.814208 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.820182 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d24e91e7-dadf-4b67-be1b-a945b1250017" (UID: "d24e91e7-dadf-4b67-be1b-a945b1250017"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.820656 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d24e91e7-dadf-4b67-be1b-a945b1250017-kube-api-access-stmtk" (OuterVolumeSpecName: "kube-api-access-stmtk") pod "d24e91e7-dadf-4b67-be1b-a945b1250017" (UID: "d24e91e7-dadf-4b67-be1b-a945b1250017"). InnerVolumeSpecName "kube-api-access-stmtk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.820989 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d24e91e7-dadf-4b67-be1b-a945b1250017" (UID: "d24e91e7-dadf-4b67-be1b-a945b1250017"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.821350 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d24e91e7-dadf-4b67-be1b-a945b1250017" (UID: "d24e91e7-dadf-4b67-be1b-a945b1250017"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.821592 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d24e91e7-dadf-4b67-be1b-a945b1250017" (UID: "d24e91e7-dadf-4b67-be1b-a945b1250017"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.821829 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d24e91e7-dadf-4b67-be1b-a945b1250017" (UID: "d24e91e7-dadf-4b67-be1b-a945b1250017"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.821934 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d24e91e7-dadf-4b67-be1b-a945b1250017" (UID: "d24e91e7-dadf-4b67-be1b-a945b1250017"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.824213 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d24e91e7-dadf-4b67-be1b-a945b1250017" (UID: "d24e91e7-dadf-4b67-be1b-a945b1250017"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.829150 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d24e91e7-dadf-4b67-be1b-a945b1250017" (UID: "d24e91e7-dadf-4b67-be1b-a945b1250017"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.915216 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-session\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.915289 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.915319 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.915340 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.915359 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dptq\" (UniqueName: \"kubernetes.io/projected/9d4c51e8-3e90-436e-a275-e7538580d8d2-kube-api-access-8dptq\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.915421 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-user-template-error\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.915448 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.915475 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.915602 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-router-certs\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.915741 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9d4c51e8-3e90-436e-a275-e7538580d8d2-audit-policies\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.915768 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-user-template-login\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.915792 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-service-ca\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.915863 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9d4c51e8-3e90-436e-a275-e7538580d8d2-audit-dir\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.915932 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.916033 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.916053 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-provider-selection\") on 
node \"crc\" DevicePath \"\"" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.916065 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.916075 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.916085 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.916094 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.916105 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stmtk\" (UniqueName: \"kubernetes.io/projected/d24e91e7-dadf-4b67-be1b-a945b1250017-kube-api-access-stmtk\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.916114 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:24 crc kubenswrapper[4766]: I1002 10:56:24.916124 4766 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d24e91e7-dadf-4b67-be1b-a945b1250017-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.016996 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.017046 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-router-certs\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.017110 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9d4c51e8-3e90-436e-a275-e7538580d8d2-audit-policies\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.017144 4766 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-user-template-login\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.017197 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-service-ca\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.017218 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9d4c51e8-3e90-436e-a275-e7538580d8d2-audit-dir\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.017271 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.017292 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-session\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.017338 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.017365 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.017384 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.017408 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dptq\" 
(UniqueName: \"kubernetes.io/projected/9d4c51e8-3e90-436e-a275-e7538580d8d2-kube-api-access-8dptq\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.017438 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-user-template-error\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.017477 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.018109 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9d4c51e8-3e90-436e-a275-e7538580d8d2-audit-dir\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.018670 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-service-ca\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.018865 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.019147 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9d4c51e8-3e90-436e-a275-e7538580d8d2-audit-policies\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.019246 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.020717 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.020744 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-router-certs\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.021033 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-user-template-error\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.021192 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-user-template-login\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.021426 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-session\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.022848 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.023657 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.023820 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9d4c51e8-3e90-436e-a275-e7538580d8d2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.033274 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dptq\" (UniqueName: 
\"kubernetes.io/projected/9d4c51e8-3e90-436e-a275-e7538580d8d2-kube-api-access-8dptq\") pod \"oauth-openshift-66c58d94fc-fdxmt\" (UID: \"9d4c51e8-3e90-436e-a275-e7538580d8d2\") " pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.059467 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.221223 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66c58d94fc-fdxmt"] Oct 02 10:56:25 crc kubenswrapper[4766]: W1002 10:56:25.228396 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d4c51e8_3e90_436e_a275_e7538580d8d2.slice/crio-bcc7acd7e327385a2ac200245dcdd7c1a26d789cb5be614eb107f0b141ae07ea WatchSource:0}: Error finding container bcc7acd7e327385a2ac200245dcdd7c1a26d789cb5be614eb107f0b141ae07ea: Status 404 returned error can't find the container with id bcc7acd7e327385a2ac200245dcdd7c1a26d789cb5be614eb107f0b141ae07ea Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.454952 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" event={"ID":"9d4c51e8-3e90-436e-a275-e7538580d8d2","Type":"ContainerStarted","Data":"bcc7acd7e327385a2ac200245dcdd7c1a26d789cb5be614eb107f0b141ae07ea"} Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.456165 4766 generic.go:334] "Generic (PLEG): container finished" podID="d24e91e7-dadf-4b67-be1b-a945b1250017" containerID="767ea522b236dfacd8fcf1666d4426e2f68e1488d2ad20a2fb8be28ddaaaed8e" exitCode=0 Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.456240 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" event={"ID":"d24e91e7-dadf-4b67-be1b-a945b1250017","Type":"ContainerDied","Data":"767ea522b236dfacd8fcf1666d4426e2f68e1488d2ad20a2fb8be28ddaaaed8e"} Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.456260 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.456275 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-m2kx2" event={"ID":"d24e91e7-dadf-4b67-be1b-a945b1250017","Type":"ContainerDied","Data":"fa36233076b97946cc538d98600057e8767c67b07eb04e728f5de6d612c85606"} Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.456296 4766 scope.go:117] "RemoveContainer" containerID="767ea522b236dfacd8fcf1666d4426e2f68e1488d2ad20a2fb8be28ddaaaed8e" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.471365 4766 scope.go:117] "RemoveContainer" containerID="767ea522b236dfacd8fcf1666d4426e2f68e1488d2ad20a2fb8be28ddaaaed8e" Oct 02 10:56:25 crc kubenswrapper[4766]: E1002 10:56:25.471754 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"767ea522b236dfacd8fcf1666d4426e2f68e1488d2ad20a2fb8be28ddaaaed8e\": container with ID starting with 767ea522b236dfacd8fcf1666d4426e2f68e1488d2ad20a2fb8be28ddaaaed8e not found: ID does not exist" containerID="767ea522b236dfacd8fcf1666d4426e2f68e1488d2ad20a2fb8be28ddaaaed8e" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.471783 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"767ea522b236dfacd8fcf1666d4426e2f68e1488d2ad20a2fb8be28ddaaaed8e"} err="failed to get container status \"767ea522b236dfacd8fcf1666d4426e2f68e1488d2ad20a2fb8be28ddaaaed8e\": rpc error: code = NotFound desc = could not find container \"767ea522b236dfacd8fcf1666d4426e2f68e1488d2ad20a2fb8be28ddaaaed8e\": container with ID starting with 767ea522b236dfacd8fcf1666d4426e2f68e1488d2ad20a2fb8be28ddaaaed8e not found: ID does not exist" Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.486899 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m2kx2"] Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.491103 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-m2kx2"] Oct 02 10:56:25 crc kubenswrapper[4766]: I1002 10:56:25.888666 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d24e91e7-dadf-4b67-be1b-a945b1250017" path="/var/lib/kubelet/pods/d24e91e7-dadf-4b67-be1b-a945b1250017/volumes" Oct 02 10:56:26 crc kubenswrapper[4766]: I1002 10:56:26.464393 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" event={"ID":"9d4c51e8-3e90-436e-a275-e7538580d8d2","Type":"ContainerStarted","Data":"9ee864211908e1d447b575d470cae61e58a92f43afc4d14298c8c30f8c7c078f"} Oct 02 10:56:26 crc kubenswrapper[4766]: I1002 10:56:26.464629 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:26 crc kubenswrapper[4766]: I1002 10:56:26.469400 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" Oct 02 10:56:26 crc kubenswrapper[4766]: I1002 10:56:26.485859 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-66c58d94fc-fdxmt" podStartSLOduration=27.485844065 podStartE2EDuration="27.485844065s" podCreationTimestamp="2025-10-02 10:55:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:56:26.485172553 +0000 UTC m=+301.428043527" watchObservedRunningTime="2025-10-02 10:56:26.485844065 +0000 UTC m=+301.428715009" Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.621259 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5mmm7"] Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.622159 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5mmm7" podUID="9a491a4e-eefa-4908-8e7b-1d5c3e67274c" containerName="registry-server" containerID="cri-o://9ccb90a5e883e26604eab9b292fe43ba1b9b0c8e93748b0179c798cf35c8b7e0" gracePeriod=30 Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.628280 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m8788"] Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.628572 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m8788" podUID="d00b5247-9e12-4202-ae31-20d454dfa183" containerName="registry-server" containerID="cri-o://f30c9b5668ede69240d70ad01c553dd1d7f1bab01e6df14524ff6b38d9879631" gracePeriod=30 Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.660733 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vkm2v"] Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.660961 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" podUID="a57c2a77-db59-4b73-b376-640de2af9a7e" containerName="marketplace-operator" containerID="cri-o://fd592f4eaa72c7cc60d59e29049bd321e3843dbfd1355aa26fb9e593fcd2862a" gracePeriod=30 Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.680359 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgj4n"] Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.680812 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rgj4n" podUID="8e1a892b-5e21-4f62-9859-d4e4a8ef9623" containerName="registry-server" containerID="cri-o://9b6a44c0fd03ae35e84e43af15c18cd0d0ec872ed1aac6bf92c020fe135f9051" gracePeriod=30 Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.683898 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-npvrp"] Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.691171 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-npvrp" podUID="bb5d66be-21fe-4237-b616-a8c4f41f5f14" containerName="registry-server" containerID="cri-o://e064e7f4deae82a6364cce579ee77fcc5fec7af8a8af134197876d757637f9e2" gracePeriod=30 Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.691624 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7wx9g"] Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.692336 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7wx9g" Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.713400 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7wx9g"] Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.864929 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6chm7\" (UniqueName: \"kubernetes.io/projected/9677731c-12a8-4fa5-b5c1-ba1238a7f315-kube-api-access-6chm7\") pod \"marketplace-operator-79b997595-7wx9g\" (UID: \"9677731c-12a8-4fa5-b5c1-ba1238a7f315\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wx9g" Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.865020 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9677731c-12a8-4fa5-b5c1-ba1238a7f315-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7wx9g\" (UID: \"9677731c-12a8-4fa5-b5c1-ba1238a7f315\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wx9g" Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.865101 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9677731c-12a8-4fa5-b5c1-ba1238a7f315-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7wx9g\" (UID: \"9677731c-12a8-4fa5-b5c1-ba1238a7f315\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wx9g" Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.966524 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9677731c-12a8-4fa5-b5c1-ba1238a7f315-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7wx9g\" (UID: \"9677731c-12a8-4fa5-b5c1-ba1238a7f315\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wx9g" Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.966619 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6chm7\" (UniqueName: \"kubernetes.io/projected/9677731c-12a8-4fa5-b5c1-ba1238a7f315-kube-api-access-6chm7\") pod \"marketplace-operator-79b997595-7wx9g\" (UID: \"9677731c-12a8-4fa5-b5c1-ba1238a7f315\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wx9g" Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.966716 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9677731c-12a8-4fa5-b5c1-ba1238a7f315-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7wx9g\" (UID: \"9677731c-12a8-4fa5-b5c1-ba1238a7f315\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wx9g" Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.968406 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9677731c-12a8-4fa5-b5c1-ba1238a7f315-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-7wx9g\" (UID: \"9677731c-12a8-4fa5-b5c1-ba1238a7f315\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wx9g" Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.980370 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/9677731c-12a8-4fa5-b5c1-ba1238a7f315-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-7wx9g\" (UID: \"9677731c-12a8-4fa5-b5c1-ba1238a7f315\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wx9g" Oct 02 10:56:46 crc kubenswrapper[4766]: I1002 10:56:46.984752 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6chm7\" (UniqueName: \"kubernetes.io/projected/9677731c-12a8-4fa5-b5c1-ba1238a7f315-kube-api-access-6chm7\") pod \"marketplace-operator-79b997595-7wx9g\" (UID: \"9677731c-12a8-4fa5-b5c1-ba1238a7f315\") " pod="openshift-marketplace/marketplace-operator-79b997595-7wx9g" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.086338 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-7wx9g" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.088927 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5mmm7" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.098093 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.103257 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m8788" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.132045 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-npvrp" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.133694 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgj4n" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.271180 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf7mr\" (UniqueName: \"kubernetes.io/projected/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-kube-api-access-lf7mr\") pod \"8e1a892b-5e21-4f62-9859-d4e4a8ef9623\" (UID: \"8e1a892b-5e21-4f62-9859-d4e4a8ef9623\") " Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.271245 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d00b5247-9e12-4202-ae31-20d454dfa183-catalog-content\") pod \"d00b5247-9e12-4202-ae31-20d454dfa183\" (UID: \"d00b5247-9e12-4202-ae31-20d454dfa183\") " Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.271287 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4fgz\" (UniqueName: \"kubernetes.io/projected/bb5d66be-21fe-4237-b616-a8c4f41f5f14-kube-api-access-w4fgz\") pod \"bb5d66be-21fe-4237-b616-a8c4f41f5f14\" (UID: \"bb5d66be-21fe-4237-b616-a8c4f41f5f14\") " Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.271318 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d00b5247-9e12-4202-ae31-20d454dfa183-utilities\") pod \"d00b5247-9e12-4202-ae31-20d454dfa183\" (UID: \"d00b5247-9e12-4202-ae31-20d454dfa183\") " Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.271343 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb5d66be-21fe-4237-b616-a8c4f41f5f14-catalog-content\") pod \"bb5d66be-21fe-4237-b616-a8c4f41f5f14\" (UID: \"bb5d66be-21fe-4237-b616-a8c4f41f5f14\") " Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.271379 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a57c2a77-db59-4b73-b376-640de2af9a7e-marketplace-operator-metrics\") pod \"a57c2a77-db59-4b73-b376-640de2af9a7e\" (UID: \"a57c2a77-db59-4b73-b376-640de2af9a7e\") " Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.271402 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dvrj\" (UniqueName: \"kubernetes.io/projected/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-kube-api-access-5dvrj\") pod \"9a491a4e-eefa-4908-8e7b-1d5c3e67274c\" (UID: \"9a491a4e-eefa-4908-8e7b-1d5c3e67274c\") " Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.271435 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29lcx\" (UniqueName: \"kubernetes.io/projected/d00b5247-9e12-4202-ae31-20d454dfa183-kube-api-access-29lcx\") pod \"d00b5247-9e12-4202-ae31-20d454dfa183\" (UID: \"d00b5247-9e12-4202-ae31-20d454dfa183\") " Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.271464 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-utilities\") pod \"9a491a4e-eefa-4908-8e7b-1d5c3e67274c\" (UID: \"9a491a4e-eefa-4908-8e7b-1d5c3e67274c\") " Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.271531 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpg74\" (UniqueName: 
\"kubernetes.io/projected/a57c2a77-db59-4b73-b376-640de2af9a7e-kube-api-access-tpg74\") pod \"a57c2a77-db59-4b73-b376-640de2af9a7e\" (UID: \"a57c2a77-db59-4b73-b376-640de2af9a7e\") " Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.271554 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb5d66be-21fe-4237-b616-a8c4f41f5f14-utilities\") pod \"bb5d66be-21fe-4237-b616-a8c4f41f5f14\" (UID: \"bb5d66be-21fe-4237-b616-a8c4f41f5f14\") " Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.271583 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-catalog-content\") pod \"9a491a4e-eefa-4908-8e7b-1d5c3e67274c\" (UID: \"9a491a4e-eefa-4908-8e7b-1d5c3e67274c\") " Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.271620 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-utilities\") pod \"8e1a892b-5e21-4f62-9859-d4e4a8ef9623\" (UID: \"8e1a892b-5e21-4f62-9859-d4e4a8ef9623\") " Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.271658 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a57c2a77-db59-4b73-b376-640de2af9a7e-marketplace-trusted-ca\") pod \"a57c2a77-db59-4b73-b376-640de2af9a7e\" (UID: \"a57c2a77-db59-4b73-b376-640de2af9a7e\") " Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.271694 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-catalog-content\") pod \"8e1a892b-5e21-4f62-9859-d4e4a8ef9623\" (UID: \"8e1a892b-5e21-4f62-9859-d4e4a8ef9623\") " Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.272946 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d00b5247-9e12-4202-ae31-20d454dfa183-utilities" (OuterVolumeSpecName: "utilities") pod "d00b5247-9e12-4202-ae31-20d454dfa183" (UID: "d00b5247-9e12-4202-ae31-20d454dfa183"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.273349 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-utilities" (OuterVolumeSpecName: "utilities") pod "9a491a4e-eefa-4908-8e7b-1d5c3e67274c" (UID: "9a491a4e-eefa-4908-8e7b-1d5c3e67274c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.273762 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-utilities" (OuterVolumeSpecName: "utilities") pod "8e1a892b-5e21-4f62-9859-d4e4a8ef9623" (UID: "8e1a892b-5e21-4f62-9859-d4e4a8ef9623"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.273936 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a57c2a77-db59-4b73-b376-640de2af9a7e-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "a57c2a77-db59-4b73-b376-640de2af9a7e" (UID: "a57c2a77-db59-4b73-b376-640de2af9a7e"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.274105 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb5d66be-21fe-4237-b616-a8c4f41f5f14-utilities" (OuterVolumeSpecName: "utilities") pod "bb5d66be-21fe-4237-b616-a8c4f41f5f14" (UID: "bb5d66be-21fe-4237-b616-a8c4f41f5f14"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.276810 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-kube-api-access-5dvrj" (OuterVolumeSpecName: "kube-api-access-5dvrj") pod "9a491a4e-eefa-4908-8e7b-1d5c3e67274c" (UID: "9a491a4e-eefa-4908-8e7b-1d5c3e67274c"). InnerVolumeSpecName "kube-api-access-5dvrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.278239 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb5d66be-21fe-4237-b616-a8c4f41f5f14-kube-api-access-w4fgz" (OuterVolumeSpecName: "kube-api-access-w4fgz") pod "bb5d66be-21fe-4237-b616-a8c4f41f5f14" (UID: "bb5d66be-21fe-4237-b616-a8c4f41f5f14"). InnerVolumeSpecName "kube-api-access-w4fgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.278350 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d00b5247-9e12-4202-ae31-20d454dfa183-kube-api-access-29lcx" (OuterVolumeSpecName: "kube-api-access-29lcx") pod "d00b5247-9e12-4202-ae31-20d454dfa183" (UID: "d00b5247-9e12-4202-ae31-20d454dfa183"). InnerVolumeSpecName "kube-api-access-29lcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.278704 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a57c2a77-db59-4b73-b376-640de2af9a7e-kube-api-access-tpg74" (OuterVolumeSpecName: "kube-api-access-tpg74") pod "a57c2a77-db59-4b73-b376-640de2af9a7e" (UID: "a57c2a77-db59-4b73-b376-640de2af9a7e"). InnerVolumeSpecName "kube-api-access-tpg74". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.288701 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-kube-api-access-lf7mr" (OuterVolumeSpecName: "kube-api-access-lf7mr") pod "8e1a892b-5e21-4f62-9859-d4e4a8ef9623" (UID: "8e1a892b-5e21-4f62-9859-d4e4a8ef9623"). InnerVolumeSpecName "kube-api-access-lf7mr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.290174 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e1a892b-5e21-4f62-9859-d4e4a8ef9623" (UID: "8e1a892b-5e21-4f62-9859-d4e4a8ef9623"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.290738 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a57c2a77-db59-4b73-b376-640de2af9a7e-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "a57c2a77-db59-4b73-b376-640de2af9a7e" (UID: "a57c2a77-db59-4b73-b376-640de2af9a7e"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.331327 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a491a4e-eefa-4908-8e7b-1d5c3e67274c" (UID: "9a491a4e-eefa-4908-8e7b-1d5c3e67274c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.344762 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d00b5247-9e12-4202-ae31-20d454dfa183-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d00b5247-9e12-4202-ae31-20d454dfa183" (UID: "d00b5247-9e12-4202-ae31-20d454dfa183"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.373036 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29lcx\" (UniqueName: \"kubernetes.io/projected/d00b5247-9e12-4202-ae31-20d454dfa183-kube-api-access-29lcx\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.373089 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.373102 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpg74\" (UniqueName: \"kubernetes.io/projected/a57c2a77-db59-4b73-b376-640de2af9a7e-kube-api-access-tpg74\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.373141 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb5d66be-21fe-4237-b616-a8c4f41f5f14-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.373150 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.373160 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.373171 4766 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a57c2a77-db59-4b73-b376-640de2af9a7e-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.373180 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.373189 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lf7mr\" (UniqueName: \"kubernetes.io/projected/8e1a892b-5e21-4f62-9859-d4e4a8ef9623-kube-api-access-lf7mr\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.373199 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d00b5247-9e12-4202-ae31-20d454dfa183-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.373208 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4fgz\" (UniqueName: \"kubernetes.io/projected/bb5d66be-21fe-4237-b616-a8c4f41f5f14-kube-api-access-w4fgz\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.373217 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d00b5247-9e12-4202-ae31-20d454dfa183-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.373226 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dvrj\" (UniqueName: 
\"kubernetes.io/projected/9a491a4e-eefa-4908-8e7b-1d5c3e67274c-kube-api-access-5dvrj\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.373235 4766 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a57c2a77-db59-4b73-b376-640de2af9a7e-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.380479 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb5d66be-21fe-4237-b616-a8c4f41f5f14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb5d66be-21fe-4237-b616-a8c4f41f5f14" (UID: "bb5d66be-21fe-4237-b616-a8c4f41f5f14"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.474128 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb5d66be-21fe-4237-b616-a8c4f41f5f14-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.517455 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-7wx9g"] Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.572192 4766 generic.go:334] "Generic (PLEG): container finished" podID="d00b5247-9e12-4202-ae31-20d454dfa183" containerID="f30c9b5668ede69240d70ad01c553dd1d7f1bab01e6df14524ff6b38d9879631" exitCode=0 Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.572268 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8788" event={"ID":"d00b5247-9e12-4202-ae31-20d454dfa183","Type":"ContainerDied","Data":"f30c9b5668ede69240d70ad01c553dd1d7f1bab01e6df14524ff6b38d9879631"} Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.572322 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8788" event={"ID":"d00b5247-9e12-4202-ae31-20d454dfa183","Type":"ContainerDied","Data":"52b1a5d7e063a6062e070a624f19129537cc73f4be3d001ea4433a82a61bae31"} Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.572343 4766 scope.go:117] "RemoveContainer" containerID="f30c9b5668ede69240d70ad01c553dd1d7f1bab01e6df14524ff6b38d9879631" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.572284 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m8788" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.577220 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7wx9g" event={"ID":"9677731c-12a8-4fa5-b5c1-ba1238a7f315","Type":"ContainerStarted","Data":"146a058580a147a25eca89939b9d7d91a377c1755c75e7b0ce79757f4b9234b9"} Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.579808 4766 generic.go:334] "Generic (PLEG): container finished" podID="8e1a892b-5e21-4f62-9859-d4e4a8ef9623" containerID="9b6a44c0fd03ae35e84e43af15c18cd0d0ec872ed1aac6bf92c020fe135f9051" exitCode=0 Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.579866 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgj4n" event={"ID":"8e1a892b-5e21-4f62-9859-d4e4a8ef9623","Type":"ContainerDied","Data":"9b6a44c0fd03ae35e84e43af15c18cd0d0ec872ed1aac6bf92c020fe135f9051"} Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.579883 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rgj4n" event={"ID":"8e1a892b-5e21-4f62-9859-d4e4a8ef9623","Type":"ContainerDied","Data":"156d4bb23f641500df18203279e69b1581d53696c455e4c4ea8bdfebed4708db"} Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.579894 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rgj4n" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.581495 4766 generic.go:334] "Generic (PLEG): container finished" podID="a57c2a77-db59-4b73-b376-640de2af9a7e" containerID="fd592f4eaa72c7cc60d59e29049bd321e3843dbfd1355aa26fb9e593fcd2862a" exitCode=0 Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.581562 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.581569 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" event={"ID":"a57c2a77-db59-4b73-b376-640de2af9a7e","Type":"ContainerDied","Data":"fd592f4eaa72c7cc60d59e29049bd321e3843dbfd1355aa26fb9e593fcd2862a"} Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.581585 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vkm2v" event={"ID":"a57c2a77-db59-4b73-b376-640de2af9a7e","Type":"ContainerDied","Data":"46a2a604d643b7bb74a35fe0a4452af5c93be4f6b5c4422f48d4e14e4cd5e76b"} Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.584513 4766 generic.go:334] "Generic (PLEG): container finished" podID="9a491a4e-eefa-4908-8e7b-1d5c3e67274c" containerID="9ccb90a5e883e26604eab9b292fe43ba1b9b0c8e93748b0179c798cf35c8b7e0" exitCode=0 Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.584558 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5mmm7" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.584583 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5mmm7" event={"ID":"9a491a4e-eefa-4908-8e7b-1d5c3e67274c","Type":"ContainerDied","Data":"9ccb90a5e883e26604eab9b292fe43ba1b9b0c8e93748b0179c798cf35c8b7e0"} Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.584619 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5mmm7" event={"ID":"9a491a4e-eefa-4908-8e7b-1d5c3e67274c","Type":"ContainerDied","Data":"5ad6f96e85e84b91c8203a2743e9d76cefd71b6212d34488ac81f5857cd82f26"} Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.587543 4766 generic.go:334] "Generic (PLEG): container finished" podID="bb5d66be-21fe-4237-b616-a8c4f41f5f14" containerID="e064e7f4deae82a6364cce579ee77fcc5fec7af8a8af134197876d757637f9e2" exitCode=0 Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.587617 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npvrp" event={"ID":"bb5d66be-21fe-4237-b616-a8c4f41f5f14","Type":"ContainerDied","Data":"e064e7f4deae82a6364cce579ee77fcc5fec7af8a8af134197876d757637f9e2"} Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.587662 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npvrp" event={"ID":"bb5d66be-21fe-4237-b616-a8c4f41f5f14","Type":"ContainerDied","Data":"8861784ca2964f106940eea6b527a24879741be6c29f5754e0d2f23724222b82"} Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.587742 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-npvrp" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.599920 4766 scope.go:117] "RemoveContainer" containerID="4429e69ebb39117a0a033ab82ef3eb817316a84166139debc63cfc685ebdb9d5" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.620344 4766 scope.go:117] "RemoveContainer" containerID="436fbc50f5d0ba2b82408d9d0ec39e1bc35b58c695c138de51e0b4ac7ebfce97" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.637965 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m8788"] Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.642350 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m8788"] Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.653004 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vkm2v"] Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.671168 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vkm2v"] Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.671186 4766 scope.go:117] "RemoveContainer" containerID="f30c9b5668ede69240d70ad01c553dd1d7f1bab01e6df14524ff6b38d9879631" Oct 02 10:56:47 crc kubenswrapper[4766]: E1002 10:56:47.672969 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f30c9b5668ede69240d70ad01c553dd1d7f1bab01e6df14524ff6b38d9879631\": container with ID starting with f30c9b5668ede69240d70ad01c553dd1d7f1bab01e6df14524ff6b38d9879631 not found: ID does not exist" containerID="f30c9b5668ede69240d70ad01c553dd1d7f1bab01e6df14524ff6b38d9879631" Oct 02 10:56:47 crc 
kubenswrapper[4766]: I1002 10:56:47.673039 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f30c9b5668ede69240d70ad01c553dd1d7f1bab01e6df14524ff6b38d9879631"} err="failed to get container status \"f30c9b5668ede69240d70ad01c553dd1d7f1bab01e6df14524ff6b38d9879631\": rpc error: code = NotFound desc = could not find container \"f30c9b5668ede69240d70ad01c553dd1d7f1bab01e6df14524ff6b38d9879631\": container with ID starting with f30c9b5668ede69240d70ad01c553dd1d7f1bab01e6df14524ff6b38d9879631 not found: ID does not exist" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.673161 4766 scope.go:117] "RemoveContainer" containerID="4429e69ebb39117a0a033ab82ef3eb817316a84166139debc63cfc685ebdb9d5" Oct 02 10:56:47 crc kubenswrapper[4766]: E1002 10:56:47.675437 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4429e69ebb39117a0a033ab82ef3eb817316a84166139debc63cfc685ebdb9d5\": container with ID starting with 4429e69ebb39117a0a033ab82ef3eb817316a84166139debc63cfc685ebdb9d5 not found: ID does not exist" containerID="4429e69ebb39117a0a033ab82ef3eb817316a84166139debc63cfc685ebdb9d5" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.675495 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4429e69ebb39117a0a033ab82ef3eb817316a84166139debc63cfc685ebdb9d5"} err="failed to get container status \"4429e69ebb39117a0a033ab82ef3eb817316a84166139debc63cfc685ebdb9d5\": rpc error: code = NotFound desc = could not find container \"4429e69ebb39117a0a033ab82ef3eb817316a84166139debc63cfc685ebdb9d5\": container with ID starting with 4429e69ebb39117a0a033ab82ef3eb817316a84166139debc63cfc685ebdb9d5 not found: ID does not exist" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.675562 4766 scope.go:117] "RemoveContainer" containerID="436fbc50f5d0ba2b82408d9d0ec39e1bc35b58c695c138de51e0b4ac7ebfce97" Oct 02 10:56:47 crc kubenswrapper[4766]: E1002 10:56:47.676109 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"436fbc50f5d0ba2b82408d9d0ec39e1bc35b58c695c138de51e0b4ac7ebfce97\": container with ID starting with 436fbc50f5d0ba2b82408d9d0ec39e1bc35b58c695c138de51e0b4ac7ebfce97 not found: ID does not exist" containerID="436fbc50f5d0ba2b82408d9d0ec39e1bc35b58c695c138de51e0b4ac7ebfce97" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.676165 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"436fbc50f5d0ba2b82408d9d0ec39e1bc35b58c695c138de51e0b4ac7ebfce97"} err="failed to get container status \"436fbc50f5d0ba2b82408d9d0ec39e1bc35b58c695c138de51e0b4ac7ebfce97\": rpc error: code = NotFound desc = could not find container \"436fbc50f5d0ba2b82408d9d0ec39e1bc35b58c695c138de51e0b4ac7ebfce97\": container with ID starting with 436fbc50f5d0ba2b82408d9d0ec39e1bc35b58c695c138de51e0b4ac7ebfce97 not found: ID does not exist" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.676200 4766 scope.go:117] "RemoveContainer" containerID="9b6a44c0fd03ae35e84e43af15c18cd0d0ec872ed1aac6bf92c020fe135f9051" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.678075 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-npvrp"] Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.688552 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-operators-npvrp"] Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.700523 4766 scope.go:117] "RemoveContainer" containerID="142e0abee00e3d9ca8450600da3d8910177502f16c547d91c78efe9ee729e135" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.700634 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5mmm7"] Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.704767 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5mmm7"] Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.716697 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgj4n"] Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.717687 4766 scope.go:117] "RemoveContainer" containerID="d56a833cb57739632b9f84c95b0b320e869c7253cf0bc96d26ffd8fa4d8647f9" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.719139 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rgj4n"] Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.731434 4766 scope.go:117] "RemoveContainer" containerID="9b6a44c0fd03ae35e84e43af15c18cd0d0ec872ed1aac6bf92c020fe135f9051" Oct 02 10:56:47 crc kubenswrapper[4766]: E1002 10:56:47.732069 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b6a44c0fd03ae35e84e43af15c18cd0d0ec872ed1aac6bf92c020fe135f9051\": container with ID starting with 9b6a44c0fd03ae35e84e43af15c18cd0d0ec872ed1aac6bf92c020fe135f9051 not found: ID does not exist" containerID="9b6a44c0fd03ae35e84e43af15c18cd0d0ec872ed1aac6bf92c020fe135f9051" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.732112 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b6a44c0fd03ae35e84e43af15c18cd0d0ec872ed1aac6bf92c020fe135f9051"} err="failed to get container status \"9b6a44c0fd03ae35e84e43af15c18cd0d0ec872ed1aac6bf92c020fe135f9051\": rpc error: code = NotFound desc = could not find container \"9b6a44c0fd03ae35e84e43af15c18cd0d0ec872ed1aac6bf92c020fe135f9051\": container with ID starting with 9b6a44c0fd03ae35e84e43af15c18cd0d0ec872ed1aac6bf92c020fe135f9051 not found: ID does not exist" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.732162 4766 scope.go:117] "RemoveContainer" containerID="142e0abee00e3d9ca8450600da3d8910177502f16c547d91c78efe9ee729e135" Oct 02 10:56:47 crc kubenswrapper[4766]: E1002 10:56:47.732494 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"142e0abee00e3d9ca8450600da3d8910177502f16c547d91c78efe9ee729e135\": container with ID starting with 142e0abee00e3d9ca8450600da3d8910177502f16c547d91c78efe9ee729e135 not found: ID does not exist" containerID="142e0abee00e3d9ca8450600da3d8910177502f16c547d91c78efe9ee729e135" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.732542 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"142e0abee00e3d9ca8450600da3d8910177502f16c547d91c78efe9ee729e135"} err="failed to get container status \"142e0abee00e3d9ca8450600da3d8910177502f16c547d91c78efe9ee729e135\": rpc error: code = NotFound desc = could not find container \"142e0abee00e3d9ca8450600da3d8910177502f16c547d91c78efe9ee729e135\": container with ID starting with 142e0abee00e3d9ca8450600da3d8910177502f16c547d91c78efe9ee729e135 not 
found: ID does not exist" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.732560 4766 scope.go:117] "RemoveContainer" containerID="d56a833cb57739632b9f84c95b0b320e869c7253cf0bc96d26ffd8fa4d8647f9" Oct 02 10:56:47 crc kubenswrapper[4766]: E1002 10:56:47.732854 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d56a833cb57739632b9f84c95b0b320e869c7253cf0bc96d26ffd8fa4d8647f9\": container with ID starting with d56a833cb57739632b9f84c95b0b320e869c7253cf0bc96d26ffd8fa4d8647f9 not found: ID does not exist" containerID="d56a833cb57739632b9f84c95b0b320e869c7253cf0bc96d26ffd8fa4d8647f9" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.732878 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56a833cb57739632b9f84c95b0b320e869c7253cf0bc96d26ffd8fa4d8647f9"} err="failed to get container status \"d56a833cb57739632b9f84c95b0b320e869c7253cf0bc96d26ffd8fa4d8647f9\": rpc error: code = NotFound desc = could not find container \"d56a833cb57739632b9f84c95b0b320e869c7253cf0bc96d26ffd8fa4d8647f9\": container with ID starting with d56a833cb57739632b9f84c95b0b320e869c7253cf0bc96d26ffd8fa4d8647f9 not found: ID does not exist" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.732896 4766 scope.go:117] "RemoveContainer" containerID="fd592f4eaa72c7cc60d59e29049bd321e3843dbfd1355aa26fb9e593fcd2862a" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.747936 4766 scope.go:117] "RemoveContainer" containerID="fd592f4eaa72c7cc60d59e29049bd321e3843dbfd1355aa26fb9e593fcd2862a" Oct 02 10:56:47 crc kubenswrapper[4766]: E1002 10:56:47.748418 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd592f4eaa72c7cc60d59e29049bd321e3843dbfd1355aa26fb9e593fcd2862a\": container with ID starting with fd592f4eaa72c7cc60d59e29049bd321e3843dbfd1355aa26fb9e593fcd2862a not found: ID does not exist" containerID="fd592f4eaa72c7cc60d59e29049bd321e3843dbfd1355aa26fb9e593fcd2862a" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.748464 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd592f4eaa72c7cc60d59e29049bd321e3843dbfd1355aa26fb9e593fcd2862a"} err="failed to get container status \"fd592f4eaa72c7cc60d59e29049bd321e3843dbfd1355aa26fb9e593fcd2862a\": rpc error: code = NotFound desc = could not find container \"fd592f4eaa72c7cc60d59e29049bd321e3843dbfd1355aa26fb9e593fcd2862a\": container with ID starting with fd592f4eaa72c7cc60d59e29049bd321e3843dbfd1355aa26fb9e593fcd2862a not found: ID does not exist" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.748486 4766 scope.go:117] "RemoveContainer" containerID="9ccb90a5e883e26604eab9b292fe43ba1b9b0c8e93748b0179c798cf35c8b7e0" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.765526 4766 scope.go:117] "RemoveContainer" containerID="3d2044096080bb4a32fd798df403dd5075aa5a145e243d00cd2ab9e540191d24" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.783902 4766 scope.go:117] "RemoveContainer" containerID="439e8cff8c1b745a07a992c762939d486cbf7f8d70fb2e55cb9de9b1c50b904d" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.798851 4766 scope.go:117] "RemoveContainer" containerID="9ccb90a5e883e26604eab9b292fe43ba1b9b0c8e93748b0179c798cf35c8b7e0" Oct 02 10:56:47 crc kubenswrapper[4766]: E1002 10:56:47.799347 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"9ccb90a5e883e26604eab9b292fe43ba1b9b0c8e93748b0179c798cf35c8b7e0\": container with ID starting with 9ccb90a5e883e26604eab9b292fe43ba1b9b0c8e93748b0179c798cf35c8b7e0 not found: ID does not exist" containerID="9ccb90a5e883e26604eab9b292fe43ba1b9b0c8e93748b0179c798cf35c8b7e0" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.799385 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ccb90a5e883e26604eab9b292fe43ba1b9b0c8e93748b0179c798cf35c8b7e0"} err="failed to get container status \"9ccb90a5e883e26604eab9b292fe43ba1b9b0c8e93748b0179c798cf35c8b7e0\": rpc error: code = NotFound desc = could not find container \"9ccb90a5e883e26604eab9b292fe43ba1b9b0c8e93748b0179c798cf35c8b7e0\": container with ID starting with 9ccb90a5e883e26604eab9b292fe43ba1b9b0c8e93748b0179c798cf35c8b7e0 not found: ID does not exist" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.799435 4766 scope.go:117] "RemoveContainer" containerID="3d2044096080bb4a32fd798df403dd5075aa5a145e243d00cd2ab9e540191d24" Oct 02 10:56:47 crc kubenswrapper[4766]: E1002 10:56:47.800244 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d2044096080bb4a32fd798df403dd5075aa5a145e243d00cd2ab9e540191d24\": container with ID starting with 3d2044096080bb4a32fd798df403dd5075aa5a145e243d00cd2ab9e540191d24 not found: ID does not exist" containerID="3d2044096080bb4a32fd798df403dd5075aa5a145e243d00cd2ab9e540191d24" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.800304 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d2044096080bb4a32fd798df403dd5075aa5a145e243d00cd2ab9e540191d24"} err="failed to get container status \"3d2044096080bb4a32fd798df403dd5075aa5a145e243d00cd2ab9e540191d24\": rpc error: code = NotFound desc = could not find container \"3d2044096080bb4a32fd798df403dd5075aa5a145e243d00cd2ab9e540191d24\": container with ID starting with 3d2044096080bb4a32fd798df403dd5075aa5a145e243d00cd2ab9e540191d24 not found: ID does not exist" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.800344 4766 scope.go:117] "RemoveContainer" containerID="439e8cff8c1b745a07a992c762939d486cbf7f8d70fb2e55cb9de9b1c50b904d" Oct 02 10:56:47 crc kubenswrapper[4766]: E1002 10:56:47.800719 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"439e8cff8c1b745a07a992c762939d486cbf7f8d70fb2e55cb9de9b1c50b904d\": container with ID starting with 439e8cff8c1b745a07a992c762939d486cbf7f8d70fb2e55cb9de9b1c50b904d not found: ID does not exist" containerID="439e8cff8c1b745a07a992c762939d486cbf7f8d70fb2e55cb9de9b1c50b904d" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.800766 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"439e8cff8c1b745a07a992c762939d486cbf7f8d70fb2e55cb9de9b1c50b904d"} err="failed to get container status \"439e8cff8c1b745a07a992c762939d486cbf7f8d70fb2e55cb9de9b1c50b904d\": rpc error: code = NotFound desc = could not find container \"439e8cff8c1b745a07a992c762939d486cbf7f8d70fb2e55cb9de9b1c50b904d\": container with ID starting with 439e8cff8c1b745a07a992c762939d486cbf7f8d70fb2e55cb9de9b1c50b904d not found: ID does not exist" Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.800800 4766 scope.go:117] "RemoveContainer" containerID="e064e7f4deae82a6364cce579ee77fcc5fec7af8a8af134197876d757637f9e2" 
Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.836546 4766 scope.go:117] "RemoveContainer" containerID="bba5c2714ea0c09d4f94df2927e72d04c7b825f87186ce2896be3c54af73a983"
Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.850559 4766 scope.go:117] "RemoveContainer" containerID="3f5b69a4b7de54aa4a8ace06d81018fa1e2eceb1155b3a97a7194dc4a9e583c1"
Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.863243 4766 scope.go:117] "RemoveContainer" containerID="e064e7f4deae82a6364cce579ee77fcc5fec7af8a8af134197876d757637f9e2"
Oct 02 10:56:47 crc kubenswrapper[4766]: E1002 10:56:47.863787 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e064e7f4deae82a6364cce579ee77fcc5fec7af8a8af134197876d757637f9e2\": container with ID starting with e064e7f4deae82a6364cce579ee77fcc5fec7af8a8af134197876d757637f9e2 not found: ID does not exist" containerID="e064e7f4deae82a6364cce579ee77fcc5fec7af8a8af134197876d757637f9e2"
Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.863837 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e064e7f4deae82a6364cce579ee77fcc5fec7af8a8af134197876d757637f9e2"} err="failed to get container status \"e064e7f4deae82a6364cce579ee77fcc5fec7af8a8af134197876d757637f9e2\": rpc error: code = NotFound desc = could not find container \"e064e7f4deae82a6364cce579ee77fcc5fec7af8a8af134197876d757637f9e2\": container with ID starting with e064e7f4deae82a6364cce579ee77fcc5fec7af8a8af134197876d757637f9e2 not found: ID does not exist"
Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.863871 4766 scope.go:117] "RemoveContainer" containerID="bba5c2714ea0c09d4f94df2927e72d04c7b825f87186ce2896be3c54af73a983"
Oct 02 10:56:47 crc kubenswrapper[4766]: E1002 10:56:47.864242 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba5c2714ea0c09d4f94df2927e72d04c7b825f87186ce2896be3c54af73a983\": container with ID starting with bba5c2714ea0c09d4f94df2927e72d04c7b825f87186ce2896be3c54af73a983 not found: ID does not exist" containerID="bba5c2714ea0c09d4f94df2927e72d04c7b825f87186ce2896be3c54af73a983"
Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.864277 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba5c2714ea0c09d4f94df2927e72d04c7b825f87186ce2896be3c54af73a983"} err="failed to get container status \"bba5c2714ea0c09d4f94df2927e72d04c7b825f87186ce2896be3c54af73a983\": rpc error: code = NotFound desc = could not find container \"bba5c2714ea0c09d4f94df2927e72d04c7b825f87186ce2896be3c54af73a983\": container with ID starting with bba5c2714ea0c09d4f94df2927e72d04c7b825f87186ce2896be3c54af73a983 not found: ID does not exist"
Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.864302 4766 scope.go:117] "RemoveContainer" containerID="3f5b69a4b7de54aa4a8ace06d81018fa1e2eceb1155b3a97a7194dc4a9e583c1"
Oct 02 10:56:47 crc kubenswrapper[4766]: E1002 10:56:47.864961 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f5b69a4b7de54aa4a8ace06d81018fa1e2eceb1155b3a97a7194dc4a9e583c1\": container with ID starting with 3f5b69a4b7de54aa4a8ace06d81018fa1e2eceb1155b3a97a7194dc4a9e583c1 not found: ID does not exist" containerID="3f5b69a4b7de54aa4a8ace06d81018fa1e2eceb1155b3a97a7194dc4a9e583c1"
Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.864988 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f5b69a4b7de54aa4a8ace06d81018fa1e2eceb1155b3a97a7194dc4a9e583c1"} err="failed to get container status \"3f5b69a4b7de54aa4a8ace06d81018fa1e2eceb1155b3a97a7194dc4a9e583c1\": rpc error: code = NotFound desc = could not find container \"3f5b69a4b7de54aa4a8ace06d81018fa1e2eceb1155b3a97a7194dc4a9e583c1\": container with ID starting with 3f5b69a4b7de54aa4a8ace06d81018fa1e2eceb1155b3a97a7194dc4a9e583c1 not found: ID does not exist"
Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.887898 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e1a892b-5e21-4f62-9859-d4e4a8ef9623" path="/var/lib/kubelet/pods/8e1a892b-5e21-4f62-9859-d4e4a8ef9623/volumes"
Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.888716 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a491a4e-eefa-4908-8e7b-1d5c3e67274c" path="/var/lib/kubelet/pods/9a491a4e-eefa-4908-8e7b-1d5c3e67274c/volumes"
Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.889329 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a57c2a77-db59-4b73-b376-640de2af9a7e" path="/var/lib/kubelet/pods/a57c2a77-db59-4b73-b376-640de2af9a7e/volumes"
Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.890243 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb5d66be-21fe-4237-b616-a8c4f41f5f14" path="/var/lib/kubelet/pods/bb5d66be-21fe-4237-b616-a8c4f41f5f14/volumes"
Oct 02 10:56:47 crc kubenswrapper[4766]: I1002 10:56:47.890856 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d00b5247-9e12-4202-ae31-20d454dfa183" path="/var/lib/kubelet/pods/d00b5247-9e12-4202-ae31-20d454dfa183/volumes"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.595793 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-7wx9g" event={"ID":"9677731c-12a8-4fa5-b5c1-ba1238a7f315","Type":"ContainerStarted","Data":"4f92d03d0f76af2bb96ab3ad8a556f80f30e4418edd837e224dc8c7447e09101"}
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.597005 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-7wx9g"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.601890 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-7wx9g"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.621363 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-7wx9g" podStartSLOduration=2.6213462610000002 podStartE2EDuration="2.621346261s" podCreationTimestamp="2025-10-02 10:56:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:56:48.614772627 +0000 UTC m=+323.557643571" watchObservedRunningTime="2025-10-02 10:56:48.621346261 +0000 UTC m=+323.564217205"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.832403 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4zwjb"]
Oct 02 10:56:48 crc kubenswrapper[4766]: E1002 10:56:48.832648 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00b5247-9e12-4202-ae31-20d454dfa183" containerName="extract-utilities"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.832665 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00b5247-9e12-4202-ae31-20d454dfa183" containerName="extract-utilities"
Oct 02 10:56:48 crc kubenswrapper[4766]: E1002 10:56:48.832674 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb5d66be-21fe-4237-b616-a8c4f41f5f14" containerName="extract-utilities"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.832682 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb5d66be-21fe-4237-b616-a8c4f41f5f14" containerName="extract-utilities"
Oct 02 10:56:48 crc kubenswrapper[4766]: E1002 10:56:48.832717 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1a892b-5e21-4f62-9859-d4e4a8ef9623" containerName="registry-server"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.832725 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1a892b-5e21-4f62-9859-d4e4a8ef9623" containerName="registry-server"
Oct 02 10:56:48 crc kubenswrapper[4766]: E1002 10:56:48.832733 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb5d66be-21fe-4237-b616-a8c4f41f5f14" containerName="extract-content"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.832740 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb5d66be-21fe-4237-b616-a8c4f41f5f14" containerName="extract-content"
Oct 02 10:56:48 crc kubenswrapper[4766]: E1002 10:56:48.832751 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a491a4e-eefa-4908-8e7b-1d5c3e67274c" containerName="extract-utilities"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.832758 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a491a4e-eefa-4908-8e7b-1d5c3e67274c" containerName="extract-utilities"
Oct 02 10:56:48 crc kubenswrapper[4766]: E1002 10:56:48.832770 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a491a4e-eefa-4908-8e7b-1d5c3e67274c" containerName="registry-server"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.832778 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a491a4e-eefa-4908-8e7b-1d5c3e67274c" containerName="registry-server"
Oct 02 10:56:48 crc kubenswrapper[4766]: E1002 10:56:48.832790 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb5d66be-21fe-4237-b616-a8c4f41f5f14" containerName="registry-server"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.832796 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb5d66be-21fe-4237-b616-a8c4f41f5f14" containerName="registry-server"
Oct 02 10:56:48 crc kubenswrapper[4766]: E1002 10:56:48.832807 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1a892b-5e21-4f62-9859-d4e4a8ef9623" containerName="extract-utilities"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.832813 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1a892b-5e21-4f62-9859-d4e4a8ef9623" containerName="extract-utilities"
Oct 02 10:56:48 crc kubenswrapper[4766]: E1002 10:56:48.832822 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a491a4e-eefa-4908-8e7b-1d5c3e67274c" containerName="extract-content"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.832831 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a491a4e-eefa-4908-8e7b-1d5c3e67274c" containerName="extract-content"
Oct 02 10:56:48 crc kubenswrapper[4766]: E1002 10:56:48.832842 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00b5247-9e12-4202-ae31-20d454dfa183" containerName="registry-server"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.832850 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00b5247-9e12-4202-ae31-20d454dfa183" containerName="registry-server"
Oct 02 10:56:48 crc kubenswrapper[4766]: E1002 10:56:48.832879 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1a892b-5e21-4f62-9859-d4e4a8ef9623" containerName="extract-content"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.832886 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1a892b-5e21-4f62-9859-d4e4a8ef9623" containerName="extract-content"
Oct 02 10:56:48 crc kubenswrapper[4766]: E1002 10:56:48.832895 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00b5247-9e12-4202-ae31-20d454dfa183" containerName="extract-content"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.832901 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00b5247-9e12-4202-ae31-20d454dfa183" containerName="extract-content"
Oct 02 10:56:48 crc kubenswrapper[4766]: E1002 10:56:48.832908 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a57c2a77-db59-4b73-b376-640de2af9a7e" containerName="marketplace-operator"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.832914 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a57c2a77-db59-4b73-b376-640de2af9a7e" containerName="marketplace-operator"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.832995 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a491a4e-eefa-4908-8e7b-1d5c3e67274c" containerName="registry-server"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.833005 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb5d66be-21fe-4237-b616-a8c4f41f5f14" containerName="registry-server"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.833011 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e1a892b-5e21-4f62-9859-d4e4a8ef9623" containerName="registry-server"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.833024 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d00b5247-9e12-4202-ae31-20d454dfa183" containerName="registry-server"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.833031 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a57c2a77-db59-4b73-b376-640de2af9a7e" containerName="marketplace-operator"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.833681 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4zwjb"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.835787 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.843677 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4zwjb"]
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.988899 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db54d\" (UniqueName: \"kubernetes.io/projected/9233c36a-a15b-4668-9da2-d7e2a778fa2e-kube-api-access-db54d\") pod \"community-operators-4zwjb\" (UID: \"9233c36a-a15b-4668-9da2-d7e2a778fa2e\") " pod="openshift-marketplace/community-operators-4zwjb"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.989306 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9233c36a-a15b-4668-9da2-d7e2a778fa2e-utilities\") pod \"community-operators-4zwjb\" (UID: \"9233c36a-a15b-4668-9da2-d7e2a778fa2e\") " pod="openshift-marketplace/community-operators-4zwjb"
Oct 02 10:56:48 crc kubenswrapper[4766]: I1002 10:56:48.989379 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9233c36a-a15b-4668-9da2-d7e2a778fa2e-catalog-content\") pod \"community-operators-4zwjb\" (UID: \"9233c36a-a15b-4668-9da2-d7e2a778fa2e\") " pod="openshift-marketplace/community-operators-4zwjb"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.036030 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p827n"]
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.038172 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p827n"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.040075 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.047162 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p827n"]
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.091115 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9233c36a-a15b-4668-9da2-d7e2a778fa2e-utilities\") pod \"community-operators-4zwjb\" (UID: \"9233c36a-a15b-4668-9da2-d7e2a778fa2e\") " pod="openshift-marketplace/community-operators-4zwjb"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.091167 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9233c36a-a15b-4668-9da2-d7e2a778fa2e-catalog-content\") pod \"community-operators-4zwjb\" (UID: \"9233c36a-a15b-4668-9da2-d7e2a778fa2e\") " pod="openshift-marketplace/community-operators-4zwjb"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.091225 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db54d\" (UniqueName: \"kubernetes.io/projected/9233c36a-a15b-4668-9da2-d7e2a778fa2e-kube-api-access-db54d\") pod \"community-operators-4zwjb\" (UID: \"9233c36a-a15b-4668-9da2-d7e2a778fa2e\") " pod="openshift-marketplace/community-operators-4zwjb"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.091776 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9233c36a-a15b-4668-9da2-d7e2a778fa2e-catalog-content\") pod \"community-operators-4zwjb\" (UID: \"9233c36a-a15b-4668-9da2-d7e2a778fa2e\") " pod="openshift-marketplace/community-operators-4zwjb"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.092319 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9233c36a-a15b-4668-9da2-d7e2a778fa2e-utilities\") pod \"community-operators-4zwjb\" (UID: \"9233c36a-a15b-4668-9da2-d7e2a778fa2e\") " pod="openshift-marketplace/community-operators-4zwjb"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.111069 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db54d\" (UniqueName: \"kubernetes.io/projected/9233c36a-a15b-4668-9da2-d7e2a778fa2e-kube-api-access-db54d\") pod \"community-operators-4zwjb\" (UID: \"9233c36a-a15b-4668-9da2-d7e2a778fa2e\") " pod="openshift-marketplace/community-operators-4zwjb"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.154289 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4zwjb"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.192267 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/697f9f5a-2f67-4b88-8fab-f29d029c1643-catalog-content\") pod \"redhat-marketplace-p827n\" (UID: \"697f9f5a-2f67-4b88-8fab-f29d029c1643\") " pod="openshift-marketplace/redhat-marketplace-p827n"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.192344 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/697f9f5a-2f67-4b88-8fab-f29d029c1643-utilities\") pod \"redhat-marketplace-p827n\" (UID: \"697f9f5a-2f67-4b88-8fab-f29d029c1643\") " pod="openshift-marketplace/redhat-marketplace-p827n"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.192414 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdzhw\" (UniqueName: \"kubernetes.io/projected/697f9f5a-2f67-4b88-8fab-f29d029c1643-kube-api-access-wdzhw\") pod \"redhat-marketplace-p827n\" (UID: \"697f9f5a-2f67-4b88-8fab-f29d029c1643\") " pod="openshift-marketplace/redhat-marketplace-p827n"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.293778 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdzhw\" (UniqueName: \"kubernetes.io/projected/697f9f5a-2f67-4b88-8fab-f29d029c1643-kube-api-access-wdzhw\") pod \"redhat-marketplace-p827n\" (UID: \"697f9f5a-2f67-4b88-8fab-f29d029c1643\") " pod="openshift-marketplace/redhat-marketplace-p827n"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.295378 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/697f9f5a-2f67-4b88-8fab-f29d029c1643-catalog-content\") pod \"redhat-marketplace-p827n\" (UID: \"697f9f5a-2f67-4b88-8fab-f29d029c1643\") " pod="openshift-marketplace/redhat-marketplace-p827n"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.295430 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/697f9f5a-2f67-4b88-8fab-f29d029c1643-utilities\") pod \"redhat-marketplace-p827n\" (UID: \"697f9f5a-2f67-4b88-8fab-f29d029c1643\") " pod="openshift-marketplace/redhat-marketplace-p827n"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.299904 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/697f9f5a-2f67-4b88-8fab-f29d029c1643-catalog-content\") pod \"redhat-marketplace-p827n\" (UID: \"697f9f5a-2f67-4b88-8fab-f29d029c1643\") " pod="openshift-marketplace/redhat-marketplace-p827n"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.299990 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/697f9f5a-2f67-4b88-8fab-f29d029c1643-utilities\") pod \"redhat-marketplace-p827n\" (UID: \"697f9f5a-2f67-4b88-8fab-f29d029c1643\") " pod="openshift-marketplace/redhat-marketplace-p827n"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.314181 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdzhw\" (UniqueName: \"kubernetes.io/projected/697f9f5a-2f67-4b88-8fab-f29d029c1643-kube-api-access-wdzhw\") pod \"redhat-marketplace-p827n\" (UID: \"697f9f5a-2f67-4b88-8fab-f29d029c1643\") " pod="openshift-marketplace/redhat-marketplace-p827n"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.346099 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4zwjb"]
Oct 02 10:56:49 crc kubenswrapper[4766]: W1002 10:56:49.353055 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9233c36a_a15b_4668_9da2_d7e2a778fa2e.slice/crio-0e550737dbe884ca9f4e249b39f63a0f12ba69e76a2d8dbd2db1ff45abdc6e4e WatchSource:0}: Error finding container 0e550737dbe884ca9f4e249b39f63a0f12ba69e76a2d8dbd2db1ff45abdc6e4e: Status 404 returned error can't find the container with id 0e550737dbe884ca9f4e249b39f63a0f12ba69e76a2d8dbd2db1ff45abdc6e4e
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.363649 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p827n"
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.607596 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zwjb" event={"ID":"9233c36a-a15b-4668-9da2-d7e2a778fa2e","Type":"ContainerStarted","Data":"0e550737dbe884ca9f4e249b39f63a0f12ba69e76a2d8dbd2db1ff45abdc6e4e"}
Oct 02 10:56:49 crc kubenswrapper[4766]: I1002 10:56:49.744304 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p827n"]
Oct 02 10:56:49 crc kubenswrapper[4766]: W1002 10:56:49.753168 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod697f9f5a_2f67_4b88_8fab_f29d029c1643.slice/crio-8785ec9af955239852f53faa07339e07edec0009950c34c72e73823425442376 WatchSource:0}: Error finding container 8785ec9af955239852f53faa07339e07edec0009950c34c72e73823425442376: Status 404 returned error can't find the container with id 8785ec9af955239852f53faa07339e07edec0009950c34c72e73823425442376
Oct 02 10:56:50 crc kubenswrapper[4766]: I1002 10:56:50.616141 4766 generic.go:334] "Generic (PLEG): container finished" podID="697f9f5a-2f67-4b88-8fab-f29d029c1643" containerID="e843acaa4a217f89d5f884184267de5f80d83cfef46f30192b53a81e80f94a0c" exitCode=0
Oct 02 10:56:50 crc kubenswrapper[4766]: I1002 10:56:50.616231 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p827n" event={"ID":"697f9f5a-2f67-4b88-8fab-f29d029c1643","Type":"ContainerDied","Data":"e843acaa4a217f89d5f884184267de5f80d83cfef46f30192b53a81e80f94a0c"}
Oct 02 10:56:50 crc kubenswrapper[4766]: I1002 10:56:50.616943 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p827n" event={"ID":"697f9f5a-2f67-4b88-8fab-f29d029c1643","Type":"ContainerStarted","Data":"8785ec9af955239852f53faa07339e07edec0009950c34c72e73823425442376"}
Oct 02 10:56:50 crc kubenswrapper[4766]: I1002 10:56:50.619477 4766 generic.go:334] "Generic (PLEG): container finished" podID="9233c36a-a15b-4668-9da2-d7e2a778fa2e" containerID="e7653f2c61400813f8688021146123f67fbc8f451a4fe1dc29ce6477a6698d9e" exitCode=0
Oct 02 10:56:50 crc kubenswrapper[4766]: I1002 10:56:50.620630 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zwjb" event={"ID":"9233c36a-a15b-4668-9da2-d7e2a778fa2e","Type":"ContainerDied","Data":"e7653f2c61400813f8688021146123f67fbc8f451a4fe1dc29ce6477a6698d9e"}
Oct 02
10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.233415 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q8cpr"] Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.234565 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q8cpr" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.238413 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.239657 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q8cpr"] Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.317859 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c98005ea-50d3-4b26-9049-2beb07771f21-catalog-content\") pod \"redhat-operators-q8cpr\" (UID: \"c98005ea-50d3-4b26-9049-2beb07771f21\") " pod="openshift-marketplace/redhat-operators-q8cpr" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.318279 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26fjt\" (UniqueName: \"kubernetes.io/projected/c98005ea-50d3-4b26-9049-2beb07771f21-kube-api-access-26fjt\") pod \"redhat-operators-q8cpr\" (UID: \"c98005ea-50d3-4b26-9049-2beb07771f21\") " pod="openshift-marketplace/redhat-operators-q8cpr" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.318311 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c98005ea-50d3-4b26-9049-2beb07771f21-utilities\") pod \"redhat-operators-q8cpr\" (UID: \"c98005ea-50d3-4b26-9049-2beb07771f21\") " pod="openshift-marketplace/redhat-operators-q8cpr" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.419395 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26fjt\" (UniqueName: \"kubernetes.io/projected/c98005ea-50d3-4b26-9049-2beb07771f21-kube-api-access-26fjt\") pod \"redhat-operators-q8cpr\" (UID: \"c98005ea-50d3-4b26-9049-2beb07771f21\") " pod="openshift-marketplace/redhat-operators-q8cpr" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.419442 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c98005ea-50d3-4b26-9049-2beb07771f21-utilities\") pod \"redhat-operators-q8cpr\" (UID: \"c98005ea-50d3-4b26-9049-2beb07771f21\") " pod="openshift-marketplace/redhat-operators-q8cpr" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.419559 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c98005ea-50d3-4b26-9049-2beb07771f21-catalog-content\") pod \"redhat-operators-q8cpr\" (UID: \"c98005ea-50d3-4b26-9049-2beb07771f21\") " pod="openshift-marketplace/redhat-operators-q8cpr" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.420144 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c98005ea-50d3-4b26-9049-2beb07771f21-catalog-content\") pod \"redhat-operators-q8cpr\" (UID: \"c98005ea-50d3-4b26-9049-2beb07771f21\") " pod="openshift-marketplace/redhat-operators-q8cpr" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 
10:56:51.420228 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c98005ea-50d3-4b26-9049-2beb07771f21-utilities\") pod \"redhat-operators-q8cpr\" (UID: \"c98005ea-50d3-4b26-9049-2beb07771f21\") " pod="openshift-marketplace/redhat-operators-q8cpr" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.439596 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rlmp7"] Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.440934 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.443189 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.447192 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26fjt\" (UniqueName: \"kubernetes.io/projected/c98005ea-50d3-4b26-9049-2beb07771f21-kube-api-access-26fjt\") pod \"redhat-operators-q8cpr\" (UID: \"c98005ea-50d3-4b26-9049-2beb07771f21\") " pod="openshift-marketplace/redhat-operators-q8cpr" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.461941 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rlmp7"] Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.520142 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aebef18-f17d-487d-a23e-472000a73d87-utilities\") pod \"certified-operators-rlmp7\" (UID: \"9aebef18-f17d-487d-a23e-472000a73d87\") " pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.520215 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aebef18-f17d-487d-a23e-472000a73d87-catalog-content\") pod \"certified-operators-rlmp7\" (UID: \"9aebef18-f17d-487d-a23e-472000a73d87\") " pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.520247 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72qxr\" (UniqueName: \"kubernetes.io/projected/9aebef18-f17d-487d-a23e-472000a73d87-kube-api-access-72qxr\") pod \"certified-operators-rlmp7\" (UID: \"9aebef18-f17d-487d-a23e-472000a73d87\") " pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.556851 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q8cpr" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.621559 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aebef18-f17d-487d-a23e-472000a73d87-catalog-content\") pod \"certified-operators-rlmp7\" (UID: \"9aebef18-f17d-487d-a23e-472000a73d87\") " pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.622036 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72qxr\" (UniqueName: \"kubernetes.io/projected/9aebef18-f17d-487d-a23e-472000a73d87-kube-api-access-72qxr\") pod \"certified-operators-rlmp7\" (UID: \"9aebef18-f17d-487d-a23e-472000a73d87\") " pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.622074 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aebef18-f17d-487d-a23e-472000a73d87-catalog-content\") pod \"certified-operators-rlmp7\" (UID: \"9aebef18-f17d-487d-a23e-472000a73d87\") " pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.622142 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aebef18-f17d-487d-a23e-472000a73d87-utilities\") pod \"certified-operators-rlmp7\" (UID: \"9aebef18-f17d-487d-a23e-472000a73d87\") " pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.622386 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aebef18-f17d-487d-a23e-472000a73d87-utilities\") pod \"certified-operators-rlmp7\" (UID: \"9aebef18-f17d-487d-a23e-472000a73d87\") " pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.650921 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72qxr\" (UniqueName: \"kubernetes.io/projected/9aebef18-f17d-487d-a23e-472000a73d87-kube-api-access-72qxr\") pod \"certified-operators-rlmp7\" (UID: \"9aebef18-f17d-487d-a23e-472000a73d87\") " pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.790842 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 10:56:51 crc kubenswrapper[4766]: I1002 10:56:51.970302 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q8cpr"] Oct 02 10:56:52 crc kubenswrapper[4766]: I1002 10:56:52.009621 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rlmp7"] Oct 02 10:56:52 crc kubenswrapper[4766]: I1002 10:56:52.630768 4766 generic.go:334] "Generic (PLEG): container finished" podID="9233c36a-a15b-4668-9da2-d7e2a778fa2e" containerID="a22ee22a44df00ec5efef59c0350d7312693c814b2bbc1fa16ff2e542d0951ca" exitCode=0 Oct 02 10:56:52 crc kubenswrapper[4766]: I1002 10:56:52.630877 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zwjb" event={"ID":"9233c36a-a15b-4668-9da2-d7e2a778fa2e","Type":"ContainerDied","Data":"a22ee22a44df00ec5efef59c0350d7312693c814b2bbc1fa16ff2e542d0951ca"} Oct 02 10:56:52 crc kubenswrapper[4766]: I1002 10:56:52.636227 4766 generic.go:334] "Generic (PLEG): container finished" podID="c98005ea-50d3-4b26-9049-2beb07771f21" containerID="d154871ea5374c561719519af23288fec3238750b9ca277442e91446e8220f53" exitCode=0 Oct 02 10:56:52 crc kubenswrapper[4766]: I1002 10:56:52.636305 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8cpr" event={"ID":"c98005ea-50d3-4b26-9049-2beb07771f21","Type":"ContainerDied","Data":"d154871ea5374c561719519af23288fec3238750b9ca277442e91446e8220f53"} Oct 02 10:56:52 crc kubenswrapper[4766]: I1002 10:56:52.636328 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8cpr" event={"ID":"c98005ea-50d3-4b26-9049-2beb07771f21","Type":"ContainerStarted","Data":"832a5d9ee0095da19e52a4fdc13cdc2f61a272a5de6d017ea293095f45466f7f"} Oct 02 10:56:52 crc kubenswrapper[4766]: I1002 10:56:52.641412 4766 generic.go:334] "Generic (PLEG): container finished" podID="697f9f5a-2f67-4b88-8fab-f29d029c1643" containerID="4a472d7ec2cff708ba1e18c1178eba44195a7c8df532227be102f2d79b311118" exitCode=0 Oct 02 10:56:52 crc kubenswrapper[4766]: I1002 10:56:52.641624 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p827n" event={"ID":"697f9f5a-2f67-4b88-8fab-f29d029c1643","Type":"ContainerDied","Data":"4a472d7ec2cff708ba1e18c1178eba44195a7c8df532227be102f2d79b311118"} Oct 02 10:56:52 crc kubenswrapper[4766]: I1002 10:56:52.645641 4766 generic.go:334] "Generic (PLEG): container finished" podID="9aebef18-f17d-487d-a23e-472000a73d87" containerID="ded4a0eaa40a7b949183881f080562b8b4673f8749cb8263cf6514b9bdbfc96a" exitCode=0 Oct 02 10:56:52 crc kubenswrapper[4766]: I1002 10:56:52.645681 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlmp7" event={"ID":"9aebef18-f17d-487d-a23e-472000a73d87","Type":"ContainerDied","Data":"ded4a0eaa40a7b949183881f080562b8b4673f8749cb8263cf6514b9bdbfc96a"} Oct 02 10:56:52 crc kubenswrapper[4766]: I1002 10:56:52.645703 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlmp7" event={"ID":"9aebef18-f17d-487d-a23e-472000a73d87","Type":"ContainerStarted","Data":"a22322b8ddea43b123f6226e24a3243c27789e8d5728ecfe8133a90ac1e8ff56"} Oct 02 10:56:53 crc kubenswrapper[4766]: I1002 10:56:53.654540 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p827n" 
event={"ID":"697f9f5a-2f67-4b88-8fab-f29d029c1643","Type":"ContainerStarted","Data":"20601acbf8e50cd4531b414c903e37be9e9e2c196f30e56908385f01095148f6"} Oct 02 10:56:53 crc kubenswrapper[4766]: I1002 10:56:53.656473 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlmp7" event={"ID":"9aebef18-f17d-487d-a23e-472000a73d87","Type":"ContainerStarted","Data":"4f2d9d5fc23de7af5795210d175f2c1577a2ad5fe12bdb33df34a7bae5ed2e87"} Oct 02 10:56:53 crc kubenswrapper[4766]: I1002 10:56:53.658131 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4zwjb" event={"ID":"9233c36a-a15b-4668-9da2-d7e2a778fa2e","Type":"ContainerStarted","Data":"2e6e249116d895ad78689af1289fdfa25f07c564ae01639f79b4f94ac2ba74a5"} Oct 02 10:56:53 crc kubenswrapper[4766]: I1002 10:56:53.678742 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p827n" podStartSLOduration=1.959883673 podStartE2EDuration="4.678724242s" podCreationTimestamp="2025-10-02 10:56:49 +0000 UTC" firstStartedPulling="2025-10-02 10:56:50.619917042 +0000 UTC m=+325.562787986" lastFinishedPulling="2025-10-02 10:56:53.338757611 +0000 UTC m=+328.281628555" observedRunningTime="2025-10-02 10:56:53.676843982 +0000 UTC m=+328.619714936" watchObservedRunningTime="2025-10-02 10:56:53.678724242 +0000 UTC m=+328.621595186" Oct 02 10:56:53 crc kubenswrapper[4766]: I1002 10:56:53.693058 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4zwjb" podStartSLOduration=3.300437763 podStartE2EDuration="5.693041918s" podCreationTimestamp="2025-10-02 10:56:48 +0000 UTC" firstStartedPulling="2025-10-02 10:56:50.622096642 +0000 UTC m=+325.564967626" lastFinishedPulling="2025-10-02 10:56:53.014700837 +0000 UTC m=+327.957571781" observedRunningTime="2025-10-02 10:56:53.692089369 +0000 UTC m=+328.634960323" watchObservedRunningTime="2025-10-02 10:56:53.693041918 +0000 UTC m=+328.635912852" Oct 02 10:56:54 crc kubenswrapper[4766]: I1002 10:56:54.665083 4766 generic.go:334] "Generic (PLEG): container finished" podID="9aebef18-f17d-487d-a23e-472000a73d87" containerID="4f2d9d5fc23de7af5795210d175f2c1577a2ad5fe12bdb33df34a7bae5ed2e87" exitCode=0 Oct 02 10:56:54 crc kubenswrapper[4766]: I1002 10:56:54.665188 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlmp7" event={"ID":"9aebef18-f17d-487d-a23e-472000a73d87","Type":"ContainerDied","Data":"4f2d9d5fc23de7af5795210d175f2c1577a2ad5fe12bdb33df34a7bae5ed2e87"} Oct 02 10:56:56 crc kubenswrapper[4766]: I1002 10:56:56.678733 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8cpr" event={"ID":"c98005ea-50d3-4b26-9049-2beb07771f21","Type":"ContainerStarted","Data":"7260419aa64dc42d35a16b34fbfac34067dee6a6e1f5ce9b99f0ce44bb31b2e8"} Oct 02 10:56:57 crc kubenswrapper[4766]: I1002 10:56:57.685865 4766 generic.go:334] "Generic (PLEG): container finished" podID="c98005ea-50d3-4b26-9049-2beb07771f21" containerID="7260419aa64dc42d35a16b34fbfac34067dee6a6e1f5ce9b99f0ce44bb31b2e8" exitCode=0 Oct 02 10:56:57 crc kubenswrapper[4766]: I1002 10:56:57.685943 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8cpr" event={"ID":"c98005ea-50d3-4b26-9049-2beb07771f21","Type":"ContainerDied","Data":"7260419aa64dc42d35a16b34fbfac34067dee6a6e1f5ce9b99f0ce44bb31b2e8"} Oct 02 10:56:59 crc 
kubenswrapper[4766]: I1002 10:56:59.154577 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4zwjb" Oct 02 10:56:59 crc kubenswrapper[4766]: I1002 10:56:59.154942 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4zwjb" Oct 02 10:56:59 crc kubenswrapper[4766]: I1002 10:56:59.197980 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4zwjb" Oct 02 10:56:59 crc kubenswrapper[4766]: I1002 10:56:59.364112 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p827n" Oct 02 10:56:59 crc kubenswrapper[4766]: I1002 10:56:59.364188 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p827n" Oct 02 10:56:59 crc kubenswrapper[4766]: I1002 10:56:59.420953 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p827n" Oct 02 10:56:59 crc kubenswrapper[4766]: I1002 10:56:59.737164 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4zwjb" Oct 02 10:56:59 crc kubenswrapper[4766]: I1002 10:56:59.757994 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p827n" Oct 02 10:57:02 crc kubenswrapper[4766]: I1002 10:57:02.715045 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlmp7" event={"ID":"9aebef18-f17d-487d-a23e-472000a73d87","Type":"ContainerStarted","Data":"2cccc499a7ffb4a4140490cb2d830bef4f3329fc6097dbe6b03cb7e4feaf5fab"} Oct 02 10:57:03 crc kubenswrapper[4766]: I1002 10:57:03.735887 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rlmp7" podStartSLOduration=3.401721567 podStartE2EDuration="12.735872626s" podCreationTimestamp="2025-10-02 10:56:51 +0000 UTC" firstStartedPulling="2025-10-02 10:56:52.64694439 +0000 UTC m=+327.589815334" lastFinishedPulling="2025-10-02 10:57:01.981095449 +0000 UTC m=+336.923966393" observedRunningTime="2025-10-02 10:57:03.734642697 +0000 UTC m=+338.677513641" watchObservedRunningTime="2025-10-02 10:57:03.735872626 +0000 UTC m=+338.678743570" Oct 02 10:57:10 crc kubenswrapper[4766]: I1002 10:57:10.754025 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q8cpr" event={"ID":"c98005ea-50d3-4b26-9049-2beb07771f21","Type":"ContainerStarted","Data":"076ec3e926bf018eb72ff1ccd08a5a33e23be70aaeaa5517e6256c727c417d08"} Oct 02 10:57:11 crc kubenswrapper[4766]: I1002 10:57:11.792812 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 10:57:11 crc kubenswrapper[4766]: I1002 10:57:11.792910 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 10:57:11 crc kubenswrapper[4766]: I1002 10:57:11.833025 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 10:57:12 crc kubenswrapper[4766]: I1002 10:57:12.779109 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q8cpr" 
podStartSLOduration=5.427890752 podStartE2EDuration="21.779089046s" podCreationTimestamp="2025-10-02 10:56:51 +0000 UTC" firstStartedPulling="2025-10-02 10:56:52.63909953 +0000 UTC m=+327.581970474" lastFinishedPulling="2025-10-02 10:57:08.990297764 +0000 UTC m=+343.933168768" observedRunningTime="2025-10-02 10:57:12.775592325 +0000 UTC m=+347.718463279" watchObservedRunningTime="2025-10-02 10:57:12.779089046 +0000 UTC m=+347.721959990" Oct 02 10:57:12 crc kubenswrapper[4766]: I1002 10:57:12.805314 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 10:57:21 crc kubenswrapper[4766]: I1002 10:57:21.557574 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q8cpr" Oct 02 10:57:21 crc kubenswrapper[4766]: I1002 10:57:21.558110 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q8cpr" Oct 02 10:57:21 crc kubenswrapper[4766]: I1002 10:57:21.598677 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q8cpr" Oct 02 10:57:21 crc kubenswrapper[4766]: I1002 10:57:21.838556 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q8cpr" Oct 02 10:57:24 crc kubenswrapper[4766]: I1002 10:57:24.432788 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 10:57:24 crc kubenswrapper[4766]: I1002 10:57:24.432857 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 10:57:54 crc kubenswrapper[4766]: I1002 10:57:54.432807 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 10:57:54 crc kubenswrapper[4766]: I1002 10:57:54.433438 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 10:58:24 crc kubenswrapper[4766]: I1002 10:58:24.432399 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 10:58:24 crc kubenswrapper[4766]: I1002 10:58:24.433056 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 10:58:24 crc kubenswrapper[4766]: I1002 10:58:24.433149 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 10:58:24 crc kubenswrapper[4766]: I1002 10:58:24.433776 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"769721c28661e517737abb6064d68fa4fa2746b13fc804a3158b4c035b7b61c8"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 10:58:24 crc kubenswrapper[4766]: I1002 10:58:24.433836 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://769721c28661e517737abb6064d68fa4fa2746b13fc804a3158b4c035b7b61c8" gracePeriod=600 Oct 02 10:58:25 crc kubenswrapper[4766]: I1002 10:58:25.170079 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="769721c28661e517737abb6064d68fa4fa2746b13fc804a3158b4c035b7b61c8" exitCode=0 Oct 02 10:58:25 crc kubenswrapper[4766]: I1002 10:58:25.170166 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"769721c28661e517737abb6064d68fa4fa2746b13fc804a3158b4c035b7b61c8"} Oct 02 10:58:25 crc kubenswrapper[4766]: I1002 10:58:25.170517 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"3166d2ef6aeab8513526481d091131174bb783f49cb0e1e12abde1175b4b3aeb"} Oct 02 10:58:25 crc kubenswrapper[4766]: I1002 10:58:25.170558 4766 scope.go:117] "RemoveContainer" containerID="5665bd7da28a1a574ee22d388fc3802c77fc7ca9bf1c6b51ae73146d530b1b65" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.144776 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rkv52"] Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.146192 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.157980 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rkv52"] Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.327901 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51f2e19c-43c1-419e-80ed-6e49b17157a2-trusted-ca\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.328000 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51f2e19c-43c1-419e-80ed-6e49b17157a2-bound-sa-token\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.328039 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.328060 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51f2e19c-43c1-419e-80ed-6e49b17157a2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.328082 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51f2e19c-43c1-419e-80ed-6e49b17157a2-registry-tls\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.328104 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7t6q\" (UniqueName: \"kubernetes.io/projected/51f2e19c-43c1-419e-80ed-6e49b17157a2-kube-api-access-p7t6q\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.328127 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51f2e19c-43c1-419e-80ed-6e49b17157a2-registry-certificates\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.328203 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/51f2e19c-43c1-419e-80ed-6e49b17157a2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.351160 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.429113 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51f2e19c-43c1-419e-80ed-6e49b17157a2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.429181 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51f2e19c-43c1-419e-80ed-6e49b17157a2-trusted-ca\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.429203 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51f2e19c-43c1-419e-80ed-6e49b17157a2-bound-sa-token\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.429220 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51f2e19c-43c1-419e-80ed-6e49b17157a2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.429249 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51f2e19c-43c1-419e-80ed-6e49b17157a2-registry-tls\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.429277 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7t6q\" (UniqueName: \"kubernetes.io/projected/51f2e19c-43c1-419e-80ed-6e49b17157a2-kube-api-access-p7t6q\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.429302 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51f2e19c-43c1-419e-80ed-6e49b17157a2-registry-certificates\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.430599 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/51f2e19c-43c1-419e-80ed-6e49b17157a2-registry-certificates\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.430795 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/51f2e19c-43c1-419e-80ed-6e49b17157a2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.430822 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/51f2e19c-43c1-419e-80ed-6e49b17157a2-trusted-ca\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.435734 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/51f2e19c-43c1-419e-80ed-6e49b17157a2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.435782 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/51f2e19c-43c1-419e-80ed-6e49b17157a2-registry-tls\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.445642 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/51f2e19c-43c1-419e-80ed-6e49b17157a2-bound-sa-token\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.446060 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7t6q\" (UniqueName: \"kubernetes.io/projected/51f2e19c-43c1-419e-80ed-6e49b17157a2-kube-api-access-p7t6q\") pod \"image-registry-66df7c8f76-rkv52\" (UID: \"51f2e19c-43c1-419e-80ed-6e49b17157a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.464628 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:07 crc kubenswrapper[4766]: I1002 10:59:07.640143 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-rkv52"] Oct 02 10:59:08 crc kubenswrapper[4766]: I1002 10:59:08.458065 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" event={"ID":"51f2e19c-43c1-419e-80ed-6e49b17157a2","Type":"ContainerStarted","Data":"cb32248db1c792ac528d13c6e18f800b603e18e03f91c47131fbb3e46c047485"} Oct 02 10:59:08 crc kubenswrapper[4766]: I1002 10:59:08.458111 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" event={"ID":"51f2e19c-43c1-419e-80ed-6e49b17157a2","Type":"ContainerStarted","Data":"c1b61139b225892d814be576ac041d30ad316f8377a957ff9cde5150cf74d143"} Oct 02 10:59:09 crc kubenswrapper[4766]: I1002 10:59:09.465463 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:09 crc kubenswrapper[4766]: I1002 10:59:09.500258 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" podStartSLOduration=2.5002327859999998 podStartE2EDuration="2.500232786s" podCreationTimestamp="2025-10-02 10:59:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 10:59:09.493298414 +0000 UTC m=+464.436169388" watchObservedRunningTime="2025-10-02 10:59:09.500232786 +0000 UTC m=+464.443103770" Oct 02 10:59:27 crc kubenswrapper[4766]: I1002 10:59:27.470784 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-rkv52" Oct 02 10:59:27 crc kubenswrapper[4766]: I1002 10:59:27.553087 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-knkk5"] Oct 02 10:59:52 crc kubenswrapper[4766]: I1002 10:59:52.611737 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" podUID="347022cb-d24b-4f67-900e-c2b858cc49fc" containerName="registry" containerID="cri-o://172700260a37b1c421d758df5e909ea4af6599d94b9fea4c53fc2194ff417154" gracePeriod=30 Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.473297 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.611994 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/347022cb-d24b-4f67-900e-c2b858cc49fc-registry-certificates\") pod \"347022cb-d24b-4f67-900e-c2b858cc49fc\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.612058 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/347022cb-d24b-4f67-900e-c2b858cc49fc-installation-pull-secrets\") pod \"347022cb-d24b-4f67-900e-c2b858cc49fc\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.612142 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-registry-tls\") pod \"347022cb-d24b-4f67-900e-c2b858cc49fc\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.612320 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"347022cb-d24b-4f67-900e-c2b858cc49fc\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.612370 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz4lb\" (UniqueName: \"kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-kube-api-access-bz4lb\") pod \"347022cb-d24b-4f67-900e-c2b858cc49fc\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.612394 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/347022cb-d24b-4f67-900e-c2b858cc49fc-trusted-ca\") pod \"347022cb-d24b-4f67-900e-c2b858cc49fc\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.612425 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/347022cb-d24b-4f67-900e-c2b858cc49fc-ca-trust-extracted\") pod \"347022cb-d24b-4f67-900e-c2b858cc49fc\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.612483 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-bound-sa-token\") pod \"347022cb-d24b-4f67-900e-c2b858cc49fc\" (UID: \"347022cb-d24b-4f67-900e-c2b858cc49fc\") " Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.612973 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/347022cb-d24b-4f67-900e-c2b858cc49fc-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "347022cb-d24b-4f67-900e-c2b858cc49fc" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.613581 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/347022cb-d24b-4f67-900e-c2b858cc49fc-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "347022cb-d24b-4f67-900e-c2b858cc49fc" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.619740 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/347022cb-d24b-4f67-900e-c2b858cc49fc-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "347022cb-d24b-4f67-900e-c2b858cc49fc" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.620096 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "347022cb-d24b-4f67-900e-c2b858cc49fc" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.623047 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "347022cb-d24b-4f67-900e-c2b858cc49fc" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.624410 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-kube-api-access-bz4lb" (OuterVolumeSpecName: "kube-api-access-bz4lb") pod "347022cb-d24b-4f67-900e-c2b858cc49fc" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc"). InnerVolumeSpecName "kube-api-access-bz4lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.631259 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/347022cb-d24b-4f67-900e-c2b858cc49fc-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "347022cb-d24b-4f67-900e-c2b858cc49fc" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.632324 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "347022cb-d24b-4f67-900e-c2b858cc49fc" (UID: "347022cb-d24b-4f67-900e-c2b858cc49fc"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.713738 4766 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/347022cb-d24b-4f67-900e-c2b858cc49fc-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.713824 4766 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/347022cb-d24b-4f67-900e-c2b858cc49fc-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.713840 4766 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.713854 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz4lb\" (UniqueName: \"kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-kube-api-access-bz4lb\") on node \"crc\" DevicePath \"\"" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.713868 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/347022cb-d24b-4f67-900e-c2b858cc49fc-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.713879 4766 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/347022cb-d24b-4f67-900e-c2b858cc49fc-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.713890 4766 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/347022cb-d24b-4f67-900e-c2b858cc49fc-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.745436 4766 generic.go:334] "Generic (PLEG): container finished" podID="347022cb-d24b-4f67-900e-c2b858cc49fc" containerID="172700260a37b1c421d758df5e909ea4af6599d94b9fea4c53fc2194ff417154" exitCode=0 Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.745480 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" event={"ID":"347022cb-d24b-4f67-900e-c2b858cc49fc","Type":"ContainerDied","Data":"172700260a37b1c421d758df5e909ea4af6599d94b9fea4c53fc2194ff417154"} Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.745543 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" event={"ID":"347022cb-d24b-4f67-900e-c2b858cc49fc","Type":"ContainerDied","Data":"bf6be94adb900ecf84345c9e790c9c20168dac4044871e877ee7fe3fe3949cdb"} Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.745573 4766 scope.go:117] "RemoveContainer" containerID="172700260a37b1c421d758df5e909ea4af6599d94b9fea4c53fc2194ff417154" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.745710 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-knkk5" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.762807 4766 scope.go:117] "RemoveContainer" containerID="172700260a37b1c421d758df5e909ea4af6599d94b9fea4c53fc2194ff417154" Oct 02 10:59:53 crc kubenswrapper[4766]: E1002 10:59:53.766996 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"172700260a37b1c421d758df5e909ea4af6599d94b9fea4c53fc2194ff417154\": container with ID starting with 172700260a37b1c421d758df5e909ea4af6599d94b9fea4c53fc2194ff417154 not found: ID does not exist" containerID="172700260a37b1c421d758df5e909ea4af6599d94b9fea4c53fc2194ff417154" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.767039 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"172700260a37b1c421d758df5e909ea4af6599d94b9fea4c53fc2194ff417154"} err="failed to get container status \"172700260a37b1c421d758df5e909ea4af6599d94b9fea4c53fc2194ff417154\": rpc error: code = NotFound desc = could not find container \"172700260a37b1c421d758df5e909ea4af6599d94b9fea4c53fc2194ff417154\": container with ID starting with 172700260a37b1c421d758df5e909ea4af6599d94b9fea4c53fc2194ff417154 not found: ID does not exist" Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.772633 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-knkk5"] Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.778903 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-knkk5"] Oct 02 10:59:53 crc kubenswrapper[4766]: I1002 10:59:53.888056 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="347022cb-d24b-4f67-900e-c2b858cc49fc" path="/var/lib/kubelet/pods/347022cb-d24b-4f67-900e-c2b858cc49fc/volumes" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.124288 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l"] Oct 02 11:00:00 crc kubenswrapper[4766]: E1002 11:00:00.124607 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="347022cb-d24b-4f67-900e-c2b858cc49fc" containerName="registry" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.124624 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="347022cb-d24b-4f67-900e-c2b858cc49fc" containerName="registry" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.124755 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="347022cb-d24b-4f67-900e-c2b858cc49fc" containerName="registry" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.125186 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.126985 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.133525 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.134789 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l"] Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.291371 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/281a6f64-2f0a-4020-9b9d-af55767e345d-secret-volume\") pod \"collect-profiles-29323380-rvn2l\" (UID: \"281a6f64-2f0a-4020-9b9d-af55767e345d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.291462 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/281a6f64-2f0a-4020-9b9d-af55767e345d-config-volume\") pod \"collect-profiles-29323380-rvn2l\" (UID: \"281a6f64-2f0a-4020-9b9d-af55767e345d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.291581 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxgr6\" (UniqueName: \"kubernetes.io/projected/281a6f64-2f0a-4020-9b9d-af55767e345d-kube-api-access-lxgr6\") pod \"collect-profiles-29323380-rvn2l\" (UID: \"281a6f64-2f0a-4020-9b9d-af55767e345d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.392521 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxgr6\" (UniqueName: \"kubernetes.io/projected/281a6f64-2f0a-4020-9b9d-af55767e345d-kube-api-access-lxgr6\") pod \"collect-profiles-29323380-rvn2l\" (UID: \"281a6f64-2f0a-4020-9b9d-af55767e345d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.392818 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/281a6f64-2f0a-4020-9b9d-af55767e345d-secret-volume\") pod \"collect-profiles-29323380-rvn2l\" (UID: \"281a6f64-2f0a-4020-9b9d-af55767e345d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.392891 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/281a6f64-2f0a-4020-9b9d-af55767e345d-config-volume\") pod \"collect-profiles-29323380-rvn2l\" (UID: \"281a6f64-2f0a-4020-9b9d-af55767e345d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.393873 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/281a6f64-2f0a-4020-9b9d-af55767e345d-config-volume\") pod 
\"collect-profiles-29323380-rvn2l\" (UID: \"281a6f64-2f0a-4020-9b9d-af55767e345d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.399034 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/281a6f64-2f0a-4020-9b9d-af55767e345d-secret-volume\") pod \"collect-profiles-29323380-rvn2l\" (UID: \"281a6f64-2f0a-4020-9b9d-af55767e345d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.407668 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxgr6\" (UniqueName: \"kubernetes.io/projected/281a6f64-2f0a-4020-9b9d-af55767e345d-kube-api-access-lxgr6\") pod \"collect-profiles-29323380-rvn2l\" (UID: \"281a6f64-2f0a-4020-9b9d-af55767e345d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.446475 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.609866 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l"] Oct 02 11:00:00 crc kubenswrapper[4766]: W1002 11:00:00.615799 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod281a6f64_2f0a_4020_9b9d_af55767e345d.slice/crio-8bafc7e67fa3243d8ea5999f62d0fe8af5fc2c67ffbfd219e411a23af35c479c WatchSource:0}: Error finding container 8bafc7e67fa3243d8ea5999f62d0fe8af5fc2c67ffbfd219e411a23af35c479c: Status 404 returned error can't find the container with id 8bafc7e67fa3243d8ea5999f62d0fe8af5fc2c67ffbfd219e411a23af35c479c Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.784065 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" event={"ID":"281a6f64-2f0a-4020-9b9d-af55767e345d","Type":"ContainerStarted","Data":"3e43d0f68fbfaa2ef85508b4393f1f16a73976e3c105f57a2cbaf66524ebe833"} Oct 02 11:00:00 crc kubenswrapper[4766]: I1002 11:00:00.784111 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" event={"ID":"281a6f64-2f0a-4020-9b9d-af55767e345d","Type":"ContainerStarted","Data":"8bafc7e67fa3243d8ea5999f62d0fe8af5fc2c67ffbfd219e411a23af35c479c"} Oct 02 11:00:01 crc kubenswrapper[4766]: I1002 11:00:01.789154 4766 generic.go:334] "Generic (PLEG): container finished" podID="281a6f64-2f0a-4020-9b9d-af55767e345d" containerID="3e43d0f68fbfaa2ef85508b4393f1f16a73976e3c105f57a2cbaf66524ebe833" exitCode=0 Oct 02 11:00:01 crc kubenswrapper[4766]: I1002 11:00:01.789372 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" event={"ID":"281a6f64-2f0a-4020-9b9d-af55767e345d","Type":"ContainerDied","Data":"3e43d0f68fbfaa2ef85508b4393f1f16a73976e3c105f57a2cbaf66524ebe833"} Oct 02 11:00:02 crc kubenswrapper[4766]: I1002 11:00:02.980641 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" Oct 02 11:00:03 crc kubenswrapper[4766]: I1002 11:00:03.124578 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/281a6f64-2f0a-4020-9b9d-af55767e345d-config-volume\") pod \"281a6f64-2f0a-4020-9b9d-af55767e345d\" (UID: \"281a6f64-2f0a-4020-9b9d-af55767e345d\") " Oct 02 11:00:03 crc kubenswrapper[4766]: I1002 11:00:03.124705 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxgr6\" (UniqueName: \"kubernetes.io/projected/281a6f64-2f0a-4020-9b9d-af55767e345d-kube-api-access-lxgr6\") pod \"281a6f64-2f0a-4020-9b9d-af55767e345d\" (UID: \"281a6f64-2f0a-4020-9b9d-af55767e345d\") " Oct 02 11:00:03 crc kubenswrapper[4766]: I1002 11:00:03.124776 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/281a6f64-2f0a-4020-9b9d-af55767e345d-secret-volume\") pod \"281a6f64-2f0a-4020-9b9d-af55767e345d\" (UID: \"281a6f64-2f0a-4020-9b9d-af55767e345d\") " Oct 02 11:00:03 crc kubenswrapper[4766]: I1002 11:00:03.125437 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/281a6f64-2f0a-4020-9b9d-af55767e345d-config-volume" (OuterVolumeSpecName: "config-volume") pod "281a6f64-2f0a-4020-9b9d-af55767e345d" (UID: "281a6f64-2f0a-4020-9b9d-af55767e345d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:00:03 crc kubenswrapper[4766]: I1002 11:00:03.129859 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/281a6f64-2f0a-4020-9b9d-af55767e345d-kube-api-access-lxgr6" (OuterVolumeSpecName: "kube-api-access-lxgr6") pod "281a6f64-2f0a-4020-9b9d-af55767e345d" (UID: "281a6f64-2f0a-4020-9b9d-af55767e345d"). InnerVolumeSpecName "kube-api-access-lxgr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:00:03 crc kubenswrapper[4766]: I1002 11:00:03.131568 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/281a6f64-2f0a-4020-9b9d-af55767e345d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "281a6f64-2f0a-4020-9b9d-af55767e345d" (UID: "281a6f64-2f0a-4020-9b9d-af55767e345d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:00:03 crc kubenswrapper[4766]: I1002 11:00:03.225565 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/281a6f64-2f0a-4020-9b9d-af55767e345d-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:03 crc kubenswrapper[4766]: I1002 11:00:03.225601 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxgr6\" (UniqueName: \"kubernetes.io/projected/281a6f64-2f0a-4020-9b9d-af55767e345d-kube-api-access-lxgr6\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:03 crc kubenswrapper[4766]: I1002 11:00:03.225611 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/281a6f64-2f0a-4020-9b9d-af55767e345d-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:00:03 crc kubenswrapper[4766]: I1002 11:00:03.801067 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" event={"ID":"281a6f64-2f0a-4020-9b9d-af55767e345d","Type":"ContainerDied","Data":"8bafc7e67fa3243d8ea5999f62d0fe8af5fc2c67ffbfd219e411a23af35c479c"} Oct 02 11:00:03 crc kubenswrapper[4766]: I1002 11:00:03.801105 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bafc7e67fa3243d8ea5999f62d0fe8af5fc2c67ffbfd219e411a23af35c479c" Oct 02 11:00:03 crc kubenswrapper[4766]: I1002 11:00:03.801130 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l" Oct 02 11:00:24 crc kubenswrapper[4766]: I1002 11:00:24.431586 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:00:24 crc kubenswrapper[4766]: I1002 11:00:24.431979 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:00:54 crc kubenswrapper[4766]: I1002 11:00:54.431803 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:00:54 crc kubenswrapper[4766]: I1002 11:00:54.432467 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:01:24 crc kubenswrapper[4766]: I1002 11:01:24.432471 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:01:24 crc kubenswrapper[4766]: I1002 11:01:24.433835 
4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:01:24 crc kubenswrapper[4766]: I1002 11:01:24.434146 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 11:01:24 crc kubenswrapper[4766]: I1002 11:01:24.435462 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3166d2ef6aeab8513526481d091131174bb783f49cb0e1e12abde1175b4b3aeb"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:01:24 crc kubenswrapper[4766]: I1002 11:01:24.435654 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://3166d2ef6aeab8513526481d091131174bb783f49cb0e1e12abde1175b4b3aeb" gracePeriod=600 Oct 02 11:01:25 crc kubenswrapper[4766]: I1002 11:01:25.212940 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="3166d2ef6aeab8513526481d091131174bb783f49cb0e1e12abde1175b4b3aeb" exitCode=0 Oct 02 11:01:25 crc kubenswrapper[4766]: I1002 11:01:25.213041 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"3166d2ef6aeab8513526481d091131174bb783f49cb0e1e12abde1175b4b3aeb"} Oct 02 11:01:25 crc kubenswrapper[4766]: I1002 11:01:25.213193 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"853a899410d52955e7ea02637bfd357fd7fb7ea7cb0e26bfedf87008f24173b5"} Oct 02 11:01:25 crc kubenswrapper[4766]: I1002 11:01:25.213217 4766 scope.go:117] "RemoveContainer" containerID="769721c28661e517737abb6064d68fa4fa2746b13fc804a3158b4c035b7b61c8" Oct 02 11:03:24 crc kubenswrapper[4766]: I1002 11:03:24.432419 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:03:24 crc kubenswrapper[4766]: I1002 11:03:24.433018 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:03:54 crc kubenswrapper[4766]: I1002 11:03:54.432259 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
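Annotation: the machine-config-daemon entries above form a complete liveness-probe restart cycle: failures 30 seconds apart at 11:00:24, 11:00:54 and 11:01:24, then the kubelet kills container 3166d2ef... with a 600-second grace period and starts a replacement. The probe spec itself is not in this log; the fragment below is a hypothetical reconstruction that would produce exactly this cadence (an HTTP check on port 8798 every 30 seconds, restart ordered after three failures):

    # Hypothetical probe settings consistent with the log, not copied
    # from the machine-config-daemon DaemonSet itself.
    liveness_probe = {
        "httpGet": {"path": "/health", "port": 8798},
        "periodSeconds": 30,    # failures land exactly 30s apart
        "failureThreshold": 3,  # restart ordered on the third failure
    }

The same cycle recurs at 11:03:24 and again at 11:04:24 below, which suggests the daemon keeps restarting without its /health endpoint ever coming up, rather than a one-off failure.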
refused" start-of-body= Oct 02 11:03:54 crc kubenswrapper[4766]: I1002 11:03:54.432898 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.209616 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tmkn8"] Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.210103 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" podUID="afb77d71-ded6-4158-a3fe-461336cece71" containerName="controller-manager" containerID="cri-o://8fcc5113edc86ad8750b351d0d36dc42a366e8f48a8f57b7b05495500f47c9ba" gracePeriod=30 Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.300947 4766 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-tmkn8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.301005 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" podUID="afb77d71-ded6-4158-a3fe-461336cece71" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.339190 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k"] Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.339465 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" podUID="d0980a54-cd9d-4daa-a5ac-7f86e447f646" containerName="route-controller-manager" containerID="cri-o://5890d35b518447fa849728890c19ee94775ae437b93d908795cb79e5044b0274" gracePeriod=30 Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.408391 4766 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-f2c9k container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.408440 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" podUID="d0980a54-cd9d-4daa-a5ac-7f86e447f646" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.656746 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.820316 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afb77d71-ded6-4158-a3fe-461336cece71-serving-cert\") pod \"afb77d71-ded6-4158-a3fe-461336cece71\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.821463 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-config\") pod \"afb77d71-ded6-4158-a3fe-461336cece71\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.821542 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-proxy-ca-bundles\") pod \"afb77d71-ded6-4158-a3fe-461336cece71\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.821571 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wskrf\" (UniqueName: \"kubernetes.io/projected/afb77d71-ded6-4158-a3fe-461336cece71-kube-api-access-wskrf\") pod \"afb77d71-ded6-4158-a3fe-461336cece71\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.821657 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-client-ca\") pod \"afb77d71-ded6-4158-a3fe-461336cece71\" (UID: \"afb77d71-ded6-4158-a3fe-461336cece71\") " Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.822113 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "afb77d71-ded6-4158-a3fe-461336cece71" (UID: "afb77d71-ded6-4158-a3fe-461336cece71"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.822296 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-client-ca" (OuterVolumeSpecName: "client-ca") pod "afb77d71-ded6-4158-a3fe-461336cece71" (UID: "afb77d71-ded6-4158-a3fe-461336cece71"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.822319 4766 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.822918 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-config" (OuterVolumeSpecName: "config") pod "afb77d71-ded6-4158-a3fe-461336cece71" (UID: "afb77d71-ded6-4158-a3fe-461336cece71"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.832463 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb77d71-ded6-4158-a3fe-461336cece71-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "afb77d71-ded6-4158-a3fe-461336cece71" (UID: "afb77d71-ded6-4158-a3fe-461336cece71"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.832877 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb77d71-ded6-4158-a3fe-461336cece71-kube-api-access-wskrf" (OuterVolumeSpecName: "kube-api-access-wskrf") pod "afb77d71-ded6-4158-a3fe-461336cece71" (UID: "afb77d71-ded6-4158-a3fe-461336cece71"). InnerVolumeSpecName "kube-api-access-wskrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.923634 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wskrf\" (UniqueName: \"kubernetes.io/projected/afb77d71-ded6-4158-a3fe-461336cece71-kube-api-access-wskrf\") on node \"crc\" DevicePath \"\"" Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.923664 4766 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.923675 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afb77d71-ded6-4158-a3fe-461336cece71-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:03:59 crc kubenswrapper[4766]: I1002 11:03:59.923684 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afb77d71-ded6-4158-a3fe-461336cece71-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.044980 4766 generic.go:334] "Generic (PLEG): container finished" podID="d0980a54-cd9d-4daa-a5ac-7f86e447f646" containerID="5890d35b518447fa849728890c19ee94775ae437b93d908795cb79e5044b0274" exitCode=0 Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.045045 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" event={"ID":"d0980a54-cd9d-4daa-a5ac-7f86e447f646","Type":"ContainerDied","Data":"5890d35b518447fa849728890c19ee94775ae437b93d908795cb79e5044b0274"} Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.046565 4766 generic.go:334] "Generic (PLEG): container finished" podID="afb77d71-ded6-4158-a3fe-461336cece71" containerID="8fcc5113edc86ad8750b351d0d36dc42a366e8f48a8f57b7b05495500f47c9ba" exitCode=0 Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.046592 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.046608 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" event={"ID":"afb77d71-ded6-4158-a3fe-461336cece71","Type":"ContainerDied","Data":"8fcc5113edc86ad8750b351d0d36dc42a366e8f48a8f57b7b05495500f47c9ba"} Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.046629 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-tmkn8" event={"ID":"afb77d71-ded6-4158-a3fe-461336cece71","Type":"ContainerDied","Data":"85b202174382d1a651104786d8ceb570005debc3028a814727e335e02cff6d6c"} Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.046661 4766 scope.go:117] "RemoveContainer" containerID="8fcc5113edc86ad8750b351d0d36dc42a366e8f48a8f57b7b05495500f47c9ba" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.064613 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tmkn8"] Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.068133 4766 scope.go:117] "RemoveContainer" containerID="8fcc5113edc86ad8750b351d0d36dc42a366e8f48a8f57b7b05495500f47c9ba" Oct 02 11:04:00 crc kubenswrapper[4766]: E1002 11:04:00.068560 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fcc5113edc86ad8750b351d0d36dc42a366e8f48a8f57b7b05495500f47c9ba\": container with ID starting with 8fcc5113edc86ad8750b351d0d36dc42a366e8f48a8f57b7b05495500f47c9ba not found: ID does not exist" containerID="8fcc5113edc86ad8750b351d0d36dc42a366e8f48a8f57b7b05495500f47c9ba" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.068599 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fcc5113edc86ad8750b351d0d36dc42a366e8f48a8f57b7b05495500f47c9ba"} err="failed to get container status \"8fcc5113edc86ad8750b351d0d36dc42a366e8f48a8f57b7b05495500f47c9ba\": rpc error: code = NotFound desc = could not find container \"8fcc5113edc86ad8750b351d0d36dc42a366e8f48a8f57b7b05495500f47c9ba\": container with ID starting with 8fcc5113edc86ad8750b351d0d36dc42a366e8f48a8f57b7b05495500f47c9ba not found: ID does not exist" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.069341 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-tmkn8"] Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.236959 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.328661 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0980a54-cd9d-4daa-a5ac-7f86e447f646-config\") pod \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\" (UID: \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\") " Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.328728 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kzhj\" (UniqueName: \"kubernetes.io/projected/d0980a54-cd9d-4daa-a5ac-7f86e447f646-kube-api-access-2kzhj\") pod \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\" (UID: \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\") " Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.328768 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0980a54-cd9d-4daa-a5ac-7f86e447f646-serving-cert\") pod \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\" (UID: \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\") " Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.328798 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0980a54-cd9d-4daa-a5ac-7f86e447f646-client-ca\") pod \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\" (UID: \"d0980a54-cd9d-4daa-a5ac-7f86e447f646\") " Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.329564 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0980a54-cd9d-4daa-a5ac-7f86e447f646-client-ca" (OuterVolumeSpecName: "client-ca") pod "d0980a54-cd9d-4daa-a5ac-7f86e447f646" (UID: "d0980a54-cd9d-4daa-a5ac-7f86e447f646"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.329576 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0980a54-cd9d-4daa-a5ac-7f86e447f646-config" (OuterVolumeSpecName: "config") pod "d0980a54-cd9d-4daa-a5ac-7f86e447f646" (UID: "d0980a54-cd9d-4daa-a5ac-7f86e447f646"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.333976 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0980a54-cd9d-4daa-a5ac-7f86e447f646-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d0980a54-cd9d-4daa-a5ac-7f86e447f646" (UID: "d0980a54-cd9d-4daa-a5ac-7f86e447f646"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.333984 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0980a54-cd9d-4daa-a5ac-7f86e447f646-kube-api-access-2kzhj" (OuterVolumeSpecName: "kube-api-access-2kzhj") pod "d0980a54-cd9d-4daa-a5ac-7f86e447f646" (UID: "d0980a54-cd9d-4daa-a5ac-7f86e447f646"). InnerVolumeSpecName "kube-api-access-2kzhj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.429943 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kzhj\" (UniqueName: \"kubernetes.io/projected/d0980a54-cd9d-4daa-a5ac-7f86e447f646-kube-api-access-2kzhj\") on node \"crc\" DevicePath \"\"" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.429974 4766 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0980a54-cd9d-4daa-a5ac-7f86e447f646-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.429983 4766 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0980a54-cd9d-4daa-a5ac-7f86e447f646-client-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.429991 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0980a54-cd9d-4daa-a5ac-7f86e447f646-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.496370 4766 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.880928 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74c65d689-fcfz4"] Oct 02 11:04:00 crc kubenswrapper[4766]: E1002 11:04:00.881451 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb77d71-ded6-4158-a3fe-461336cece71" containerName="controller-manager" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.881465 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb77d71-ded6-4158-a3fe-461336cece71" containerName="controller-manager" Oct 02 11:04:00 crc kubenswrapper[4766]: E1002 11:04:00.881476 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0980a54-cd9d-4daa-a5ac-7f86e447f646" containerName="route-controller-manager" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.881482 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0980a54-cd9d-4daa-a5ac-7f86e447f646" containerName="route-controller-manager" Oct 02 11:04:00 crc kubenswrapper[4766]: E1002 11:04:00.881513 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="281a6f64-2f0a-4020-9b9d-af55767e345d" containerName="collect-profiles" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.881520 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="281a6f64-2f0a-4020-9b9d-af55767e345d" containerName="collect-profiles" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.881617 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb77d71-ded6-4158-a3fe-461336cece71" containerName="controller-manager" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.881626 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="281a6f64-2f0a-4020-9b9d-af55767e345d" containerName="collect-profiles" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.881646 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0980a54-cd9d-4daa-a5ac-7f86e447f646" containerName="route-controller-manager" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.881987 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.884172 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x"] Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.884862 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.885068 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.885099 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.885587 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.885859 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.886184 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.886343 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.894280 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.895780 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74c65d689-fcfz4"] Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.898779 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x"] Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.936336 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f4935c5-faa6-4446-ae85-c8469f77f126-client-ca\") pod \"route-controller-manager-6b4bdfc9db-5ms7x\" (UID: \"4f4935c5-faa6-4446-ae85-c8469f77f126\") " pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.936395 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe33eeb7-4674-47a0-af2d-918f242f9806-serving-cert\") pod \"controller-manager-74c65d689-fcfz4\" (UID: \"fe33eeb7-4674-47a0-af2d-918f242f9806\") " pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.936482 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f4935c5-faa6-4446-ae85-c8469f77f126-serving-cert\") pod \"route-controller-manager-6b4bdfc9db-5ms7x\" (UID: \"4f4935c5-faa6-4446-ae85-c8469f77f126\") " pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" Oct 02 11:04:00 
crc kubenswrapper[4766]: I1002 11:04:00.936760 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f4935c5-faa6-4446-ae85-c8469f77f126-config\") pod \"route-controller-manager-6b4bdfc9db-5ms7x\" (UID: \"4f4935c5-faa6-4446-ae85-c8469f77f126\") " pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.936995 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe33eeb7-4674-47a0-af2d-918f242f9806-client-ca\") pod \"controller-manager-74c65d689-fcfz4\" (UID: \"fe33eeb7-4674-47a0-af2d-918f242f9806\") " pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.937050 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8ftn\" (UniqueName: \"kubernetes.io/projected/4f4935c5-faa6-4446-ae85-c8469f77f126-kube-api-access-s8ftn\") pod \"route-controller-manager-6b4bdfc9db-5ms7x\" (UID: \"4f4935c5-faa6-4446-ae85-c8469f77f126\") " pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.937181 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe33eeb7-4674-47a0-af2d-918f242f9806-config\") pod \"controller-manager-74c65d689-fcfz4\" (UID: \"fe33eeb7-4674-47a0-af2d-918f242f9806\") " pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.937237 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q7lr\" (UniqueName: \"kubernetes.io/projected/fe33eeb7-4674-47a0-af2d-918f242f9806-kube-api-access-6q7lr\") pod \"controller-manager-74c65d689-fcfz4\" (UID: \"fe33eeb7-4674-47a0-af2d-918f242f9806\") " pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:00 crc kubenswrapper[4766]: I1002 11:04:00.937271 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe33eeb7-4674-47a0-af2d-918f242f9806-proxy-ca-bundles\") pod \"controller-manager-74c65d689-fcfz4\" (UID: \"fe33eeb7-4674-47a0-af2d-918f242f9806\") " pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.038049 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe33eeb7-4674-47a0-af2d-918f242f9806-client-ca\") pod \"controller-manager-74c65d689-fcfz4\" (UID: \"fe33eeb7-4674-47a0-af2d-918f242f9806\") " pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.038112 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8ftn\" (UniqueName: \"kubernetes.io/projected/4f4935c5-faa6-4446-ae85-c8469f77f126-kube-api-access-s8ftn\") pod \"route-controller-manager-6b4bdfc9db-5ms7x\" (UID: \"4f4935c5-faa6-4446-ae85-c8469f77f126\") " pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" Oct 02 11:04:01 crc kubenswrapper[4766]: 
I1002 11:04:01.038142 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe33eeb7-4674-47a0-af2d-918f242f9806-config\") pod \"controller-manager-74c65d689-fcfz4\" (UID: \"fe33eeb7-4674-47a0-af2d-918f242f9806\") " pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.038157 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q7lr\" (UniqueName: \"kubernetes.io/projected/fe33eeb7-4674-47a0-af2d-918f242f9806-kube-api-access-6q7lr\") pod \"controller-manager-74c65d689-fcfz4\" (UID: \"fe33eeb7-4674-47a0-af2d-918f242f9806\") " pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.038173 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fe33eeb7-4674-47a0-af2d-918f242f9806-proxy-ca-bundles\") pod \"controller-manager-74c65d689-fcfz4\" (UID: \"fe33eeb7-4674-47a0-af2d-918f242f9806\") " pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.038199 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f4935c5-faa6-4446-ae85-c8469f77f126-client-ca\") pod \"route-controller-manager-6b4bdfc9db-5ms7x\" (UID: \"4f4935c5-faa6-4446-ae85-c8469f77f126\") " pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.038216 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe33eeb7-4674-47a0-af2d-918f242f9806-serving-cert\") pod \"controller-manager-74c65d689-fcfz4\" (UID: \"fe33eeb7-4674-47a0-af2d-918f242f9806\") " pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.038253 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f4935c5-faa6-4446-ae85-c8469f77f126-serving-cert\") pod \"route-controller-manager-6b4bdfc9db-5ms7x\" (UID: \"4f4935c5-faa6-4446-ae85-c8469f77f126\") " pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.038288 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f4935c5-faa6-4446-ae85-c8469f77f126-config\") pod \"route-controller-manager-6b4bdfc9db-5ms7x\" (UID: \"4f4935c5-faa6-4446-ae85-c8469f77f126\") " pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.039401 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f4935c5-faa6-4446-ae85-c8469f77f126-client-ca\") pod \"route-controller-manager-6b4bdfc9db-5ms7x\" (UID: \"4f4935c5-faa6-4446-ae85-c8469f77f126\") " pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.039623 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/fe33eeb7-4674-47a0-af2d-918f242f9806-proxy-ca-bundles\") pod \"controller-manager-74c65d689-fcfz4\" (UID: \"fe33eeb7-4674-47a0-af2d-918f242f9806\") " pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.039810 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe33eeb7-4674-47a0-af2d-918f242f9806-config\") pod \"controller-manager-74c65d689-fcfz4\" (UID: \"fe33eeb7-4674-47a0-af2d-918f242f9806\") " pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.040065 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fe33eeb7-4674-47a0-af2d-918f242f9806-client-ca\") pod \"controller-manager-74c65d689-fcfz4\" (UID: \"fe33eeb7-4674-47a0-af2d-918f242f9806\") " pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.040971 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f4935c5-faa6-4446-ae85-c8469f77f126-config\") pod \"route-controller-manager-6b4bdfc9db-5ms7x\" (UID: \"4f4935c5-faa6-4446-ae85-c8469f77f126\") " pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.043455 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f4935c5-faa6-4446-ae85-c8469f77f126-serving-cert\") pod \"route-controller-manager-6b4bdfc9db-5ms7x\" (UID: \"4f4935c5-faa6-4446-ae85-c8469f77f126\") " pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.049088 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe33eeb7-4674-47a0-af2d-918f242f9806-serving-cert\") pod \"controller-manager-74c65d689-fcfz4\" (UID: \"fe33eeb7-4674-47a0-af2d-918f242f9806\") " pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.054410 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" event={"ID":"d0980a54-cd9d-4daa-a5ac-7f86e447f646","Type":"ContainerDied","Data":"7139d003d48b67444f57978c9282ff44d9cc29b1cd4c39f222b42cd8d28acb2d"} Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.054472 4766 scope.go:117] "RemoveContainer" containerID="5890d35b518447fa849728890c19ee94775ae437b93d908795cb79e5044b0274" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.054563 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.055799 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8ftn\" (UniqueName: \"kubernetes.io/projected/4f4935c5-faa6-4446-ae85-c8469f77f126-kube-api-access-s8ftn\") pod \"route-controller-manager-6b4bdfc9db-5ms7x\" (UID: \"4f4935c5-faa6-4446-ae85-c8469f77f126\") " pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.064310 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q7lr\" (UniqueName: \"kubernetes.io/projected/fe33eeb7-4674-47a0-af2d-918f242f9806-kube-api-access-6q7lr\") pod \"controller-manager-74c65d689-fcfz4\" (UID: \"fe33eeb7-4674-47a0-af2d-918f242f9806\") " pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.103485 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k"] Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.106864 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-f2c9k"] Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.198530 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.208772 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.515676 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74c65d689-fcfz4"] Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.652363 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x"] Oct 02 11:04:01 crc kubenswrapper[4766]: W1002 11:04:01.657562 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f4935c5_faa6_4446_ae85_c8469f77f126.slice/crio-f98e9f4a8bba15afc874972edde98c8f2466993d971c925cf283fb90862cb99f WatchSource:0}: Error finding container f98e9f4a8bba15afc874972edde98c8f2466993d971c925cf283fb90862cb99f: Status 404 returned error can't find the container with id f98e9f4a8bba15afc874972edde98c8f2466993d971c925cf283fb90862cb99f Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.893273 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb77d71-ded6-4158-a3fe-461336cece71" path="/var/lib/kubelet/pods/afb77d71-ded6-4158-a3fe-461336cece71/volumes" Oct 02 11:04:01 crc kubenswrapper[4766]: I1002 11:04:01.894379 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0980a54-cd9d-4daa-a5ac-7f86e447f646" path="/var/lib/kubelet/pods/d0980a54-cd9d-4daa-a5ac-7f86e447f646/volumes" Oct 02 11:04:02 crc kubenswrapper[4766]: I1002 11:04:02.063683 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" 
event={"ID":"4f4935c5-faa6-4446-ae85-c8469f77f126","Type":"ContainerStarted","Data":"14777bf9ac7d032160394a4f206c84955f416888afb6c38772a5404ac18bc82b"} Oct 02 11:04:02 crc kubenswrapper[4766]: I1002 11:04:02.063722 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" event={"ID":"4f4935c5-faa6-4446-ae85-c8469f77f126","Type":"ContainerStarted","Data":"f98e9f4a8bba15afc874972edde98c8f2466993d971c925cf283fb90862cb99f"} Oct 02 11:04:02 crc kubenswrapper[4766]: I1002 11:04:02.063965 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" Oct 02 11:04:02 crc kubenswrapper[4766]: I1002 11:04:02.066918 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" event={"ID":"fe33eeb7-4674-47a0-af2d-918f242f9806","Type":"ContainerStarted","Data":"a5f8ee0e363289a4dc9153c1a3b8490c7c7fb73f7ab5dcfed3753fca0e949d3a"} Oct 02 11:04:02 crc kubenswrapper[4766]: I1002 11:04:02.066952 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" event={"ID":"fe33eeb7-4674-47a0-af2d-918f242f9806","Type":"ContainerStarted","Data":"6244110d2e2530fb707118fe4fc526ffd0fd183f369308060261fd4c8324ec7a"} Oct 02 11:04:02 crc kubenswrapper[4766]: I1002 11:04:02.067536 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:02 crc kubenswrapper[4766]: I1002 11:04:02.084617 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" podStartSLOduration=3.084603356 podStartE2EDuration="3.084603356s" podCreationTimestamp="2025-10-02 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:04:02.081563458 +0000 UTC m=+757.024434402" watchObservedRunningTime="2025-10-02 11:04:02.084603356 +0000 UTC m=+757.027474300" Oct 02 11:04:02 crc kubenswrapper[4766]: I1002 11:04:02.101531 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" podStartSLOduration=3.10149165 podStartE2EDuration="3.10149165s" podCreationTimestamp="2025-10-02 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:04:02.10025336 +0000 UTC m=+757.043124304" watchObservedRunningTime="2025-10-02 11:04:02.10149165 +0000 UTC m=+757.044362594" Oct 02 11:04:02 crc kubenswrapper[4766]: I1002 11:04:02.102723 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74c65d689-fcfz4" Oct 02 11:04:02 crc kubenswrapper[4766]: I1002 11:04:02.380645 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b4bdfc9db-5ms7x" Oct 02 11:04:24 crc kubenswrapper[4766]: I1002 11:04:24.431916 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Oct 02 11:04:24 crc kubenswrapper[4766]: I1002 11:04:24.432552 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:04:24 crc kubenswrapper[4766]: I1002 11:04:24.432609 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 11:04:24 crc kubenswrapper[4766]: I1002 11:04:24.433183 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"853a899410d52955e7ea02637bfd357fd7fb7ea7cb0e26bfedf87008f24173b5"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:04:24 crc kubenswrapper[4766]: I1002 11:04:24.433238 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://853a899410d52955e7ea02637bfd357fd7fb7ea7cb0e26bfedf87008f24173b5" gracePeriod=600 Oct 02 11:04:25 crc kubenswrapper[4766]: I1002 11:04:25.185960 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="853a899410d52955e7ea02637bfd357fd7fb7ea7cb0e26bfedf87008f24173b5" exitCode=0 Oct 02 11:04:25 crc kubenswrapper[4766]: I1002 11:04:25.186031 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"853a899410d52955e7ea02637bfd357fd7fb7ea7cb0e26bfedf87008f24173b5"} Oct 02 11:04:25 crc kubenswrapper[4766]: I1002 11:04:25.186321 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"c19749f939a14cc5cbc026d638c61fa14b50810b4586b8fd36f7ac6b16f32c80"} Oct 02 11:04:25 crc kubenswrapper[4766]: I1002 11:04:25.186342 4766 scope.go:117] "RemoveContainer" containerID="3166d2ef6aeab8513526481d091131174bb783f49cb0e1e12abde1175b4b3aeb" Oct 02 11:05:20 crc kubenswrapper[4766]: I1002 11:05:20.304534 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mkkbm"] Oct 02 11:05:20 crc kubenswrapper[4766]: I1002 11:05:20.306342 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:20 crc kubenswrapper[4766]: I1002 11:05:20.319318 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkkbm"] Oct 02 11:05:20 crc kubenswrapper[4766]: I1002 11:05:20.420298 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24872b42-a353-453c-bc81-c710d52a3fe4-utilities\") pod \"redhat-marketplace-mkkbm\" (UID: \"24872b42-a353-453c-bc81-c710d52a3fe4\") " pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:20 crc kubenswrapper[4766]: I1002 11:05:20.420362 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvgvx\" (UniqueName: \"kubernetes.io/projected/24872b42-a353-453c-bc81-c710d52a3fe4-kube-api-access-qvgvx\") pod \"redhat-marketplace-mkkbm\" (UID: \"24872b42-a353-453c-bc81-c710d52a3fe4\") " pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:20 crc kubenswrapper[4766]: I1002 11:05:20.420437 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24872b42-a353-453c-bc81-c710d52a3fe4-catalog-content\") pod \"redhat-marketplace-mkkbm\" (UID: \"24872b42-a353-453c-bc81-c710d52a3fe4\") " pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:20 crc kubenswrapper[4766]: I1002 11:05:20.522003 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24872b42-a353-453c-bc81-c710d52a3fe4-catalog-content\") pod \"redhat-marketplace-mkkbm\" (UID: \"24872b42-a353-453c-bc81-c710d52a3fe4\") " pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:20 crc kubenswrapper[4766]: I1002 11:05:20.522299 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24872b42-a353-453c-bc81-c710d52a3fe4-utilities\") pod \"redhat-marketplace-mkkbm\" (UID: \"24872b42-a353-453c-bc81-c710d52a3fe4\") " pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:20 crc kubenswrapper[4766]: I1002 11:05:20.522422 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvgvx\" (UniqueName: \"kubernetes.io/projected/24872b42-a353-453c-bc81-c710d52a3fe4-kube-api-access-qvgvx\") pod \"redhat-marketplace-mkkbm\" (UID: \"24872b42-a353-453c-bc81-c710d52a3fe4\") " pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:20 crc kubenswrapper[4766]: I1002 11:05:20.522827 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24872b42-a353-453c-bc81-c710d52a3fe4-catalog-content\") pod \"redhat-marketplace-mkkbm\" (UID: \"24872b42-a353-453c-bc81-c710d52a3fe4\") " pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:20 crc kubenswrapper[4766]: I1002 11:05:20.522841 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24872b42-a353-453c-bc81-c710d52a3fe4-utilities\") pod \"redhat-marketplace-mkkbm\" (UID: \"24872b42-a353-453c-bc81-c710d52a3fe4\") " pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:20 crc kubenswrapper[4766]: I1002 11:05:20.546653 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qvgvx\" (UniqueName: \"kubernetes.io/projected/24872b42-a353-453c-bc81-c710d52a3fe4-kube-api-access-qvgvx\") pod \"redhat-marketplace-mkkbm\" (UID: \"24872b42-a353-453c-bc81-c710d52a3fe4\") " pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:20 crc kubenswrapper[4766]: I1002 11:05:20.627902 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:20 crc kubenswrapper[4766]: I1002 11:05:20.823834 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkkbm"] Oct 02 11:05:21 crc kubenswrapper[4766]: I1002 11:05:21.457577 4766 generic.go:334] "Generic (PLEG): container finished" podID="24872b42-a353-453c-bc81-c710d52a3fe4" containerID="c799ec1b67d88996fe6e53dc9489135b95ec3ae3818cce16a29489ccbb92f998" exitCode=0 Oct 02 11:05:21 crc kubenswrapper[4766]: I1002 11:05:21.457721 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkkbm" event={"ID":"24872b42-a353-453c-bc81-c710d52a3fe4","Type":"ContainerDied","Data":"c799ec1b67d88996fe6e53dc9489135b95ec3ae3818cce16a29489ccbb92f998"} Oct 02 11:05:21 crc kubenswrapper[4766]: I1002 11:05:21.457849 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkkbm" event={"ID":"24872b42-a353-453c-bc81-c710d52a3fe4","Type":"ContainerStarted","Data":"ed0749a3e2a7b6abd09a82b32ad67fc751ba2b5e21e1e48eb6b142cc96befaaa"} Oct 02 11:05:21 crc kubenswrapper[4766]: I1002 11:05:21.459606 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:05:23 crc kubenswrapper[4766]: I1002 11:05:23.282643 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7d9lm"] Oct 02 11:05:23 crc kubenswrapper[4766]: I1002 11:05:23.284139 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:23 crc kubenswrapper[4766]: I1002 11:05:23.295923 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7d9lm"] Oct 02 11:05:23 crc kubenswrapper[4766]: I1002 11:05:23.358087 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgrrz\" (UniqueName: \"kubernetes.io/projected/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-kube-api-access-dgrrz\") pod \"redhat-operators-7d9lm\" (UID: \"d35d6d3a-8e62-4f89-82bd-53c7f05805a0\") " pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:23 crc kubenswrapper[4766]: I1002 11:05:23.358467 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-utilities\") pod \"redhat-operators-7d9lm\" (UID: \"d35d6d3a-8e62-4f89-82bd-53c7f05805a0\") " pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:23 crc kubenswrapper[4766]: I1002 11:05:23.358651 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-catalog-content\") pod \"redhat-operators-7d9lm\" (UID: \"d35d6d3a-8e62-4f89-82bd-53c7f05805a0\") " pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:23 crc kubenswrapper[4766]: I1002 11:05:23.460217 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgrrz\" (UniqueName: \"kubernetes.io/projected/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-kube-api-access-dgrrz\") pod \"redhat-operators-7d9lm\" (UID: \"d35d6d3a-8e62-4f89-82bd-53c7f05805a0\") " pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:23 crc kubenswrapper[4766]: I1002 11:05:23.460632 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-utilities\") pod \"redhat-operators-7d9lm\" (UID: \"d35d6d3a-8e62-4f89-82bd-53c7f05805a0\") " pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:23 crc kubenswrapper[4766]: I1002 11:05:23.460750 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-catalog-content\") pod \"redhat-operators-7d9lm\" (UID: \"d35d6d3a-8e62-4f89-82bd-53c7f05805a0\") " pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:23 crc kubenswrapper[4766]: I1002 11:05:23.461315 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-catalog-content\") pod \"redhat-operators-7d9lm\" (UID: \"d35d6d3a-8e62-4f89-82bd-53c7f05805a0\") " pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:23 crc kubenswrapper[4766]: I1002 11:05:23.462016 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-utilities\") pod \"redhat-operators-7d9lm\" (UID: \"d35d6d3a-8e62-4f89-82bd-53c7f05805a0\") " pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:23 crc kubenswrapper[4766]: I1002 11:05:23.470618 4766 generic.go:334] "Generic (PLEG): container finished" 
podID="24872b42-a353-453c-bc81-c710d52a3fe4" containerID="a746b37fd6516c01cb70443ab7a17bcecdbf8aa8b1e93c614c62958e712426f7" exitCode=0 Oct 02 11:05:23 crc kubenswrapper[4766]: I1002 11:05:23.470802 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkkbm" event={"ID":"24872b42-a353-453c-bc81-c710d52a3fe4","Type":"ContainerDied","Data":"a746b37fd6516c01cb70443ab7a17bcecdbf8aa8b1e93c614c62958e712426f7"} Oct 02 11:05:23 crc kubenswrapper[4766]: I1002 11:05:23.482243 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgrrz\" (UniqueName: \"kubernetes.io/projected/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-kube-api-access-dgrrz\") pod \"redhat-operators-7d9lm\" (UID: \"d35d6d3a-8e62-4f89-82bd-53c7f05805a0\") " pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:23 crc kubenswrapper[4766]: I1002 11:05:23.611170 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:23 crc kubenswrapper[4766]: I1002 11:05:23.814012 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7d9lm"] Oct 02 11:05:24 crc kubenswrapper[4766]: I1002 11:05:24.477602 4766 generic.go:334] "Generic (PLEG): container finished" podID="d35d6d3a-8e62-4f89-82bd-53c7f05805a0" containerID="cb6382adb92331fd1e695e6a464a1f66e8dd16f4f06d73c36bbc70b93bb61661" exitCode=0 Oct 02 11:05:24 crc kubenswrapper[4766]: I1002 11:05:24.477645 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7d9lm" event={"ID":"d35d6d3a-8e62-4f89-82bd-53c7f05805a0","Type":"ContainerDied","Data":"cb6382adb92331fd1e695e6a464a1f66e8dd16f4f06d73c36bbc70b93bb61661"} Oct 02 11:05:24 crc kubenswrapper[4766]: I1002 11:05:24.477671 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7d9lm" event={"ID":"d35d6d3a-8e62-4f89-82bd-53c7f05805a0","Type":"ContainerStarted","Data":"77c6f305c603291f9c91a0b6bef8c4366846ad8f28c0742baf6e8f692dab686f"} Oct 02 11:05:25 crc kubenswrapper[4766]: I1002 11:05:25.489068 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkkbm" event={"ID":"24872b42-a353-453c-bc81-c710d52a3fe4","Type":"ContainerStarted","Data":"a36a5e53a7f8e33e3921e99f5f65b7625b96049cfa3b6c69dbb1e9b9fd0b64c7"} Oct 02 11:05:25 crc kubenswrapper[4766]: I1002 11:05:25.506564 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mkkbm" podStartSLOduration=2.347277133 podStartE2EDuration="5.50654531s" podCreationTimestamp="2025-10-02 11:05:20 +0000 UTC" firstStartedPulling="2025-10-02 11:05:21.459295234 +0000 UTC m=+836.402166178" lastFinishedPulling="2025-10-02 11:05:24.618563411 +0000 UTC m=+839.561434355" observedRunningTime="2025-10-02 11:05:25.50500373 +0000 UTC m=+840.447874674" watchObservedRunningTime="2025-10-02 11:05:25.50654531 +0000 UTC m=+840.449416254" Oct 02 11:05:27 crc kubenswrapper[4766]: I1002 11:05:27.501533 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7d9lm" event={"ID":"d35d6d3a-8e62-4f89-82bd-53c7f05805a0","Type":"ContainerStarted","Data":"a228685216e7e65003a721fc651309df76b7ed5a7d3b29fd3565b4c666272d79"} Oct 02 11:05:28 crc kubenswrapper[4766]: I1002 11:05:28.508817 4766 generic.go:334] "Generic (PLEG): container finished" podID="d35d6d3a-8e62-4f89-82bd-53c7f05805a0" 
containerID="a228685216e7e65003a721fc651309df76b7ed5a7d3b29fd3565b4c666272d79" exitCode=0 Oct 02 11:05:28 crc kubenswrapper[4766]: I1002 11:05:28.508856 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7d9lm" event={"ID":"d35d6d3a-8e62-4f89-82bd-53c7f05805a0","Type":"ContainerDied","Data":"a228685216e7e65003a721fc651309df76b7ed5a7d3b29fd3565b4c666272d79"} Oct 02 11:05:30 crc kubenswrapper[4766]: I1002 11:05:30.628791 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:30 crc kubenswrapper[4766]: I1002 11:05:30.629235 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:30 crc kubenswrapper[4766]: I1002 11:05:30.676912 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:31 crc kubenswrapper[4766]: I1002 11:05:31.525746 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7d9lm" event={"ID":"d35d6d3a-8e62-4f89-82bd-53c7f05805a0","Type":"ContainerStarted","Data":"e17f3ea8eca7691af6370f2454872a1fcc908178b3f3d68497513e63f22c0296"} Oct 02 11:05:31 crc kubenswrapper[4766]: I1002 11:05:31.544783 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7d9lm" podStartSLOduration=2.7772816049999998 podStartE2EDuration="8.544765771s" podCreationTimestamp="2025-10-02 11:05:23 +0000 UTC" firstStartedPulling="2025-10-02 11:05:24.586577861 +0000 UTC m=+839.529448805" lastFinishedPulling="2025-10-02 11:05:30.354062027 +0000 UTC m=+845.296932971" observedRunningTime="2025-10-02 11:05:31.542058195 +0000 UTC m=+846.484929139" watchObservedRunningTime="2025-10-02 11:05:31.544765771 +0000 UTC m=+846.487636715" Oct 02 11:05:31 crc kubenswrapper[4766]: I1002 11:05:31.572618 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:32 crc kubenswrapper[4766]: I1002 11:05:32.878803 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkkbm"] Oct 02 11:05:33 crc kubenswrapper[4766]: I1002 11:05:33.540988 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mkkbm" podUID="24872b42-a353-453c-bc81-c710d52a3fe4" containerName="registry-server" containerID="cri-o://a36a5e53a7f8e33e3921e99f5f65b7625b96049cfa3b6c69dbb1e9b9fd0b64c7" gracePeriod=2 Oct 02 11:05:33 crc kubenswrapper[4766]: I1002 11:05:33.611920 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:33 crc kubenswrapper[4766]: I1002 11:05:33.613065 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:34 crc kubenswrapper[4766]: I1002 11:05:34.650958 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7d9lm" podUID="d35d6d3a-8e62-4f89-82bd-53c7f05805a0" containerName="registry-server" probeResult="failure" output=< Oct 02 11:05:34 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Oct 02 11:05:34 crc kubenswrapper[4766]: > Oct 02 11:05:39 crc kubenswrapper[4766]: I1002 11:05:39.394339 4766 generic.go:334] "Generic (PLEG): 
container finished" podID="24872b42-a353-453c-bc81-c710d52a3fe4" containerID="a36a5e53a7f8e33e3921e99f5f65b7625b96049cfa3b6c69dbb1e9b9fd0b64c7" exitCode=0 Oct 02 11:05:39 crc kubenswrapper[4766]: I1002 11:05:39.394400 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkkbm" event={"ID":"24872b42-a353-453c-bc81-c710d52a3fe4","Type":"ContainerDied","Data":"a36a5e53a7f8e33e3921e99f5f65b7625b96049cfa3b6c69dbb1e9b9fd0b64c7"} Oct 02 11:05:39 crc kubenswrapper[4766]: I1002 11:05:39.621855 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:39 crc kubenswrapper[4766]: I1002 11:05:39.758158 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24872b42-a353-453c-bc81-c710d52a3fe4-utilities\") pod \"24872b42-a353-453c-bc81-c710d52a3fe4\" (UID: \"24872b42-a353-453c-bc81-c710d52a3fe4\") " Oct 02 11:05:39 crc kubenswrapper[4766]: I1002 11:05:39.758277 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvgvx\" (UniqueName: \"kubernetes.io/projected/24872b42-a353-453c-bc81-c710d52a3fe4-kube-api-access-qvgvx\") pod \"24872b42-a353-453c-bc81-c710d52a3fe4\" (UID: \"24872b42-a353-453c-bc81-c710d52a3fe4\") " Oct 02 11:05:39 crc kubenswrapper[4766]: I1002 11:05:39.758340 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24872b42-a353-453c-bc81-c710d52a3fe4-catalog-content\") pod \"24872b42-a353-453c-bc81-c710d52a3fe4\" (UID: \"24872b42-a353-453c-bc81-c710d52a3fe4\") " Oct 02 11:05:39 crc kubenswrapper[4766]: I1002 11:05:39.760636 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24872b42-a353-453c-bc81-c710d52a3fe4-utilities" (OuterVolumeSpecName: "utilities") pod "24872b42-a353-453c-bc81-c710d52a3fe4" (UID: "24872b42-a353-453c-bc81-c710d52a3fe4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:05:39 crc kubenswrapper[4766]: I1002 11:05:39.761937 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24872b42-a353-453c-bc81-c710d52a3fe4-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:05:39 crc kubenswrapper[4766]: I1002 11:05:39.763840 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24872b42-a353-453c-bc81-c710d52a3fe4-kube-api-access-qvgvx" (OuterVolumeSpecName: "kube-api-access-qvgvx") pod "24872b42-a353-453c-bc81-c710d52a3fe4" (UID: "24872b42-a353-453c-bc81-c710d52a3fe4"). InnerVolumeSpecName "kube-api-access-qvgvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:05:39 crc kubenswrapper[4766]: I1002 11:05:39.776794 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24872b42-a353-453c-bc81-c710d52a3fe4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24872b42-a353-453c-bc81-c710d52a3fe4" (UID: "24872b42-a353-453c-bc81-c710d52a3fe4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:05:39 crc kubenswrapper[4766]: I1002 11:05:39.863370 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvgvx\" (UniqueName: \"kubernetes.io/projected/24872b42-a353-453c-bc81-c710d52a3fe4-kube-api-access-qvgvx\") on node \"crc\" DevicePath \"\"" Oct 02 11:05:39 crc kubenswrapper[4766]: I1002 11:05:39.863442 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24872b42-a353-453c-bc81-c710d52a3fe4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:05:40 crc kubenswrapper[4766]: I1002 11:05:40.403576 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mkkbm" event={"ID":"24872b42-a353-453c-bc81-c710d52a3fe4","Type":"ContainerDied","Data":"ed0749a3e2a7b6abd09a82b32ad67fc751ba2b5e21e1e48eb6b142cc96befaaa"} Oct 02 11:05:40 crc kubenswrapper[4766]: I1002 11:05:40.403646 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mkkbm" Oct 02 11:05:40 crc kubenswrapper[4766]: I1002 11:05:40.403656 4766 scope.go:117] "RemoveContainer" containerID="a36a5e53a7f8e33e3921e99f5f65b7625b96049cfa3b6c69dbb1e9b9fd0b64c7" Oct 02 11:05:40 crc kubenswrapper[4766]: I1002 11:05:40.422924 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkkbm"] Oct 02 11:05:40 crc kubenswrapper[4766]: I1002 11:05:40.426643 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mkkbm"] Oct 02 11:05:40 crc kubenswrapper[4766]: I1002 11:05:40.426641 4766 scope.go:117] "RemoveContainer" containerID="a746b37fd6516c01cb70443ab7a17bcecdbf8aa8b1e93c614c62958e712426f7" Oct 02 11:05:40 crc kubenswrapper[4766]: I1002 11:05:40.440013 4766 scope.go:117] "RemoveContainer" containerID="c799ec1b67d88996fe6e53dc9489135b95ec3ae3818cce16a29489ccbb92f998" Oct 02 11:05:41 crc kubenswrapper[4766]: I1002 11:05:41.888765 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24872b42-a353-453c-bc81-c710d52a3fe4" path="/var/lib/kubelet/pods/24872b42-a353-453c-bc81-c710d52a3fe4/volumes" Oct 02 11:05:43 crc kubenswrapper[4766]: I1002 11:05:43.654150 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:43 crc kubenswrapper[4766]: I1002 11:05:43.694676 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:43 crc kubenswrapper[4766]: I1002 11:05:43.893326 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7d9lm"] Oct 02 11:05:45 crc kubenswrapper[4766]: I1002 11:05:45.428719 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7d9lm" podUID="d35d6d3a-8e62-4f89-82bd-53c7f05805a0" containerName="registry-server" containerID="cri-o://e17f3ea8eca7691af6370f2454872a1fcc908178b3f3d68497513e63f22c0296" gracePeriod=2 Oct 02 11:05:46 crc kubenswrapper[4766]: I1002 11:05:46.435468 4766 generic.go:334] "Generic (PLEG): container finished" podID="d35d6d3a-8e62-4f89-82bd-53c7f05805a0" containerID="e17f3ea8eca7691af6370f2454872a1fcc908178b3f3d68497513e63f22c0296" exitCode=0 Oct 02 11:05:46 crc kubenswrapper[4766]: I1002 11:05:46.435555 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-7d9lm" event={"ID":"d35d6d3a-8e62-4f89-82bd-53c7f05805a0","Type":"ContainerDied","Data":"e17f3ea8eca7691af6370f2454872a1fcc908178b3f3d68497513e63f22c0296"} Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.042276 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.241649 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-utilities\") pod \"d35d6d3a-8e62-4f89-82bd-53c7f05805a0\" (UID: \"d35d6d3a-8e62-4f89-82bd-53c7f05805a0\") " Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.241800 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgrrz\" (UniqueName: \"kubernetes.io/projected/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-kube-api-access-dgrrz\") pod \"d35d6d3a-8e62-4f89-82bd-53c7f05805a0\" (UID: \"d35d6d3a-8e62-4f89-82bd-53c7f05805a0\") " Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.241824 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-catalog-content\") pod \"d35d6d3a-8e62-4f89-82bd-53c7f05805a0\" (UID: \"d35d6d3a-8e62-4f89-82bd-53c7f05805a0\") " Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.242576 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-utilities" (OuterVolumeSpecName: "utilities") pod "d35d6d3a-8e62-4f89-82bd-53c7f05805a0" (UID: "d35d6d3a-8e62-4f89-82bd-53c7f05805a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.247512 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-kube-api-access-dgrrz" (OuterVolumeSpecName: "kube-api-access-dgrrz") pod "d35d6d3a-8e62-4f89-82bd-53c7f05805a0" (UID: "d35d6d3a-8e62-4f89-82bd-53c7f05805a0"). InnerVolumeSpecName "kube-api-access-dgrrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.291846 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dbg9r"] Oct 02 11:05:47 crc kubenswrapper[4766]: E1002 11:05:47.292096 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24872b42-a353-453c-bc81-c710d52a3fe4" containerName="registry-server" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.292110 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="24872b42-a353-453c-bc81-c710d52a3fe4" containerName="registry-server" Oct 02 11:05:47 crc kubenswrapper[4766]: E1002 11:05:47.292122 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24872b42-a353-453c-bc81-c710d52a3fe4" containerName="extract-utilities" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.292128 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="24872b42-a353-453c-bc81-c710d52a3fe4" containerName="extract-utilities" Oct 02 11:05:47 crc kubenswrapper[4766]: E1002 11:05:47.292141 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24872b42-a353-453c-bc81-c710d52a3fe4" containerName="extract-content" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.292174 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="24872b42-a353-453c-bc81-c710d52a3fe4" containerName="extract-content" Oct 02 11:05:47 crc kubenswrapper[4766]: E1002 11:05:47.292187 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d35d6d3a-8e62-4f89-82bd-53c7f05805a0" containerName="extract-utilities" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.292194 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d35d6d3a-8e62-4f89-82bd-53c7f05805a0" containerName="extract-utilities" Oct 02 11:05:47 crc kubenswrapper[4766]: E1002 11:05:47.292201 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d35d6d3a-8e62-4f89-82bd-53c7f05805a0" containerName="extract-content" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.292207 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d35d6d3a-8e62-4f89-82bd-53c7f05805a0" containerName="extract-content" Oct 02 11:05:47 crc kubenswrapper[4766]: E1002 11:05:47.292238 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d35d6d3a-8e62-4f89-82bd-53c7f05805a0" containerName="registry-server" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.292262 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d35d6d3a-8e62-4f89-82bd-53c7f05805a0" containerName="registry-server" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.293051 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d35d6d3a-8e62-4f89-82bd-53c7f05805a0" containerName="registry-server" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.293089 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="24872b42-a353-453c-bc81-c710d52a3fe4" containerName="registry-server" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.294616 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.299600 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dbg9r"] Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.325580 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d35d6d3a-8e62-4f89-82bd-53c7f05805a0" (UID: "d35d6d3a-8e62-4f89-82bd-53c7f05805a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.342637 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgrrz\" (UniqueName: \"kubernetes.io/projected/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-kube-api-access-dgrrz\") on node \"crc\" DevicePath \"\"" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.342929 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.342965 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d35d6d3a-8e62-4f89-82bd-53c7f05805a0-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.443662 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nn2z\" (UniqueName: \"kubernetes.io/projected/0cd7869a-ede9-4781-b2d2-9dfca0134699-kube-api-access-2nn2z\") pod \"community-operators-dbg9r\" (UID: \"0cd7869a-ede9-4781-b2d2-9dfca0134699\") " pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.443740 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd7869a-ede9-4781-b2d2-9dfca0134699-utilities\") pod \"community-operators-dbg9r\" (UID: \"0cd7869a-ede9-4781-b2d2-9dfca0134699\") " pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.443767 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd7869a-ede9-4781-b2d2-9dfca0134699-catalog-content\") pod \"community-operators-dbg9r\" (UID: \"0cd7869a-ede9-4781-b2d2-9dfca0134699\") " pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.443858 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7d9lm" event={"ID":"d35d6d3a-8e62-4f89-82bd-53c7f05805a0","Type":"ContainerDied","Data":"77c6f305c603291f9c91a0b6bef8c4366846ad8f28c0742baf6e8f692dab686f"} Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.443907 4766 scope.go:117] "RemoveContainer" containerID="e17f3ea8eca7691af6370f2454872a1fcc908178b3f3d68497513e63f22c0296" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.443917 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7d9lm" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.460479 4766 scope.go:117] "RemoveContainer" containerID="a228685216e7e65003a721fc651309df76b7ed5a7d3b29fd3565b4c666272d79" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.468166 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7d9lm"] Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.472830 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7d9lm"] Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.491820 4766 scope.go:117] "RemoveContainer" containerID="cb6382adb92331fd1e695e6a464a1f66e8dd16f4f06d73c36bbc70b93bb61661" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.544372 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd7869a-ede9-4781-b2d2-9dfca0134699-catalog-content\") pod \"community-operators-dbg9r\" (UID: \"0cd7869a-ede9-4781-b2d2-9dfca0134699\") " pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.544484 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nn2z\" (UniqueName: \"kubernetes.io/projected/0cd7869a-ede9-4781-b2d2-9dfca0134699-kube-api-access-2nn2z\") pod \"community-operators-dbg9r\" (UID: \"0cd7869a-ede9-4781-b2d2-9dfca0134699\") " pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.544695 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd7869a-ede9-4781-b2d2-9dfca0134699-utilities\") pod \"community-operators-dbg9r\" (UID: \"0cd7869a-ede9-4781-b2d2-9dfca0134699\") " pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.544959 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd7869a-ede9-4781-b2d2-9dfca0134699-catalog-content\") pod \"community-operators-dbg9r\" (UID: \"0cd7869a-ede9-4781-b2d2-9dfca0134699\") " pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.545073 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd7869a-ede9-4781-b2d2-9dfca0134699-utilities\") pod \"community-operators-dbg9r\" (UID: \"0cd7869a-ede9-4781-b2d2-9dfca0134699\") " pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.564176 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nn2z\" (UniqueName: \"kubernetes.io/projected/0cd7869a-ede9-4781-b2d2-9dfca0134699-kube-api-access-2nn2z\") pod \"community-operators-dbg9r\" (UID: \"0cd7869a-ede9-4781-b2d2-9dfca0134699\") " pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.618643 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.866323 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dbg9r"] Oct 02 11:05:47 crc kubenswrapper[4766]: I1002 11:05:47.888361 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d35d6d3a-8e62-4f89-82bd-53c7f05805a0" path="/var/lib/kubelet/pods/d35d6d3a-8e62-4f89-82bd-53c7f05805a0/volumes" Oct 02 11:05:48 crc kubenswrapper[4766]: I1002 11:05:48.454090 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbg9r" event={"ID":"0cd7869a-ede9-4781-b2d2-9dfca0134699","Type":"ContainerStarted","Data":"4cb62ab769c86a7d4545eb8191718731ce7f56911a5cf7e3daf45b63570dada3"} Oct 02 11:05:49 crc kubenswrapper[4766]: I1002 11:05:49.461734 4766 generic.go:334] "Generic (PLEG): container finished" podID="0cd7869a-ede9-4781-b2d2-9dfca0134699" containerID="a196b2276ed474afea3a791b410598053dab0ea0d5ac8bb750a7515c5974169d" exitCode=0 Oct 02 11:05:49 crc kubenswrapper[4766]: I1002 11:05:49.461786 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbg9r" event={"ID":"0cd7869a-ede9-4781-b2d2-9dfca0134699","Type":"ContainerDied","Data":"a196b2276ed474afea3a791b410598053dab0ea0d5ac8bb750a7515c5974169d"} Oct 02 11:05:57 crc kubenswrapper[4766]: I1002 11:05:57.508321 4766 generic.go:334] "Generic (PLEG): container finished" podID="0cd7869a-ede9-4781-b2d2-9dfca0134699" containerID="eadee57c98de7210607f34cea08391ff96d4b619fefb9fd754d80bb145685971" exitCode=0 Oct 02 11:05:57 crc kubenswrapper[4766]: I1002 11:05:57.508667 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbg9r" event={"ID":"0cd7869a-ede9-4781-b2d2-9dfca0134699","Type":"ContainerDied","Data":"eadee57c98de7210607f34cea08391ff96d4b619fefb9fd754d80bb145685971"} Oct 02 11:05:59 crc kubenswrapper[4766]: I1002 11:05:59.521359 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbg9r" event={"ID":"0cd7869a-ede9-4781-b2d2-9dfca0134699","Type":"ContainerStarted","Data":"9a9ef589b0b12559af8cd1f1b419ae1595408754f9c9e3f2186b380fc8a159eb"} Oct 02 11:05:59 crc kubenswrapper[4766]: I1002 11:05:59.537791 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dbg9r" podStartSLOduration=3.133955303 podStartE2EDuration="12.537774188s" podCreationTimestamp="2025-10-02 11:05:47 +0000 UTC" firstStartedPulling="2025-10-02 11:05:49.465834745 +0000 UTC m=+864.408705689" lastFinishedPulling="2025-10-02 11:05:58.86965363 +0000 UTC m=+873.812524574" observedRunningTime="2025-10-02 11:05:59.535360921 +0000 UTC m=+874.478231865" watchObservedRunningTime="2025-10-02 11:05:59.537774188 +0000 UTC m=+874.480645132" Oct 02 11:06:02 crc kubenswrapper[4766]: I1002 11:06:02.977714 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k9rz5"] Oct 02 11:06:02 crc kubenswrapper[4766]: I1002 11:06:02.979922 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k9rz5" Oct 02 11:06:02 crc kubenswrapper[4766]: I1002 11:06:02.995692 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k9rz5"] Oct 02 11:06:03 crc kubenswrapper[4766]: I1002 11:06:03.140491 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5td8r\" (UniqueName: \"kubernetes.io/projected/58c3749c-9861-41e7-92f7-374e1e21ef6b-kube-api-access-5td8r\") pod \"certified-operators-k9rz5\" (UID: \"58c3749c-9861-41e7-92f7-374e1e21ef6b\") " pod="openshift-marketplace/certified-operators-k9rz5" Oct 02 11:06:03 crc kubenswrapper[4766]: I1002 11:06:03.140573 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58c3749c-9861-41e7-92f7-374e1e21ef6b-utilities\") pod \"certified-operators-k9rz5\" (UID: \"58c3749c-9861-41e7-92f7-374e1e21ef6b\") " pod="openshift-marketplace/certified-operators-k9rz5" Oct 02 11:06:03 crc kubenswrapper[4766]: I1002 11:06:03.140641 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58c3749c-9861-41e7-92f7-374e1e21ef6b-catalog-content\") pod \"certified-operators-k9rz5\" (UID: \"58c3749c-9861-41e7-92f7-374e1e21ef6b\") " pod="openshift-marketplace/certified-operators-k9rz5" Oct 02 11:06:03 crc kubenswrapper[4766]: I1002 11:06:03.242373 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58c3749c-9861-41e7-92f7-374e1e21ef6b-catalog-content\") pod \"certified-operators-k9rz5\" (UID: \"58c3749c-9861-41e7-92f7-374e1e21ef6b\") " pod="openshift-marketplace/certified-operators-k9rz5" Oct 02 11:06:03 crc kubenswrapper[4766]: I1002 11:06:03.242468 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5td8r\" (UniqueName: \"kubernetes.io/projected/58c3749c-9861-41e7-92f7-374e1e21ef6b-kube-api-access-5td8r\") pod \"certified-operators-k9rz5\" (UID: \"58c3749c-9861-41e7-92f7-374e1e21ef6b\") " pod="openshift-marketplace/certified-operators-k9rz5" Oct 02 11:06:03 crc kubenswrapper[4766]: I1002 11:06:03.242525 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58c3749c-9861-41e7-92f7-374e1e21ef6b-utilities\") pod \"certified-operators-k9rz5\" (UID: \"58c3749c-9861-41e7-92f7-374e1e21ef6b\") " pod="openshift-marketplace/certified-operators-k9rz5" Oct 02 11:06:03 crc kubenswrapper[4766]: I1002 11:06:03.242972 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58c3749c-9861-41e7-92f7-374e1e21ef6b-catalog-content\") pod \"certified-operators-k9rz5\" (UID: \"58c3749c-9861-41e7-92f7-374e1e21ef6b\") " pod="openshift-marketplace/certified-operators-k9rz5" Oct 02 11:06:03 crc kubenswrapper[4766]: I1002 11:06:03.243223 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58c3749c-9861-41e7-92f7-374e1e21ef6b-utilities\") pod \"certified-operators-k9rz5\" (UID: \"58c3749c-9861-41e7-92f7-374e1e21ef6b\") " pod="openshift-marketplace/certified-operators-k9rz5" Oct 02 11:06:03 crc kubenswrapper[4766]: I1002 11:06:03.281814 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-5td8r\" (UniqueName: \"kubernetes.io/projected/58c3749c-9861-41e7-92f7-374e1e21ef6b-kube-api-access-5td8r\") pod \"certified-operators-k9rz5\" (UID: \"58c3749c-9861-41e7-92f7-374e1e21ef6b\") " pod="openshift-marketplace/certified-operators-k9rz5" Oct 02 11:06:03 crc kubenswrapper[4766]: I1002 11:06:03.301446 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k9rz5" Oct 02 11:06:03 crc kubenswrapper[4766]: I1002 11:06:03.596920 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k9rz5"] Oct 02 11:06:03 crc kubenswrapper[4766]: W1002 11:06:03.610784 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58c3749c_9861_41e7_92f7_374e1e21ef6b.slice/crio-046c64b4252a88a1afa85b25ba5b6353f26d63844f3cd8cd461b979d7444ddc5 WatchSource:0}: Error finding container 046c64b4252a88a1afa85b25ba5b6353f26d63844f3cd8cd461b979d7444ddc5: Status 404 returned error can't find the container with id 046c64b4252a88a1afa85b25ba5b6353f26d63844f3cd8cd461b979d7444ddc5 Oct 02 11:06:04 crc kubenswrapper[4766]: I1002 11:06:04.550993 4766 generic.go:334] "Generic (PLEG): container finished" podID="58c3749c-9861-41e7-92f7-374e1e21ef6b" containerID="62f31ad13b4f981552794c8a27f43f3cf56acf0352b80ee61c7dc67450c6e346" exitCode=0 Oct 02 11:06:04 crc kubenswrapper[4766]: I1002 11:06:04.551041 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9rz5" event={"ID":"58c3749c-9861-41e7-92f7-374e1e21ef6b","Type":"ContainerDied","Data":"62f31ad13b4f981552794c8a27f43f3cf56acf0352b80ee61c7dc67450c6e346"} Oct 02 11:06:04 crc kubenswrapper[4766]: I1002 11:06:04.551259 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9rz5" event={"ID":"58c3749c-9861-41e7-92f7-374e1e21ef6b","Type":"ContainerStarted","Data":"046c64b4252a88a1afa85b25ba5b6353f26d63844f3cd8cd461b979d7444ddc5"} Oct 02 11:06:07 crc kubenswrapper[4766]: I1002 11:06:07.619393 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:06:07 crc kubenswrapper[4766]: I1002 11:06:07.619766 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:06:07 crc kubenswrapper[4766]: I1002 11:06:07.655605 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:06:08 crc kubenswrapper[4766]: I1002 11:06:08.616136 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:06:10 crc kubenswrapper[4766]: I1002 11:06:10.587328 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9rz5" event={"ID":"58c3749c-9861-41e7-92f7-374e1e21ef6b","Type":"ContainerStarted","Data":"074c231838035ece9a770fa1a2dd5ceb74d9ac60b5bb9516c39adf6c3d7812a5"} Oct 02 11:06:11 crc kubenswrapper[4766]: I1002 11:06:11.593845 4766 generic.go:334] "Generic (PLEG): container finished" podID="58c3749c-9861-41e7-92f7-374e1e21ef6b" containerID="074c231838035ece9a770fa1a2dd5ceb74d9ac60b5bb9516c39adf6c3d7812a5" exitCode=0 Oct 02 11:06:11 crc kubenswrapper[4766]: I1002 11:06:11.593915 4766 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-k9rz5" event={"ID":"58c3749c-9861-41e7-92f7-374e1e21ef6b","Type":"ContainerDied","Data":"074c231838035ece9a770fa1a2dd5ceb74d9ac60b5bb9516c39adf6c3d7812a5"} Oct 02 11:06:11 crc kubenswrapper[4766]: I1002 11:06:11.946852 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dbg9r"] Oct 02 11:06:11 crc kubenswrapper[4766]: I1002 11:06:11.947058 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dbg9r" podUID="0cd7869a-ede9-4781-b2d2-9dfca0134699" containerName="registry-server" containerID="cri-o://9a9ef589b0b12559af8cd1f1b419ae1595408754f9c9e3f2186b380fc8a159eb" gracePeriod=2 Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.364681 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.375963 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd7869a-ede9-4781-b2d2-9dfca0134699-catalog-content\") pod \"0cd7869a-ede9-4781-b2d2-9dfca0134699\" (UID: \"0cd7869a-ede9-4781-b2d2-9dfca0134699\") " Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.376015 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nn2z\" (UniqueName: \"kubernetes.io/projected/0cd7869a-ede9-4781-b2d2-9dfca0134699-kube-api-access-2nn2z\") pod \"0cd7869a-ede9-4781-b2d2-9dfca0134699\" (UID: \"0cd7869a-ede9-4781-b2d2-9dfca0134699\") " Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.376078 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd7869a-ede9-4781-b2d2-9dfca0134699-utilities\") pod \"0cd7869a-ede9-4781-b2d2-9dfca0134699\" (UID: \"0cd7869a-ede9-4781-b2d2-9dfca0134699\") " Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.377040 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cd7869a-ede9-4781-b2d2-9dfca0134699-utilities" (OuterVolumeSpecName: "utilities") pod "0cd7869a-ede9-4781-b2d2-9dfca0134699" (UID: "0cd7869a-ede9-4781-b2d2-9dfca0134699"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.385384 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cd7869a-ede9-4781-b2d2-9dfca0134699-kube-api-access-2nn2z" (OuterVolumeSpecName: "kube-api-access-2nn2z") pod "0cd7869a-ede9-4781-b2d2-9dfca0134699" (UID: "0cd7869a-ede9-4781-b2d2-9dfca0134699"). InnerVolumeSpecName "kube-api-access-2nn2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.431982 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cd7869a-ede9-4781-b2d2-9dfca0134699-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cd7869a-ede9-4781-b2d2-9dfca0134699" (UID: "0cd7869a-ede9-4781-b2d2-9dfca0134699"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.477514 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cd7869a-ede9-4781-b2d2-9dfca0134699-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.477561 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cd7869a-ede9-4781-b2d2-9dfca0134699-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.477575 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nn2z\" (UniqueName: \"kubernetes.io/projected/0cd7869a-ede9-4781-b2d2-9dfca0134699-kube-api-access-2nn2z\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.600649 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9rz5" event={"ID":"58c3749c-9861-41e7-92f7-374e1e21ef6b","Type":"ContainerStarted","Data":"3aab01b55b49e0bcfbce80fd0e22370f46cfba66bb195c9cc6cd13cb2d12b650"} Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.602330 4766 generic.go:334] "Generic (PLEG): container finished" podID="0cd7869a-ede9-4781-b2d2-9dfca0134699" containerID="9a9ef589b0b12559af8cd1f1b419ae1595408754f9c9e3f2186b380fc8a159eb" exitCode=0 Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.602358 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbg9r" event={"ID":"0cd7869a-ede9-4781-b2d2-9dfca0134699","Type":"ContainerDied","Data":"9a9ef589b0b12559af8cd1f1b419ae1595408754f9c9e3f2186b380fc8a159eb"} Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.602376 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbg9r" event={"ID":"0cd7869a-ede9-4781-b2d2-9dfca0134699","Type":"ContainerDied","Data":"4cb62ab769c86a7d4545eb8191718731ce7f56911a5cf7e3daf45b63570dada3"} Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.602391 4766 scope.go:117] "RemoveContainer" containerID="9a9ef589b0b12559af8cd1f1b419ae1595408754f9c9e3f2186b380fc8a159eb" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.602411 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dbg9r" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.617251 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k9rz5" podStartSLOduration=2.913532342 podStartE2EDuration="10.61723389s" podCreationTimestamp="2025-10-02 11:06:02 +0000 UTC" firstStartedPulling="2025-10-02 11:06:04.55300797 +0000 UTC m=+879.495878914" lastFinishedPulling="2025-10-02 11:06:12.256709518 +0000 UTC m=+887.199580462" observedRunningTime="2025-10-02 11:06:12.617178919 +0000 UTC m=+887.560049883" watchObservedRunningTime="2025-10-02 11:06:12.61723389 +0000 UTC m=+887.560104834" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.618271 4766 scope.go:117] "RemoveContainer" containerID="eadee57c98de7210607f34cea08391ff96d4b619fefb9fd754d80bb145685971" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.635645 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dbg9r"] Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.637284 4766 scope.go:117] "RemoveContainer" containerID="a196b2276ed474afea3a791b410598053dab0ea0d5ac8bb750a7515c5974169d" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.639948 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dbg9r"] Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.654441 4766 scope.go:117] "RemoveContainer" containerID="9a9ef589b0b12559af8cd1f1b419ae1595408754f9c9e3f2186b380fc8a159eb" Oct 02 11:06:12 crc kubenswrapper[4766]: E1002 11:06:12.655014 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a9ef589b0b12559af8cd1f1b419ae1595408754f9c9e3f2186b380fc8a159eb\": container with ID starting with 9a9ef589b0b12559af8cd1f1b419ae1595408754f9c9e3f2186b380fc8a159eb not found: ID does not exist" containerID="9a9ef589b0b12559af8cd1f1b419ae1595408754f9c9e3f2186b380fc8a159eb" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.655070 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a9ef589b0b12559af8cd1f1b419ae1595408754f9c9e3f2186b380fc8a159eb"} err="failed to get container status \"9a9ef589b0b12559af8cd1f1b419ae1595408754f9c9e3f2186b380fc8a159eb\": rpc error: code = NotFound desc = could not find container \"9a9ef589b0b12559af8cd1f1b419ae1595408754f9c9e3f2186b380fc8a159eb\": container with ID starting with 9a9ef589b0b12559af8cd1f1b419ae1595408754f9c9e3f2186b380fc8a159eb not found: ID does not exist" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.655094 4766 scope.go:117] "RemoveContainer" containerID="eadee57c98de7210607f34cea08391ff96d4b619fefb9fd754d80bb145685971" Oct 02 11:06:12 crc kubenswrapper[4766]: E1002 11:06:12.655483 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eadee57c98de7210607f34cea08391ff96d4b619fefb9fd754d80bb145685971\": container with ID starting with eadee57c98de7210607f34cea08391ff96d4b619fefb9fd754d80bb145685971 not found: ID does not exist" containerID="eadee57c98de7210607f34cea08391ff96d4b619fefb9fd754d80bb145685971" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.655538 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eadee57c98de7210607f34cea08391ff96d4b619fefb9fd754d80bb145685971"} err="failed to get container 
status \"eadee57c98de7210607f34cea08391ff96d4b619fefb9fd754d80bb145685971\": rpc error: code = NotFound desc = could not find container \"eadee57c98de7210607f34cea08391ff96d4b619fefb9fd754d80bb145685971\": container with ID starting with eadee57c98de7210607f34cea08391ff96d4b619fefb9fd754d80bb145685971 not found: ID does not exist" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.655560 4766 scope.go:117] "RemoveContainer" containerID="a196b2276ed474afea3a791b410598053dab0ea0d5ac8bb750a7515c5974169d" Oct 02 11:06:12 crc kubenswrapper[4766]: E1002 11:06:12.656018 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a196b2276ed474afea3a791b410598053dab0ea0d5ac8bb750a7515c5974169d\": container with ID starting with a196b2276ed474afea3a791b410598053dab0ea0d5ac8bb750a7515c5974169d not found: ID does not exist" containerID="a196b2276ed474afea3a791b410598053dab0ea0d5ac8bb750a7515c5974169d" Oct 02 11:06:12 crc kubenswrapper[4766]: I1002 11:06:12.656051 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a196b2276ed474afea3a791b410598053dab0ea0d5ac8bb750a7515c5974169d"} err="failed to get container status \"a196b2276ed474afea3a791b410598053dab0ea0d5ac8bb750a7515c5974169d\": rpc error: code = NotFound desc = could not find container \"a196b2276ed474afea3a791b410598053dab0ea0d5ac8bb750a7515c5974169d\": container with ID starting with a196b2276ed474afea3a791b410598053dab0ea0d5ac8bb750a7515c5974169d not found: ID does not exist" Oct 02 11:06:13 crc kubenswrapper[4766]: I1002 11:06:13.302032 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k9rz5" Oct 02 11:06:13 crc kubenswrapper[4766]: I1002 11:06:13.302097 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k9rz5" Oct 02 11:06:13 crc kubenswrapper[4766]: I1002 11:06:13.887097 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cd7869a-ede9-4781-b2d2-9dfca0134699" path="/var/lib/kubelet/pods/0cd7869a-ede9-4781-b2d2-9dfca0134699/volumes" Oct 02 11:06:14 crc kubenswrapper[4766]: I1002 11:06:14.347626 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-k9rz5" podUID="58c3749c-9861-41e7-92f7-374e1e21ef6b" containerName="registry-server" probeResult="failure" output=< Oct 02 11:06:14 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Oct 02 11:06:14 crc kubenswrapper[4766]: > Oct 02 11:06:23 crc kubenswrapper[4766]: I1002 11:06:23.348914 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k9rz5" Oct 02 11:06:23 crc kubenswrapper[4766]: I1002 11:06:23.387925 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k9rz5" Oct 02 11:06:23 crc kubenswrapper[4766]: I1002 11:06:23.579762 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k9rz5"] Oct 02 11:06:24 crc kubenswrapper[4766]: I1002 11:06:24.432706 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:06:24 crc 
Oct 02 11:06:24 crc kubenswrapper[4766]: I1002 11:06:24.432771 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 11:06:24 crc kubenswrapper[4766]: I1002 11:06:24.669633 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k9rz5" podUID="58c3749c-9861-41e7-92f7-374e1e21ef6b" containerName="registry-server" containerID="cri-o://3aab01b55b49e0bcfbce80fd0e22370f46cfba66bb195c9cc6cd13cb2d12b650" gracePeriod=2
Oct 02 11:06:24 crc kubenswrapper[4766]: I1002 11:06:24.983770 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k9rz5"
Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.144714 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58c3749c-9861-41e7-92f7-374e1e21ef6b-utilities\") pod \"58c3749c-9861-41e7-92f7-374e1e21ef6b\" (UID: \"58c3749c-9861-41e7-92f7-374e1e21ef6b\") "
Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.144817 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58c3749c-9861-41e7-92f7-374e1e21ef6b-catalog-content\") pod \"58c3749c-9861-41e7-92f7-374e1e21ef6b\" (UID: \"58c3749c-9861-41e7-92f7-374e1e21ef6b\") "
Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.144872 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5td8r\" (UniqueName: \"kubernetes.io/projected/58c3749c-9861-41e7-92f7-374e1e21ef6b-kube-api-access-5td8r\") pod \"58c3749c-9861-41e7-92f7-374e1e21ef6b\" (UID: \"58c3749c-9861-41e7-92f7-374e1e21ef6b\") "
Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.146214 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58c3749c-9861-41e7-92f7-374e1e21ef6b-utilities" (OuterVolumeSpecName: "utilities") pod "58c3749c-9861-41e7-92f7-374e1e21ef6b" (UID: "58c3749c-9861-41e7-92f7-374e1e21ef6b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.153635 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58c3749c-9861-41e7-92f7-374e1e21ef6b-kube-api-access-5td8r" (OuterVolumeSpecName: "kube-api-access-5td8r") pod "58c3749c-9861-41e7-92f7-374e1e21ef6b" (UID: "58c3749c-9861-41e7-92f7-374e1e21ef6b"). InnerVolumeSpecName "kube-api-access-5td8r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.189420 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58c3749c-9861-41e7-92f7-374e1e21ef6b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58c3749c-9861-41e7-92f7-374e1e21ef6b" (UID: "58c3749c-9861-41e7-92f7-374e1e21ef6b"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.246740 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5td8r\" (UniqueName: \"kubernetes.io/projected/58c3749c-9861-41e7-92f7-374e1e21ef6b-kube-api-access-5td8r\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.246795 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58c3749c-9861-41e7-92f7-374e1e21ef6b-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.246809 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58c3749c-9861-41e7-92f7-374e1e21ef6b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.676620 4766 generic.go:334] "Generic (PLEG): container finished" podID="58c3749c-9861-41e7-92f7-374e1e21ef6b" containerID="3aab01b55b49e0bcfbce80fd0e22370f46cfba66bb195c9cc6cd13cb2d12b650" exitCode=0 Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.676664 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9rz5" event={"ID":"58c3749c-9861-41e7-92f7-374e1e21ef6b","Type":"ContainerDied","Data":"3aab01b55b49e0bcfbce80fd0e22370f46cfba66bb195c9cc6cd13cb2d12b650"} Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.676673 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k9rz5" Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.676691 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k9rz5" event={"ID":"58c3749c-9861-41e7-92f7-374e1e21ef6b","Type":"ContainerDied","Data":"046c64b4252a88a1afa85b25ba5b6353f26d63844f3cd8cd461b979d7444ddc5"} Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.676715 4766 scope.go:117] "RemoveContainer" containerID="3aab01b55b49e0bcfbce80fd0e22370f46cfba66bb195c9cc6cd13cb2d12b650" Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.693620 4766 scope.go:117] "RemoveContainer" containerID="074c231838035ece9a770fa1a2dd5ceb74d9ac60b5bb9516c39adf6c3d7812a5" Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.704218 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k9rz5"] Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.708833 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k9rz5"] Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.722144 4766 scope.go:117] "RemoveContainer" containerID="62f31ad13b4f981552794c8a27f43f3cf56acf0352b80ee61c7dc67450c6e346" Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.733252 4766 scope.go:117] "RemoveContainer" containerID="3aab01b55b49e0bcfbce80fd0e22370f46cfba66bb195c9cc6cd13cb2d12b650" Oct 02 11:06:25 crc kubenswrapper[4766]: E1002 11:06:25.733566 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aab01b55b49e0bcfbce80fd0e22370f46cfba66bb195c9cc6cd13cb2d12b650\": container with ID starting with 3aab01b55b49e0bcfbce80fd0e22370f46cfba66bb195c9cc6cd13cb2d12b650 not found: ID does not exist" containerID="3aab01b55b49e0bcfbce80fd0e22370f46cfba66bb195c9cc6cd13cb2d12b650" Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.733597 
4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aab01b55b49e0bcfbce80fd0e22370f46cfba66bb195c9cc6cd13cb2d12b650"} err="failed to get container status \"3aab01b55b49e0bcfbce80fd0e22370f46cfba66bb195c9cc6cd13cb2d12b650\": rpc error: code = NotFound desc = could not find container \"3aab01b55b49e0bcfbce80fd0e22370f46cfba66bb195c9cc6cd13cb2d12b650\": container with ID starting with 3aab01b55b49e0bcfbce80fd0e22370f46cfba66bb195c9cc6cd13cb2d12b650 not found: ID does not exist" Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.733618 4766 scope.go:117] "RemoveContainer" containerID="074c231838035ece9a770fa1a2dd5ceb74d9ac60b5bb9516c39adf6c3d7812a5" Oct 02 11:06:25 crc kubenswrapper[4766]: E1002 11:06:25.733863 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"074c231838035ece9a770fa1a2dd5ceb74d9ac60b5bb9516c39adf6c3d7812a5\": container with ID starting with 074c231838035ece9a770fa1a2dd5ceb74d9ac60b5bb9516c39adf6c3d7812a5 not found: ID does not exist" containerID="074c231838035ece9a770fa1a2dd5ceb74d9ac60b5bb9516c39adf6c3d7812a5" Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.733905 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"074c231838035ece9a770fa1a2dd5ceb74d9ac60b5bb9516c39adf6c3d7812a5"} err="failed to get container status \"074c231838035ece9a770fa1a2dd5ceb74d9ac60b5bb9516c39adf6c3d7812a5\": rpc error: code = NotFound desc = could not find container \"074c231838035ece9a770fa1a2dd5ceb74d9ac60b5bb9516c39adf6c3d7812a5\": container with ID starting with 074c231838035ece9a770fa1a2dd5ceb74d9ac60b5bb9516c39adf6c3d7812a5 not found: ID does not exist" Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.733940 4766 scope.go:117] "RemoveContainer" containerID="62f31ad13b4f981552794c8a27f43f3cf56acf0352b80ee61c7dc67450c6e346" Oct 02 11:06:25 crc kubenswrapper[4766]: E1002 11:06:25.734208 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62f31ad13b4f981552794c8a27f43f3cf56acf0352b80ee61c7dc67450c6e346\": container with ID starting with 62f31ad13b4f981552794c8a27f43f3cf56acf0352b80ee61c7dc67450c6e346 not found: ID does not exist" containerID="62f31ad13b4f981552794c8a27f43f3cf56acf0352b80ee61c7dc67450c6e346" Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.734228 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62f31ad13b4f981552794c8a27f43f3cf56acf0352b80ee61c7dc67450c6e346"} err="failed to get container status \"62f31ad13b4f981552794c8a27f43f3cf56acf0352b80ee61c7dc67450c6e346\": rpc error: code = NotFound desc = could not find container \"62f31ad13b4f981552794c8a27f43f3cf56acf0352b80ee61c7dc67450c6e346\": container with ID starting with 62f31ad13b4f981552794c8a27f43f3cf56acf0352b80ee61c7dc67450c6e346 not found: ID does not exist" Oct 02 11:06:25 crc kubenswrapper[4766]: I1002 11:06:25.892183 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58c3749c-9861-41e7-92f7-374e1e21ef6b" path="/var/lib/kubelet/pods/58c3749c-9861-41e7-92f7-374e1e21ef6b/volumes" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.017278 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-27vgl"] Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.018303 4766 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovn-controller" containerID="cri-o://394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b" gracePeriod=30 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.018364 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="nbdb" containerID="cri-o://d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7" gracePeriod=30 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.018434 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="northd" containerID="cri-o://236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7" gracePeriod=30 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.018481 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f" gracePeriod=30 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.018557 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="kube-rbac-proxy-node" containerID="cri-o://c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a" gracePeriod=30 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.018610 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovn-acl-logging" containerID="cri-o://649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3" gracePeriod=30 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.018691 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="sbdb" containerID="cri-o://8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69" gracePeriod=30 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.075146 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovnkube-controller" containerID="cri-o://5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8" gracePeriod=30 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.407601 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovnkube-controller/3.log" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.410697 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovn-acl-logging/0.log" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.411349 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovn-controller/0.log" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.411766 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.468994 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bfb4z"]
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469021 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-systemd\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") "
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469068 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-systemd-units\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") "
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469088 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-var-lib-openvswitch\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") "
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469116 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovnkube-config\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") "
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469145 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-ovn\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") "
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469165 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fb7h\" (UniqueName: \"kubernetes.io/projected/11cc785e-5bdc-4827-913a-4d899eb5a83c-kube-api-access-9fb7h\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") "
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469182 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-env-overrides\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") "
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469200 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-run-netns\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") "
Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.469208 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c3749c-9861-41e7-92f7-374e1e21ef6b" containerName="extract-utilities"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469221 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c3749c-9861-41e7-92f7-374e1e21ef6b" containerName="extract-utilities"
Oct 02
11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469228 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-cni-netd\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469244 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-run-ovn-kubernetes\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469264 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-openvswitch\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469284 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-node-log\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469280 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469311 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovnkube-script-lib\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469323 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-cni-bin\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469350 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-kubelet\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469382 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovn-node-metrics-cert\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469398 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-log-socket\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469415 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-slash\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469431 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-etc-openvswitch\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469451 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"11cc785e-5bdc-4827-913a-4d899eb5a83c\" (UID: \"11cc785e-5bdc-4827-913a-4d899eb5a83c\") " Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469633 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-node-log" (OuterVolumeSpecName: "node-log") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469665 4766 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469701 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469736 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-log-socket" (OuterVolumeSpecName: "log-socket") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469756 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-slash" (OuterVolumeSpecName: "host-slash") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469771 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469790 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469810 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.469827 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470289 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470317 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470336 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470320 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.469230 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovnkube-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470366 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovnkube-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.470383 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovn-acl-logging" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470390 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovn-acl-logging" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.470404 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovnkube-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470412 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovnkube-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.470420 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovnkube-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470425 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovnkube-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.470436 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="58c3749c-9861-41e7-92f7-374e1e21ef6b" containerName="extract-content" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470443 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c3749c-9861-41e7-92f7-374e1e21ef6b" containerName="extract-content" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470375 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470409 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.470454 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="sbdb" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470534 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="sbdb" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.470563 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd7869a-ede9-4781-b2d2-9dfca0134699" containerName="extract-utilities" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470573 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd7869a-ede9-4781-b2d2-9dfca0134699" containerName="extract-utilities" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.470585 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="nbdb" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470592 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="nbdb" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.470607 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="northd" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470615 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="northd" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.470626 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd7869a-ede9-4781-b2d2-9dfca0134699" containerName="registry-server" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470634 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd7869a-ede9-4781-b2d2-9dfca0134699" containerName="registry-server" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.470648 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="kubecfg-setup" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470656 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="kubecfg-setup" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 
11:06:32.470669 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="kube-rbac-proxy-node" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470677 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="kube-rbac-proxy-node" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.470685 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd7869a-ede9-4781-b2d2-9dfca0134699" containerName="extract-content" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470692 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd7869a-ede9-4781-b2d2-9dfca0134699" containerName="extract-content" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.470699 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58c3749c-9861-41e7-92f7-374e1e21ef6b" containerName="registry-server" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470705 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="58c3749c-9861-41e7-92f7-374e1e21ef6b" containerName="registry-server" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.470717 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovn-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470723 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovn-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.470731 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470736 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470774 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470829 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470929 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovnkube-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470942 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovnkube-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470949 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="58c3749c-9861-41e7-92f7-374e1e21ef6b" containerName="registry-server" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470956 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovn-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470963 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="northd" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470970 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="kube-rbac-proxy-ovn-metrics" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470977 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cd7869a-ede9-4781-b2d2-9dfca0134699" containerName="registry-server" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470984 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovnkube-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.470992 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="kube-rbac-proxy-node" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.471000 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovnkube-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.471007 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="sbdb" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.471020 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovn-acl-logging" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.471030 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="nbdb" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.471140 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovnkube-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.471150 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovnkube-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.471159 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovnkube-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.471166 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovnkube-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 
11:06:32.471264 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerName="ovnkube-controller" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.472968 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.475932 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11cc785e-5bdc-4827-913a-4d899eb5a83c-kube-api-access-9fb7h" (OuterVolumeSpecName: "kube-api-access-9fb7h") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "kube-api-access-9fb7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.479432 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.484897 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "11cc785e-5bdc-4827-913a-4d899eb5a83c" (UID: "11cc785e-5bdc-4827-913a-4d899eb5a83c"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571007 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-cni-netd\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571076 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/635fd08a-5ab5-4684-adba-65ea28f8c2bc-env-overrides\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571107 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-run-openvswitch\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571167 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571201 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-kubelet\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571226 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/635fd08a-5ab5-4684-adba-65ea28f8c2bc-ovnkube-config\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571256 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-systemd-units\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571277 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-cni-bin\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571349 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-slash\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571441 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/635fd08a-5ab5-4684-adba-65ea28f8c2bc-ovnkube-script-lib\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571547 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-var-lib-openvswitch\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571586 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/635fd08a-5ab5-4684-adba-65ea28f8c2bc-ovn-node-metrics-cert\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571618 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-run-ovn-kubernetes\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571651 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-log-socket\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571676 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-run-ovn\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571697 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-node-log\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571719 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8tvd\" (UniqueName: \"kubernetes.io/projected/635fd08a-5ab5-4684-adba-65ea28f8c2bc-kube-api-access-c8tvd\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571744 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-run-systemd\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571765 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-run-netns\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571791 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-etc-openvswitch\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571907 4766 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-slash\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571928 4766 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571940 4766 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-var-lib-cni-networks-ovn-kubernetes\") on node 
\"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571957 4766 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571985 4766 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.571997 4766 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.572009 4766 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.572021 4766 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.572031 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fb7h\" (UniqueName: \"kubernetes.io/projected/11cc785e-5bdc-4827-913a-4d899eb5a83c-kube-api-access-9fb7h\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.572042 4766 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.572055 4766 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.572065 4766 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.572088 4766 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.572099 4766 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-node-log\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.572111 4766 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.572125 4766 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc 
kubenswrapper[4766]: I1002 11:06:32.572136 4766 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.572147 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/11cc785e-5bdc-4827-913a-4d899eb5a83c-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.572160 4766 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/11cc785e-5bdc-4827-913a-4d899eb5a83c-log-socket\") on node \"crc\" DevicePath \"\"" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673074 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673131 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-kubelet\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673154 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/635fd08a-5ab5-4684-adba-65ea28f8c2bc-ovnkube-config\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673179 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-systemd-units\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673197 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-cni-bin\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673215 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-slash\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673234 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/635fd08a-5ab5-4684-adba-65ea28f8c2bc-ovnkube-script-lib\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: 
I1002 11:06:32.673239 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673289 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-var-lib-openvswitch\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673304 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-systemd-units\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673251 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-var-lib-openvswitch\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673239 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-kubelet\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673323 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-slash\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673343 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/635fd08a-5ab5-4684-adba-65ea28f8c2bc-ovn-node-metrics-cert\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673373 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-run-ovn-kubernetes\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673393 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-log-socket\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673403 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-run-ovn-kubernetes\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673445 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-run-ovn\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673459 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-log-socket\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673333 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-cni-bin\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673416 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-run-ovn\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673581 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-node-log\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673611 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8tvd\" (UniqueName: \"kubernetes.io/projected/635fd08a-5ab5-4684-adba-65ea28f8c2bc-kube-api-access-c8tvd\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673641 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-run-systemd\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673663 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-run-netns\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673667 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-node-log\") pod 
\"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673725 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-run-netns\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673731 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-run-systemd\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673755 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-etc-openvswitch\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673768 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-etc-openvswitch\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673805 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-cni-netd\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673838 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/635fd08a-5ab5-4684-adba-65ea28f8c2bc-env-overrides\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673866 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-run-openvswitch\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673938 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-host-cni-netd\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.673956 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/635fd08a-5ab5-4684-adba-65ea28f8c2bc-run-openvswitch\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc 
kubenswrapper[4766]: I1002 11:06:32.674489 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/635fd08a-5ab5-4684-adba-65ea28f8c2bc-env-overrides\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.674586 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/635fd08a-5ab5-4684-adba-65ea28f8c2bc-ovnkube-script-lib\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.674589 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/635fd08a-5ab5-4684-adba-65ea28f8c2bc-ovnkube-config\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.678233 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/635fd08a-5ab5-4684-adba-65ea28f8c2bc-ovn-node-metrics-cert\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.695038 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8tvd\" (UniqueName: \"kubernetes.io/projected/635fd08a-5ab5-4684-adba-65ea28f8c2bc-kube-api-access-c8tvd\") pod \"ovnkube-node-bfb4z\" (UID: \"635fd08a-5ab5-4684-adba-65ea28f8c2bc\") " pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.718460 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2jxdg_a6aa81c2-8c87-43df-badb-7b9dbef84ccf/kube-multus/2.log" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.718932 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2jxdg_a6aa81c2-8c87-43df-badb-7b9dbef84ccf/kube-multus/1.log" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.718985 4766 generic.go:334] "Generic (PLEG): container finished" podID="a6aa81c2-8c87-43df-badb-7b9dbef84ccf" containerID="1f29fcf0f6187d7194dae698016fffc20d300b88e7467e1fcd6a97ddd9243ac7" exitCode=2 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.719053 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2jxdg" event={"ID":"a6aa81c2-8c87-43df-badb-7b9dbef84ccf","Type":"ContainerDied","Data":"1f29fcf0f6187d7194dae698016fffc20d300b88e7467e1fcd6a97ddd9243ac7"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.719091 4766 scope.go:117] "RemoveContainer" containerID="ff77e4fb340919ea122bf7a3ecdab638bcb0d9dde19ec12b466f14a2cf2e753f" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.719606 4766 scope.go:117] "RemoveContainer" containerID="1f29fcf0f6187d7194dae698016fffc20d300b88e7467e1fcd6a97ddd9243ac7" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.723037 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovnkube-controller/3.log" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.730670 4766 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovn-acl-logging/0.log" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731095 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-27vgl_11cc785e-5bdc-4827-913a-4d899eb5a83c/ovn-controller/0.log" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731436 4766 generic.go:334] "Generic (PLEG): container finished" podID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerID="5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8" exitCode=0 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731470 4766 generic.go:334] "Generic (PLEG): container finished" podID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerID="8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69" exitCode=0 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731482 4766 generic.go:334] "Generic (PLEG): container finished" podID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerID="d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7" exitCode=0 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731493 4766 generic.go:334] "Generic (PLEG): container finished" podID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerID="236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7" exitCode=0 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731516 4766 generic.go:334] "Generic (PLEG): container finished" podID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerID="82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f" exitCode=0 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731525 4766 generic.go:334] "Generic (PLEG): container finished" podID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerID="c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a" exitCode=0 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731536 4766 generic.go:334] "Generic (PLEG): container finished" podID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerID="649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3" exitCode=143 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731530 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerDied","Data":"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731565 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731570 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerDied","Data":"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731664 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerDied","Data":"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731545 4766 generic.go:334] "Generic (PLEG): container finished" podID="11cc785e-5bdc-4827-913a-4d899eb5a83c" containerID="394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b" exitCode=143 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731683 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerDied","Data":"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731695 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerDied","Data":"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731725 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerDied","Data":"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731738 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731751 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731758 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731765 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731771 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731779 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731785 4766 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731792 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731798 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731804 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731813 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerDied","Data":"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731822 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731829 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731836 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731842 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731848 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731856 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731862 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731868 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731883 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731889 4766 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731899 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerDied","Data":"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731913 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731920 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731926 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731933 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731940 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731946 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731953 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731961 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731967 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731973 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731983 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-27vgl" event={"ID":"11cc785e-5bdc-4827-913a-4d899eb5a83c","Type":"ContainerDied","Data":"ef3a0757191602abc70c5d4b1c5a8b503649299161f2b42799c0b1d3c4d87647"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.731992 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.732000 4766 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.732007 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.732014 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.732020 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.732027 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.732034 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.732040 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.732047 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.732053 4766 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d"} Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.756378 4766 scope.go:117] "RemoveContainer" containerID="5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.773591 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-27vgl"] Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.775132 4766 scope.go:117] "RemoveContainer" containerID="a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.777353 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-27vgl"] Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.790802 4766 scope.go:117] "RemoveContainer" containerID="8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.794030 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.812105 4766 scope.go:117] "RemoveContainer" containerID="d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7" Oct 02 11:06:32 crc kubenswrapper[4766]: W1002 11:06:32.821596 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod635fd08a_5ab5_4684_adba_65ea28f8c2bc.slice/crio-5cee989d160a04fcb469e658e58ee9060f55108b1c5fbfc2bb940b5f252f8969 WatchSource:0}: Error finding container 5cee989d160a04fcb469e658e58ee9060f55108b1c5fbfc2bb940b5f252f8969: Status 404 returned error can't find the container with id 5cee989d160a04fcb469e658e58ee9060f55108b1c5fbfc2bb940b5f252f8969 Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.827268 4766 scope.go:117] "RemoveContainer" containerID="236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.851752 4766 scope.go:117] "RemoveContainer" containerID="82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.874341 4766 scope.go:117] "RemoveContainer" containerID="c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.893115 4766 scope.go:117] "RemoveContainer" containerID="649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.916349 4766 scope.go:117] "RemoveContainer" containerID="394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.930783 4766 scope.go:117] "RemoveContainer" containerID="897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.945591 4766 scope.go:117] "RemoveContainer" containerID="5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.946228 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8\": container with ID starting with 5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8 not found: ID does not exist" containerID="5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.946277 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8"} err="failed to get container status \"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8\": rpc error: code = NotFound desc = could not find container \"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8\": container with ID starting with 5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8 not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.946304 4766 scope.go:117] "RemoveContainer" containerID="a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.946768 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\": container 
with ID starting with a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560 not found: ID does not exist" containerID="a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.946822 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560"} err="failed to get container status \"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\": rpc error: code = NotFound desc = could not find container \"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\": container with ID starting with a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560 not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.946902 4766 scope.go:117] "RemoveContainer" containerID="8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.947343 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\": container with ID starting with 8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69 not found: ID does not exist" containerID="8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.947373 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69"} err="failed to get container status \"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\": rpc error: code = NotFound desc = could not find container \"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\": container with ID starting with 8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69 not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.947394 4766 scope.go:117] "RemoveContainer" containerID="d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.947832 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\": container with ID starting with d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7 not found: ID does not exist" containerID="d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.947858 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7"} err="failed to get container status \"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\": rpc error: code = NotFound desc = could not find container \"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\": container with ID starting with d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7 not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.947882 4766 scope.go:117] "RemoveContainer" containerID="236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.948158 4766 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\": container with ID starting with 236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7 not found: ID does not exist" containerID="236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.948185 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7"} err="failed to get container status \"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\": rpc error: code = NotFound desc = could not find container \"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\": container with ID starting with 236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7 not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.948201 4766 scope.go:117] "RemoveContainer" containerID="82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.948642 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\": container with ID starting with 82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f not found: ID does not exist" containerID="82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.948663 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f"} err="failed to get container status \"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\": rpc error: code = NotFound desc = could not find container \"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\": container with ID starting with 82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.948677 4766 scope.go:117] "RemoveContainer" containerID="c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.949091 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\": container with ID starting with c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a not found: ID does not exist" containerID="c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.949113 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a"} err="failed to get container status \"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\": rpc error: code = NotFound desc = could not find container \"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\": container with ID starting with c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.949129 4766 scope.go:117] "RemoveContainer" 
containerID="649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.949399 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\": container with ID starting with 649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3 not found: ID does not exist" containerID="649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.949431 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3"} err="failed to get container status \"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\": rpc error: code = NotFound desc = could not find container \"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\": container with ID starting with 649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3 not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.949449 4766 scope.go:117] "RemoveContainer" containerID="394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.949738 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\": container with ID starting with 394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b not found: ID does not exist" containerID="394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.949770 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b"} err="failed to get container status \"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\": rpc error: code = NotFound desc = could not find container \"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\": container with ID starting with 394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.949789 4766 scope.go:117] "RemoveContainer" containerID="897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d" Oct 02 11:06:32 crc kubenswrapper[4766]: E1002 11:06:32.950046 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\": container with ID starting with 897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d not found: ID does not exist" containerID="897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.950063 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d"} err="failed to get container status \"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\": rpc error: code = NotFound desc = could not find container \"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\": container with ID starting with 
897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.950076 4766 scope.go:117] "RemoveContainer" containerID="5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.950294 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8"} err="failed to get container status \"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8\": rpc error: code = NotFound desc = could not find container \"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8\": container with ID starting with 5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8 not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.950318 4766 scope.go:117] "RemoveContainer" containerID="a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.950660 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560"} err="failed to get container status \"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\": rpc error: code = NotFound desc = could not find container \"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\": container with ID starting with a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560 not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.950703 4766 scope.go:117] "RemoveContainer" containerID="8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.951080 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69"} err="failed to get container status \"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\": rpc error: code = NotFound desc = could not find container \"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\": container with ID starting with 8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69 not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.951108 4766 scope.go:117] "RemoveContainer" containerID="d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.951464 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7"} err="failed to get container status \"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\": rpc error: code = NotFound desc = could not find container \"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\": container with ID starting with d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7 not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.951484 4766 scope.go:117] "RemoveContainer" containerID="236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.951768 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7"} err="failed to get container status \"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\": rpc error: code = NotFound desc = could not find container \"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\": container with ID starting with 236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7 not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.951795 4766 scope.go:117] "RemoveContainer" containerID="82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.952041 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f"} err="failed to get container status \"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\": rpc error: code = NotFound desc = could not find container \"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\": container with ID starting with 82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.952061 4766 scope.go:117] "RemoveContainer" containerID="c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.952312 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a"} err="failed to get container status \"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\": rpc error: code = NotFound desc = could not find container \"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\": container with ID starting with c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.952334 4766 scope.go:117] "RemoveContainer" containerID="649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.952612 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3"} err="failed to get container status \"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\": rpc error: code = NotFound desc = could not find container \"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\": container with ID starting with 649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3 not found: ID does not exist" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.952630 4766 scope.go:117] "RemoveContainer" containerID="394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b" Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.952870 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b"} err="failed to get container status \"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\": rpc error: code = NotFound desc = could not find container \"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\": container with ID starting with 394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b not found: ID does not exist" Oct 
02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.952888 4766 scope.go:117] "RemoveContainer" containerID="897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.953179 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d"} err="failed to get container status \"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\": rpc error: code = NotFound desc = could not find container \"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\": container with ID starting with 897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.953198 4766 scope.go:117] "RemoveContainer" containerID="5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.953746 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8"} err="failed to get container status \"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8\": rpc error: code = NotFound desc = could not find container \"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8\": container with ID starting with 5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8 not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.953762 4766 scope.go:117] "RemoveContainer" containerID="a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.954024 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560"} err="failed to get container status \"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\": rpc error: code = NotFound desc = could not find container \"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\": container with ID starting with a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560 not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.954053 4766 scope.go:117] "RemoveContainer" containerID="8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.954591 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69"} err="failed to get container status \"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\": rpc error: code = NotFound desc = could not find container \"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\": container with ID starting with 8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69 not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.954611 4766 scope.go:117] "RemoveContainer" containerID="d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.954871 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7"} err="failed to get container status \"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\": rpc error: code = NotFound desc = could not find container \"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\": container with ID starting with d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7 not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.954929 4766 scope.go:117] "RemoveContainer" containerID="236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.955190 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7"} err="failed to get container status \"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\": rpc error: code = NotFound desc = could not find container \"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\": container with ID starting with 236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7 not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.955204 4766 scope.go:117] "RemoveContainer" containerID="82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.955435 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f"} err="failed to get container status \"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\": rpc error: code = NotFound desc = could not find container \"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\": container with ID starting with 82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.955451 4766 scope.go:117] "RemoveContainer" containerID="c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.955675 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a"} err="failed to get container status \"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\": rpc error: code = NotFound desc = could not find container \"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\": container with ID starting with c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.955693 4766 scope.go:117] "RemoveContainer" containerID="649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.955949 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3"} err="failed to get container status \"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\": rpc error: code = NotFound desc = could not find container \"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\": container with ID starting with 649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3 not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.955974 4766 scope.go:117] "RemoveContainer" containerID="394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.956354 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b"} err="failed to get container status \"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\": rpc error: code = NotFound desc = could not find container \"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\": container with ID starting with 394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.956399 4766 scope.go:117] "RemoveContainer" containerID="897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.956760 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d"} err="failed to get container status \"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\": rpc error: code = NotFound desc = could not find container \"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\": container with ID starting with 897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.956788 4766 scope.go:117] "RemoveContainer" containerID="5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.957063 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8"} err="failed to get container status \"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8\": rpc error: code = NotFound desc = could not find container \"5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8\": container with ID starting with 5f027e2d4bd16ab1cc379f4991282fd73fe4bddd20920ed00adf5c63c897b2e8 not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.957091 4766 scope.go:117] "RemoveContainer" containerID="a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.957344 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560"} err="failed to get container status \"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\": rpc error: code = NotFound desc = could not find container \"a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560\": container with ID starting with a056cb95b425937cee28b4833ba5c972d37336db4c44371a8d3846eb9cf0f560 not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.957366 4766 scope.go:117] "RemoveContainer" containerID="8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.957736 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69"} err="failed to get container status \"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\": rpc error: code = NotFound desc = could not find container \"8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69\": container with ID starting with 8765ca7a5c33b519258dee9ccedc33bcac52da33aa70d5470793eeaf48cb3e69 not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.957764 4766 scope.go:117] "RemoveContainer" containerID="d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.958053 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7"} err="failed to get container status \"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\": rpc error: code = NotFound desc = could not find container \"d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7\": container with ID starting with d6075cadb92fb2b9ec6fe0a2ac2535e26b1c65f9717b7877912bf493096266a7 not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.958081 4766 scope.go:117] "RemoveContainer" containerID="236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.958466 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7"} err="failed to get container status \"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\": rpc error: code = NotFound desc = could not find container \"236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7\": container with ID starting with 236364faf42dd07350b21e9b001d4f798cff8e9e9c40ef821181b573139b0ea7 not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.958485 4766 scope.go:117] "RemoveContainer" containerID="82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.958811 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f"} err="failed to get container status \"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\": rpc error: code = NotFound desc = could not find container \"82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f\": container with ID starting with 82f0c723d607e5ba10945a757a13caf3159583deec2c78eb9eb1ac7e5d58314f not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.958843 4766 scope.go:117] "RemoveContainer" containerID="c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.959109 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a"} err="failed to get container status \"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\": rpc error: code = NotFound desc = could not find container \"c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a\": container with ID starting with c042c48d15e61a72b40f7e2ffb242e0c097d26860bdbb41ec01447b9b35ee43a not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.959131 4766 scope.go:117] "RemoveContainer" containerID="649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.959336 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3"} err="failed to get container status \"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\": rpc error: code = NotFound desc = could not find container \"649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3\": container with ID starting with 649bc7443638347fc664cbc01fd8cf3a4f0eda2e730680802e6cc21f4f8600f3 not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.959365 4766 scope.go:117] "RemoveContainer" containerID="394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.959605 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b"} err="failed to get container status \"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\": rpc error: code = NotFound desc = could not find container \"394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b\": container with ID starting with 394100a15a13cf1ebffb60caff4fe74289bcdf1cf9d5f46692f4c0f0237b6f1b not found: ID does not exist"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.959625 4766 scope.go:117] "RemoveContainer" containerID="897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d"
Oct 02 11:06:32 crc kubenswrapper[4766]: I1002 11:06:32.959840 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d"} err="failed to get container status \"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\": rpc error: code = NotFound desc = could not find container \"897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d\": container with ID starting with 897d4dc945cbe1b3440ceb58116685c63880951487dfe4e120b8a1874c714d3d not found: ID does not exist"
Oct 02 11:06:33 crc kubenswrapper[4766]: I1002 11:06:33.739454 4766 generic.go:334] "Generic (PLEG): container finished" podID="635fd08a-5ab5-4684-adba-65ea28f8c2bc" containerID="ea67d89af3e01bba6323b0a08d7fbe59066024d8f321512bef1579a26868b7d7" exitCode=0
Oct 02 11:06:33 crc kubenswrapper[4766]: I1002 11:06:33.739999 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" event={"ID":"635fd08a-5ab5-4684-adba-65ea28f8c2bc","Type":"ContainerDied","Data":"ea67d89af3e01bba6323b0a08d7fbe59066024d8f321512bef1579a26868b7d7"}
Oct 02 11:06:33 crc kubenswrapper[4766]: I1002 11:06:33.742791 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" event={"ID":"635fd08a-5ab5-4684-adba-65ea28f8c2bc","Type":"ContainerStarted","Data":"5cee989d160a04fcb469e658e58ee9060f55108b1c5fbfc2bb940b5f252f8969"}
Oct 02 11:06:33 crc kubenswrapper[4766]: I1002 11:06:33.746880 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-2jxdg_a6aa81c2-8c87-43df-badb-7b9dbef84ccf/kube-multus/2.log"
Oct 02 11:06:33 crc kubenswrapper[4766]: I1002 11:06:33.747018 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-2jxdg" event={"ID":"a6aa81c2-8c87-43df-badb-7b9dbef84ccf","Type":"ContainerStarted","Data":"076c29311053258744848a0dcad300e6487e310d55d5fc3fe99a6ce845d2d80c"}
Oct 02 11:06:33 crc kubenswrapper[4766]: I1002 11:06:33.890574 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11cc785e-5bdc-4827-913a-4d899eb5a83c" path="/var/lib/kubelet/pods/11cc785e-5bdc-4827-913a-4d899eb5a83c/volumes"
Oct 02 11:06:34 crc kubenswrapper[4766]: I1002 11:06:34.757789 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" event={"ID":"635fd08a-5ab5-4684-adba-65ea28f8c2bc","Type":"ContainerStarted","Data":"cc9fcfec48ce2090947abdc61f6c0e94b40e7536b73b6de384fc9e450589e16a"}
Oct 02 11:06:34 crc kubenswrapper[4766]: I1002 11:06:34.758097 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" event={"ID":"635fd08a-5ab5-4684-adba-65ea28f8c2bc","Type":"ContainerStarted","Data":"2d8798718deebdd7b5b79cd3a3b1e80576bfab6ded31048684640ea9d60d45a3"}
Oct 02 11:06:34 crc kubenswrapper[4766]: I1002 11:06:34.758109 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" event={"ID":"635fd08a-5ab5-4684-adba-65ea28f8c2bc","Type":"ContainerStarted","Data":"18c85f95e94a5d3ac1240ba49701e0402bf67501770f0ab3642f047c418ada11"}
Oct 02 11:06:34 crc kubenswrapper[4766]: I1002 11:06:34.758118 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" event={"ID":"635fd08a-5ab5-4684-adba-65ea28f8c2bc","Type":"ContainerStarted","Data":"3764fa4f0930274f53fc82c303ea097bf2c1e4c3664b3ad80e1ce043fdc9721b"}
Oct 02 11:06:34 crc kubenswrapper[4766]: I1002 11:06:34.758127 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" event={"ID":"635fd08a-5ab5-4684-adba-65ea28f8c2bc","Type":"ContainerStarted","Data":"d5b98d3e5aae4aa8a0200e7e2e9b5f9ab286b577249c8cdc8a7f2a9326f348e3"}
Oct 02 11:06:35 crc kubenswrapper[4766]: I1002 11:06:35.766893 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" event={"ID":"635fd08a-5ab5-4684-adba-65ea28f8c2bc","Type":"ContainerStarted","Data":"75086c31e62971e5cd32e6207ebab0cd141cbe3cefadfe300580fb9f392e283a"}
Oct 02 11:06:37 crc kubenswrapper[4766]: I1002 11:06:37.783539 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" event={"ID":"635fd08a-5ab5-4684-adba-65ea28f8c2bc","Type":"ContainerStarted","Data":"a35a5843c6df3e22d0b92dd75a5e6814e9317ef323bf0ad813ed3fdd097ad7f9"}
Oct 02 11:06:38 crc kubenswrapper[4766]: I1002 11:06:38.253227 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-srn9g"]
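The "SyncLoop (PLEG)" entries above are the kubelet's pod lifecycle event generator relaying container state changes: the pod UID, an event type (ContainerStarted/ContainerDied), and the container or sandbox ID in Data. A rough sketch for pulling those fields out of a journal capture like this one; the regular expression is an assumption tailored to the klog formatting seen here, not an official parser.

```go
package main

import (
	"fmt"
	"regexp"
)

// plegRe matches the "SyncLoop (PLEG): event for pod" lines above and
// captures pod name, pod UID, event type, and container ID.
var plegRe = regexp.MustCompile(`"SyncLoop \(PLEG\): event for pod" pod="([^"]+)" event=\{"ID":"([^"]+)","Type":"([^"]+)","Data":"([^"]+)"\}`)

func main() {
	line := `Oct 02 11:06:33 crc kubenswrapper[4766]: I1002 11:06:33.739999 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" event={"ID":"635fd08a-5ab5-4684-adba-65ea28f8c2bc","Type":"ContainerDied","Data":"ea67d89af3e01bba6323b0a08d7fbe59066024d8f321512bef1579a26868b7d7"}`
	if m := plegRe.FindStringSubmatch(line); m != nil {
		fmt.Printf("pod=%s uid=%s type=%s container=%s\n", m[1], m[2], m[3], m[4])
	}
}
```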
Oct 02 11:06:38 crc kubenswrapper[4766]: I1002 11:06:38.253858 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:38 crc kubenswrapper[4766]: I1002 11:06:38.256646 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt"
Oct 02 11:06:38 crc kubenswrapper[4766]: I1002 11:06:38.257201 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage"
Oct 02 11:06:38 crc kubenswrapper[4766]: I1002 11:06:38.257543 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt"
Oct 02 11:06:38 crc kubenswrapper[4766]: I1002 11:06:38.257808 4766 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-rlbp5"
Oct 02 11:06:38 crc kubenswrapper[4766]: I1002 11:06:38.340237 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd47b711-0420-4131-b61b-28b954f4fc9d-crc-storage\") pod \"crc-storage-crc-srn9g\" (UID: \"fd47b711-0420-4131-b61b-28b954f4fc9d\") " pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:38 crc kubenswrapper[4766]: I1002 11:06:38.340465 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkxgf\" (UniqueName: \"kubernetes.io/projected/fd47b711-0420-4131-b61b-28b954f4fc9d-kube-api-access-tkxgf\") pod \"crc-storage-crc-srn9g\" (UID: \"fd47b711-0420-4131-b61b-28b954f4fc9d\") " pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:38 crc kubenswrapper[4766]: I1002 11:06:38.340541 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd47b711-0420-4131-b61b-28b954f4fc9d-node-mnt\") pod \"crc-storage-crc-srn9g\" (UID: \"fd47b711-0420-4131-b61b-28b954f4fc9d\") " pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:38 crc kubenswrapper[4766]: I1002 11:06:38.442070 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd47b711-0420-4131-b61b-28b954f4fc9d-crc-storage\") pod \"crc-storage-crc-srn9g\" (UID: \"fd47b711-0420-4131-b61b-28b954f4fc9d\") " pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:38 crc kubenswrapper[4766]: I1002 11:06:38.442217 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkxgf\" (UniqueName: \"kubernetes.io/projected/fd47b711-0420-4131-b61b-28b954f4fc9d-kube-api-access-tkxgf\") pod \"crc-storage-crc-srn9g\" (UID: \"fd47b711-0420-4131-b61b-28b954f4fc9d\") " pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:38 crc kubenswrapper[4766]: I1002 11:06:38.442249 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd47b711-0420-4131-b61b-28b954f4fc9d-node-mnt\") pod \"crc-storage-crc-srn9g\" (UID: \"fd47b711-0420-4131-b61b-28b954f4fc9d\") " pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:38 crc kubenswrapper[4766]: I1002 11:06:38.442702 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd47b711-0420-4131-b61b-28b954f4fc9d-node-mnt\") pod \"crc-storage-crc-srn9g\" (UID: \"fd47b711-0420-4131-b61b-28b954f4fc9d\") " pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:38 crc kubenswrapper[4766]: I1002 11:06:38.443157 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd47b711-0420-4131-b61b-28b954f4fc9d-crc-storage\") pod \"crc-storage-crc-srn9g\" (UID: \"fd47b711-0420-4131-b61b-28b954f4fc9d\") " pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:38 crc kubenswrapper[4766]: I1002 11:06:38.463249 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkxgf\" (UniqueName: \"kubernetes.io/projected/fd47b711-0420-4131-b61b-28b954f4fc9d-kube-api-access-tkxgf\") pod \"crc-storage-crc-srn9g\" (UID: \"fd47b711-0420-4131-b61b-28b954f4fc9d\") " pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:38 crc kubenswrapper[4766]: I1002 11:06:38.571214 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:38 crc kubenswrapper[4766]: E1002 11:06:38.603692 4766 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-srn9g_crc-storage_fd47b711-0420-4131-b61b-28b954f4fc9d_0(65f8e92f1690ad4bbb9998ecd1266590e70e924c13ec8352395710311845251d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 02 11:06:38 crc kubenswrapper[4766]: E1002 11:06:38.603807 4766 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-srn9g_crc-storage_fd47b711-0420-4131-b61b-28b954f4fc9d_0(65f8e92f1690ad4bbb9998ecd1266590e70e924c13ec8352395710311845251d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:38 crc kubenswrapper[4766]: E1002 11:06:38.603837 4766 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-srn9g_crc-storage_fd47b711-0420-4131-b61b-28b954f4fc9d_0(65f8e92f1690ad4bbb9998ecd1266590e70e924c13ec8352395710311845251d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:38 crc kubenswrapper[4766]: E1002 11:06:38.603906 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-srn9g_crc-storage(fd47b711-0420-4131-b61b-28b954f4fc9d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-srn9g_crc-storage(fd47b711-0420-4131-b61b-28b954f4fc9d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-srn9g_crc-storage_fd47b711-0420-4131-b61b-28b954f4fc9d_0(65f8e92f1690ad4bbb9998ecd1266590e70e924c13ec8352395710311845251d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-srn9g" podUID="fd47b711-0420-4131-b61b-28b954f4fc9d"
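The sandbox creation above fails because no CNI network configuration exists yet in /etc/kubernetes/cni/net.d/; the network provider (here OVN-Kubernetes, whose ovnkube-node pod is still coming up in the surrounding entries) presumably writes that file once it is ready, after which the pod retries successfully. A minimal sketch of the same directory check, assuming the conventional CNI config extensions; the real lookup is done by the runtime's CNI plumbing (libcni), so this is illustrative only.

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether any CNI network config file is present in
// dir. Until one appears, RunPodSandbox fails exactly as in the log above.
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // conventional CNI config extensions
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	if err != nil || !ok {
		fmt.Println("no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?")
	}
}
```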
Oct 02 11:06:40 crc kubenswrapper[4766]: I1002 11:06:40.800604 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" event={"ID":"635fd08a-5ab5-4684-adba-65ea28f8c2bc","Type":"ContainerStarted","Data":"ae14b658fabbd11ceefd8c4ba544ed1997ef6b288ecac70920b76a5cc57c4256"}
Oct 02 11:06:40 crc kubenswrapper[4766]: I1002 11:06:40.801172 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z"
Oct 02 11:06:40 crc kubenswrapper[4766]: I1002 11:06:40.801185 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z"
Oct 02 11:06:40 crc kubenswrapper[4766]: I1002 11:06:40.801192 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z"
Oct 02 11:06:40 crc kubenswrapper[4766]: I1002 11:06:40.826619 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z"
Oct 02 11:06:40 crc kubenswrapper[4766]: I1002 11:06:40.829405 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z" podStartSLOduration=8.829389604 podStartE2EDuration="8.829389604s" podCreationTimestamp="2025-10-02 11:06:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:06:40.826928215 +0000 UTC m=+915.769799179" watchObservedRunningTime="2025-10-02 11:06:40.829389604 +0000 UTC m=+915.772260548"
Oct 02 11:06:40 crc kubenswrapper[4766]: I1002 11:06:40.833273 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z"
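The startup-latency entry above is internally consistent: the E2E duration is the watch-observed running time minus the pod creation timestamp, and the SLO duration additionally subtracts the image-pull window, which is zero here (both pull timestamps are the epoch placeholder), so the two values match. A quick check of the arithmetic with the values from the log:

```go
package main

import (
	"fmt"
	"time"
)

// Reproduces podStartE2EDuration from the entry above:
// 11:06:40.829389604 - 11:06:32 = 8.829389604s, matching the log.
func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2025-10-02 11:06:32 +0000 UTC")
	observed, _ := time.Parse(layout, "2025-10-02 11:06:40.829389604 +0000 UTC")
	fmt.Println(observed.Sub(created)) // 8.829389604s
}
```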
Oct 02 11:06:41 crc kubenswrapper[4766]: I1002 11:06:41.082436 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-srn9g"]
Oct 02 11:06:41 crc kubenswrapper[4766]: I1002 11:06:41.082642 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:41 crc kubenswrapper[4766]: I1002 11:06:41.083226 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:41 crc kubenswrapper[4766]: E1002 11:06:41.112414 4766 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-srn9g_crc-storage_fd47b711-0420-4131-b61b-28b954f4fc9d_0(84897ec30719bd35735e446d832c182a7c2cc914909a5c15d0663cab77300d2a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Oct 02 11:06:41 crc kubenswrapper[4766]: E1002 11:06:41.112518 4766 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-srn9g_crc-storage_fd47b711-0420-4131-b61b-28b954f4fc9d_0(84897ec30719bd35735e446d832c182a7c2cc914909a5c15d0663cab77300d2a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:41 crc kubenswrapper[4766]: E1002 11:06:41.112541 4766 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-srn9g_crc-storage_fd47b711-0420-4131-b61b-28b954f4fc9d_0(84897ec30719bd35735e446d832c182a7c2cc914909a5c15d0663cab77300d2a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:41 crc kubenswrapper[4766]: E1002 11:06:41.112588 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-srn9g_crc-storage(fd47b711-0420-4131-b61b-28b954f4fc9d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-srn9g_crc-storage(fd47b711-0420-4131-b61b-28b954f4fc9d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-srn9g_crc-storage_fd47b711-0420-4131-b61b-28b954f4fc9d_0(84897ec30719bd35735e446d832c182a7c2cc914909a5c15d0663cab77300d2a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-srn9g" podUID="fd47b711-0420-4131-b61b-28b954f4fc9d"
Oct 02 11:06:54 crc kubenswrapper[4766]: I1002 11:06:54.432924 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:06:54 crc kubenswrapper[4766]: I1002 11:06:54.433639 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
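The two entries just above record a single liveness-probe failure against the machine-config-daemon: the kubelet's prober issued an HTTP GET to 127.0.0.1:8798/health and the connection was refused, which is reported as a probe failure rather than a pod error. A minimal sketch of such a check, assuming the kubelet's usual convention that HTTP statuses 200-399 count as success:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeOnce performs one HTTP liveness check, in the spirit of the
// prober entries above. A refused connection surfaces as an error from
// the client, e.g. "connect: connection refused".
func probeOnce(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return fmt.Errorf("probe failed: %w", err)
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("probe failed: status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	fmt.Println(probeOnce("http://127.0.0.1:8798/health"))
}
```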
Oct 02 11:06:54 crc kubenswrapper[4766]: I1002 11:06:54.880985 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:54 crc kubenswrapper[4766]: I1002 11:06:54.881462 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:55 crc kubenswrapper[4766]: I1002 11:06:55.083250 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-srn9g"]
Oct 02 11:06:55 crc kubenswrapper[4766]: I1002 11:06:55.890106 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-srn9g" event={"ID":"fd47b711-0420-4131-b61b-28b954f4fc9d","Type":"ContainerStarted","Data":"cb12331b40e6069479057f72385ac35be5fbc8b4d6844315972136b1a3aaa510"}
Oct 02 11:06:56 crc kubenswrapper[4766]: I1002 11:06:56.893903 4766 generic.go:334] "Generic (PLEG): container finished" podID="fd47b711-0420-4131-b61b-28b954f4fc9d" containerID="edeabddb523219c5c33771eb075692170d60afac7520813de77b0e4eb035b79d" exitCode=0
Oct 02 11:06:56 crc kubenswrapper[4766]: I1002 11:06:56.893967 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-srn9g" event={"ID":"fd47b711-0420-4131-b61b-28b954f4fc9d","Type":"ContainerDied","Data":"edeabddb523219c5c33771eb075692170d60afac7520813de77b0e4eb035b79d"}
Oct 02 11:06:58 crc kubenswrapper[4766]: I1002 11:06:58.100789 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:06:58 crc kubenswrapper[4766]: I1002 11:06:58.200374 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd47b711-0420-4131-b61b-28b954f4fc9d-crc-storage\") pod \"fd47b711-0420-4131-b61b-28b954f4fc9d\" (UID: \"fd47b711-0420-4131-b61b-28b954f4fc9d\") "
Oct 02 11:06:58 crc kubenswrapper[4766]: I1002 11:06:58.200441 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkxgf\" (UniqueName: \"kubernetes.io/projected/fd47b711-0420-4131-b61b-28b954f4fc9d-kube-api-access-tkxgf\") pod \"fd47b711-0420-4131-b61b-28b954f4fc9d\" (UID: \"fd47b711-0420-4131-b61b-28b954f4fc9d\") "
Oct 02 11:06:58 crc kubenswrapper[4766]: I1002 11:06:58.200556 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd47b711-0420-4131-b61b-28b954f4fc9d-node-mnt\") pod \"fd47b711-0420-4131-b61b-28b954f4fc9d\" (UID: \"fd47b711-0420-4131-b61b-28b954f4fc9d\") "
Oct 02 11:06:58 crc kubenswrapper[4766]: I1002 11:06:58.200693 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd47b711-0420-4131-b61b-28b954f4fc9d-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "fd47b711-0420-4131-b61b-28b954f4fc9d" (UID: "fd47b711-0420-4131-b61b-28b954f4fc9d"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 11:06:58 crc kubenswrapper[4766]: I1002 11:06:58.201085 4766 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fd47b711-0420-4131-b61b-28b954f4fc9d-node-mnt\") on node \"crc\" DevicePath \"\""
Oct 02 11:06:58 crc kubenswrapper[4766]: I1002 11:06:58.205997 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd47b711-0420-4131-b61b-28b954f4fc9d-kube-api-access-tkxgf" (OuterVolumeSpecName: "kube-api-access-tkxgf") pod "fd47b711-0420-4131-b61b-28b954f4fc9d" (UID: "fd47b711-0420-4131-b61b-28b954f4fc9d"). InnerVolumeSpecName "kube-api-access-tkxgf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:06:58 crc kubenswrapper[4766]: I1002 11:06:58.216109 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd47b711-0420-4131-b61b-28b954f4fc9d-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "fd47b711-0420-4131-b61b-28b954f4fc9d" (UID: "fd47b711-0420-4131-b61b-28b954f4fc9d"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:06:58 crc kubenswrapper[4766]: I1002 11:06:58.302520 4766 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fd47b711-0420-4131-b61b-28b954f4fc9d-crc-storage\") on node \"crc\" DevicePath \"\""
Oct 02 11:06:58 crc kubenswrapper[4766]: I1002 11:06:58.302570 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkxgf\" (UniqueName: \"kubernetes.io/projected/fd47b711-0420-4131-b61b-28b954f4fc9d-kube-api-access-tkxgf\") on node \"crc\" DevicePath \"\""
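The unmount sequence above is the kubelet's volume reconciler at work: once the pod is gone from the desired state, each still-mounted volume gets an UnmountVolume operation, a TearDown, and finally a "Volume detached" record in the actual state. A much-simplified sketch of that pattern, under the assumption of a single-pass loop (the real reconciler runs continuously and retries failures):

```go
package main

import "fmt"

// reconcile unmounts anything present in the actual state but absent
// from the desired state, mirroring the UnmountVolume/TearDown/
// "Volume detached" progression in the entries above.
func reconcile(desired, actual map[string]bool, tearDown func(string) error) {
	for vol := range actual {
		if desired[vol] {
			continue // still wanted; leave it mounted
		}
		if err := tearDown(vol); err != nil {
			fmt.Printf("UnmountVolume failed for %q: %v\n", vol, err)
			continue // would be retried on the next pass
		}
		delete(actual, vol)
		fmt.Printf("Volume detached for volume %q\n", vol)
	}
}

func main() {
	actual := map[string]bool{"crc-storage": true, "kube-api-access-tkxgf": true, "node-mnt": true}
	reconcile(map[string]bool{}, actual, func(string) error { return nil })
}
```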
Oct 02 11:06:58 crc kubenswrapper[4766]: I1002 11:06:58.906148 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-srn9g" event={"ID":"fd47b711-0420-4131-b61b-28b954f4fc9d","Type":"ContainerDied","Data":"cb12331b40e6069479057f72385ac35be5fbc8b4d6844315972136b1a3aaa510"}
Oct 02 11:06:58 crc kubenswrapper[4766]: I1002 11:06:58.906189 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb12331b40e6069479057f72385ac35be5fbc8b4d6844315972136b1a3aaa510"
Oct 02 11:06:58 crc kubenswrapper[4766]: I1002 11:06:58.906220 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-srn9g"
Oct 02 11:07:02 crc kubenswrapper[4766]: I1002 11:07:02.814977 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bfb4z"
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.084195 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch"]
Oct 02 11:07:05 crc kubenswrapper[4766]: E1002 11:07:05.084740 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd47b711-0420-4131-b61b-28b954f4fc9d" containerName="storage"
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.084756 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd47b711-0420-4131-b61b-28b954f4fc9d" containerName="storage"
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.084887 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd47b711-0420-4131-b61b-28b954f4fc9d" containerName="storage"
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.085550 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch"
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.087803 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.099726 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch"]
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.185906 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch\" (UID: \"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch"
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.185956 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5jw7\" (UniqueName: \"kubernetes.io/projected/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-kube-api-access-m5jw7\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch\" (UID: \"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch"
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.186193 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch\" (UID: \"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch"
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.287677 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch\" (UID: \"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch"
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.287736 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5jw7\" (UniqueName: \"kubernetes.io/projected/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-kube-api-access-m5jw7\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch\" (UID: \"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch"
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.287777 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch\" (UID: \"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch"
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.288177 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch\" (UID: \"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch"
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.288189 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch\" (UID: \"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch"
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.309651 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5jw7\" (UniqueName: \"kubernetes.io/projected/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-kube-api-access-m5jw7\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch\" (UID: \"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch"
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.399977 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch"
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.783021 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch"]
Oct 02 11:07:05 crc kubenswrapper[4766]: I1002 11:07:05.940811 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch" event={"ID":"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da","Type":"ContainerStarted","Data":"c25a74157ad6ea7e8fdf3a4aab404df69cea701b8085ace8010cf35f203a4203"}
Oct 02 11:07:06 crc kubenswrapper[4766]: I1002 11:07:06.949432 4766 generic.go:334] "Generic (PLEG): container finished" podID="7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da" containerID="d74fec5e346554a22244bd5ed69cfdc7a7e930c1b6981c48ea2e7baa796c3b31" exitCode=0
Oct 02 11:07:06 crc kubenswrapper[4766]: I1002 11:07:06.949551 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch" event={"ID":"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da","Type":"ContainerDied","Data":"d74fec5e346554a22244bd5ed69cfdc7a7e930c1b6981c48ea2e7baa796c3b31"}
Oct 02 11:07:08 crc kubenswrapper[4766]: I1002 11:07:08.966387 4766 generic.go:334] "Generic (PLEG): container finished" podID="7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da" containerID="26e36a2a073a90a2ab02c03784fb623fef37d105ff11ebeee1164b4dd79507bb" exitCode=0
Oct 02 11:07:08 crc kubenswrapper[4766]: I1002 11:07:08.966442 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch" event={"ID":"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da","Type":"ContainerDied","Data":"26e36a2a073a90a2ab02c03784fb623fef37d105ff11ebeee1164b4dd79507bb"}
Oct 02 11:07:09 crc kubenswrapper[4766]: I1002 11:07:09.973262 4766 generic.go:334] "Generic (PLEG): container finished" podID="7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da" containerID="0abed237e55a0862c860c8d48b9a24b15a5b28383206c7b59fcdef409e00e785" exitCode=0
Oct 02 11:07:09 crc kubenswrapper[4766]: I1002 11:07:09.973304 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch" event={"ID":"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da","Type":"ContainerDied","Data":"0abed237e55a0862c860c8d48b9a24b15a5b28383206c7b59fcdef409e00e785"}
Oct 02 11:07:11 crc kubenswrapper[4766]: I1002 11:07:11.200236 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch"
Oct 02 11:07:11 crc kubenswrapper[4766]: I1002 11:07:11.366854 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-bundle\") pod \"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da\" (UID: \"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da\") "
Oct 02 11:07:11 crc kubenswrapper[4766]: I1002 11:07:11.366920 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5jw7\" (UniqueName: \"kubernetes.io/projected/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-kube-api-access-m5jw7\") pod \"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da\" (UID: \"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da\") "
Oct 02 11:07:11 crc kubenswrapper[4766]: I1002 11:07:11.366992 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-util\") pod \"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da\" (UID: \"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da\") "
Oct 02 11:07:11 crc kubenswrapper[4766]: I1002 11:07:11.367790 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-bundle" (OuterVolumeSpecName: "bundle") pod "7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da" (UID: "7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:07:11 crc kubenswrapper[4766]: I1002 11:07:11.381547 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-util" (OuterVolumeSpecName: "util") pod "7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da" (UID: "7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:07:11 crc kubenswrapper[4766]: I1002 11:07:11.398737 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-kube-api-access-m5jw7" (OuterVolumeSpecName: "kube-api-access-m5jw7") pod "7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da" (UID: "7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da"). InnerVolumeSpecName "kube-api-access-m5jw7". PluginName "kubernetes.io/projected", VolumeGidValue ""
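The fa9831... pod above is an OLM bundle-extract job: its containers run to completion and the ContainerDied events with exitCode=0 are normal termination, not failures (the later RemoveStaleState entries name them "pull", "extract", and "util"). A small sketch for scanning a journal capture for any container that did not exit cleanly; as with the earlier parser, the regular expression is an assumption matched to this log's formatting.

```go
package main

import (
	"fmt"
	"regexp"
	"strconv"
)

// finishedRe captures pod UID, container ID, and exit code from the
// "Generic (PLEG): container finished" lines above.
var finishedRe = regexp.MustCompile(`container finished" podID="([^"]+)" containerID="([^"]+)" exitCode=(-?\d+)`)

func main() {
	line := `Oct 02 11:07:06 crc kubenswrapper[4766]: I1002 11:07:06.949432 4766 generic.go:334] "Generic (PLEG): container finished" podID="7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da" containerID="d74fec5e346554a22244bd5ed69cfdc7a7e930c1b6981c48ea2e7baa796c3b31" exitCode=0`
	if m := finishedRe.FindStringSubmatch(line); m != nil {
		code, _ := strconv.Atoi(m[3])
		if code != 0 {
			fmt.Printf("pod %s: container %s exited %d\n", m[1], m[2], code)
		} else {
			fmt.Printf("pod %s: container %s completed cleanly\n", m[1], m[2])
		}
	}
}
```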
Oct 02 11:07:11 crc kubenswrapper[4766]: I1002 11:07:11.468446 4766 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-util\") on node \"crc\" DevicePath \"\""
Oct 02 11:07:11 crc kubenswrapper[4766]: I1002 11:07:11.468493 4766 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 11:07:11 crc kubenswrapper[4766]: I1002 11:07:11.468532 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5jw7\" (UniqueName: \"kubernetes.io/projected/7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da-kube-api-access-m5jw7\") on node \"crc\" DevicePath \"\""
Oct 02 11:07:11 crc kubenswrapper[4766]: I1002 11:07:11.986072 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch" event={"ID":"7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da","Type":"ContainerDied","Data":"c25a74157ad6ea7e8fdf3a4aab404df69cea701b8085ace8010cf35f203a4203"}
Oct 02 11:07:11 crc kubenswrapper[4766]: I1002 11:07:11.986325 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c25a74157ad6ea7e8fdf3a4aab404df69cea701b8085ace8010cf35f203a4203"
Oct 02 11:07:11 crc kubenswrapper[4766]: I1002 11:07:11.986196 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch"
Oct 02 11:07:13 crc kubenswrapper[4766]: I1002 11:07:13.029942 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-5hsll"]
Oct 02 11:07:13 crc kubenswrapper[4766]: E1002 11:07:13.030182 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da" containerName="util"
Oct 02 11:07:13 crc kubenswrapper[4766]: I1002 11:07:13.030197 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da" containerName="util"
Oct 02 11:07:13 crc kubenswrapper[4766]: E1002 11:07:13.030217 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da" containerName="extract"
Oct 02 11:07:13 crc kubenswrapper[4766]: I1002 11:07:13.030225 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da" containerName="extract"
Oct 02 11:07:13 crc kubenswrapper[4766]: E1002 11:07:13.030241 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da" containerName="pull"
Oct 02 11:07:13 crc kubenswrapper[4766]: I1002 11:07:13.030250 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da" containerName="pull"
Oct 02 11:07:13 crc kubenswrapper[4766]: I1002 11:07:13.030366 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da" containerName="extract"
Oct 02 11:07:13 crc kubenswrapper[4766]: I1002 11:07:13.030857 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5hsll"
Oct 02 11:07:13 crc kubenswrapper[4766]: I1002 11:07:13.032158 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Oct 02 11:07:13 crc kubenswrapper[4766]: I1002 11:07:13.032910 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-45zx8"
Oct 02 11:07:13 crc kubenswrapper[4766]: I1002 11:07:13.035283 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Oct 02 11:07:13 crc kubenswrapper[4766]: I1002 11:07:13.071211 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-5hsll"]
Oct 02 11:07:13 crc kubenswrapper[4766]: I1002 11:07:13.188965 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whffr\" (UniqueName: \"kubernetes.io/projected/dd8848bc-a5ea-40ae-9e27-eadbeef93edb-kube-api-access-whffr\") pod \"nmstate-operator-858ddd8f98-5hsll\" (UID: \"dd8848bc-a5ea-40ae-9e27-eadbeef93edb\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-5hsll"
Oct 02 11:07:13 crc kubenswrapper[4766]: I1002 11:07:13.290295 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whffr\" (UniqueName: \"kubernetes.io/projected/dd8848bc-a5ea-40ae-9e27-eadbeef93edb-kube-api-access-whffr\") pod \"nmstate-operator-858ddd8f98-5hsll\" (UID: \"dd8848bc-a5ea-40ae-9e27-eadbeef93edb\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-5hsll"
Oct 02 11:07:13 crc kubenswrapper[4766]: I1002 11:07:13.310899 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whffr\" (UniqueName: \"kubernetes.io/projected/dd8848bc-a5ea-40ae-9e27-eadbeef93edb-kube-api-access-whffr\") pod \"nmstate-operator-858ddd8f98-5hsll\" (UID: \"dd8848bc-a5ea-40ae-9e27-eadbeef93edb\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-5hsll"
Oct 02 11:07:13 crc kubenswrapper[4766]: I1002 11:07:13.344295 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5hsll"
Oct 02 11:07:13 crc kubenswrapper[4766]: I1002 11:07:13.516880 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-5hsll"]
Oct 02 11:07:14 crc kubenswrapper[4766]: I1002 11:07:14.006440 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5hsll" event={"ID":"dd8848bc-a5ea-40ae-9e27-eadbeef93edb","Type":"ContainerStarted","Data":"d8b8e03212376932133a9556805bb8137187e3c1a3d69aef0f2dc664593d3a81"}
Oct 02 11:07:16 crc kubenswrapper[4766]: I1002 11:07:16.017650 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5hsll" event={"ID":"dd8848bc-a5ea-40ae-9e27-eadbeef93edb","Type":"ContainerStarted","Data":"0268b8bd22904d775833bf2d040264794a43d98e25fceb9f44af68220221c95a"}
Oct 02 11:07:16 crc kubenswrapper[4766]: I1002 11:07:16.899962 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5hsll" podStartSLOduration=1.663399823 podStartE2EDuration="3.899946939s" podCreationTimestamp="2025-10-02 11:07:13 +0000 UTC" firstStartedPulling="2025-10-02 11:07:13.527214887 +0000 UTC m=+948.470085831" lastFinishedPulling="2025-10-02 11:07:15.763762003 +0000 UTC m=+950.706632947" observedRunningTime="2025-10-02 11:07:16.031034367 +0000 UTC m=+950.973905321" watchObservedRunningTime="2025-10-02 11:07:16.899946939 +0000 UTC m=+951.842817883"
Oct 02 11:07:16 crc kubenswrapper[4766]: I1002 11:07:16.900512 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-dbwjf"]
Oct 02 11:07:16 crc kubenswrapper[4766]: I1002 11:07:16.901283 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dbwjf"
Oct 02 11:07:16 crc kubenswrapper[4766]: W1002 11:07:16.904151 4766 reflector.go:561] object-"openshift-nmstate"/"nmstate-handler-dockercfg-hwlwd": failed to list *v1.Secret: secrets "nmstate-handler-dockercfg-hwlwd" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object
Oct 02 11:07:16 crc kubenswrapper[4766]: E1002 11:07:16.904194 4766 reflector.go:158] "Unhandled Error" err="object-\"openshift-nmstate\"/\"nmstate-handler-dockercfg-hwlwd\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"nmstate-handler-dockercfg-hwlwd\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Oct 02 11:07:16 crc kubenswrapper[4766]: I1002 11:07:16.914290 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7"]
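In the entries that follow, the tls-key-pair and plugin-serving-cert mounts fail because their secrets have not been created yet; each failure parks the operation with a durationBeforeRetry before the kubelet may try again. A sketch of that pacing: only the 500ms seed value appears in the log, while the doubling factor and the cap below are assumptions for illustration.

```go
package main

import (
	"fmt"
	"time"
)

// Models the "No retries permitted until ... (durationBeforeRetry 500ms)"
// behaviour seen below: after each failed MountVolume.SetUp the wait
// grows, so a missing secret is polled progressively less aggressively.
func main() {
	delay := 500 * time.Millisecond // seed value from the log
	const maxDelay = 2 * time.Minute // assumed cap, for illustration
	for attempt := 1; attempt <= 6; attempt++ {
		fmt.Printf("attempt %d failed: no retries permitted for %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```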
Oct 02 11:07:16 crc kubenswrapper[4766]: I1002 11:07:16.915026 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7"
Oct 02 11:07:16 crc kubenswrapper[4766]: I1002 11:07:16.916905 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Oct 02 11:07:16 crc kubenswrapper[4766]: I1002 11:07:16.929483 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-dbwjf"]
Oct 02 11:07:16 crc kubenswrapper[4766]: I1002 11:07:16.940950 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7"]
Oct 02 11:07:16 crc kubenswrapper[4766]: I1002 11:07:16.943559 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-mg42f"]
Oct 02 11:07:16 crc kubenswrapper[4766]: I1002 11:07:16.944196 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-mg42f"
Oct 02 11:07:16 crc kubenswrapper[4766]: I1002 11:07:16.945245 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg98s\" (UniqueName: \"kubernetes.io/projected/5504968e-95f4-4664-bbd0-958eb8efb21e-kube-api-access-jg98s\") pod \"nmstate-metrics-fdff9cb8d-dbwjf\" (UID: \"5504968e-95f4-4664-bbd0-958eb8efb21e\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dbwjf"
Oct 02 11:07:16 crc kubenswrapper[4766]: I1002 11:07:16.945273 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc8bq\" (UniqueName: \"kubernetes.io/projected/841c7884-b0c7-45fc-9032-9d5c27cd862a-kube-api-access-xc8bq\") pod \"nmstate-webhook-6cdbc54649-v9vf7\" (UID: \"841c7884-b0c7-45fc-9032-9d5c27cd862a\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7"
Oct 02 11:07:16 crc kubenswrapper[4766]: I1002 11:07:16.945294 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/841c7884-b0c7-45fc-9032-9d5c27cd862a-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-v9vf7\" (UID: \"841c7884-b0c7-45fc-9032-9d5c27cd862a\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.045700 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5834ff77-38e7-40e3-a6a0-ce908f1343f0-dbus-socket\") pod \"nmstate-handler-mg42f\" (UID: \"5834ff77-38e7-40e3-a6a0-ce908f1343f0\") " pod="openshift-nmstate/nmstate-handler-mg42f"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.045763 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6nwr\" (UniqueName: \"kubernetes.io/projected/5834ff77-38e7-40e3-a6a0-ce908f1343f0-kube-api-access-v6nwr\") pod \"nmstate-handler-mg42f\" (UID: \"5834ff77-38e7-40e3-a6a0-ce908f1343f0\") " pod="openshift-nmstate/nmstate-handler-mg42f"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.046095 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5834ff77-38e7-40e3-a6a0-ce908f1343f0-ovs-socket\") pod \"nmstate-handler-mg42f\" (UID: \"5834ff77-38e7-40e3-a6a0-ce908f1343f0\") " pod="openshift-nmstate/nmstate-handler-mg42f"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.046215 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg98s\" (UniqueName: \"kubernetes.io/projected/5504968e-95f4-4664-bbd0-958eb8efb21e-kube-api-access-jg98s\") pod \"nmstate-metrics-fdff9cb8d-dbwjf\" (UID: \"5504968e-95f4-4664-bbd0-958eb8efb21e\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dbwjf"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.046257 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc8bq\" (UniqueName: \"kubernetes.io/projected/841c7884-b0c7-45fc-9032-9d5c27cd862a-kube-api-access-xc8bq\") pod \"nmstate-webhook-6cdbc54649-v9vf7\" (UID: \"841c7884-b0c7-45fc-9032-9d5c27cd862a\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.046315 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5834ff77-38e7-40e3-a6a0-ce908f1343f0-nmstate-lock\") pod \"nmstate-handler-mg42f\" (UID: \"5834ff77-38e7-40e3-a6a0-ce908f1343f0\") " pod="openshift-nmstate/nmstate-handler-mg42f"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.046352 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/841c7884-b0c7-45fc-9032-9d5c27cd862a-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-v9vf7\" (UID: \"841c7884-b0c7-45fc-9032-9d5c27cd862a\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7"
Oct 02 11:07:17 crc kubenswrapper[4766]: E1002 11:07:17.046724 4766 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Oct 02 11:07:17 crc kubenswrapper[4766]: E1002 11:07:17.046914 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/841c7884-b0c7-45fc-9032-9d5c27cd862a-tls-key-pair podName:841c7884-b0c7-45fc-9032-9d5c27cd862a nodeName:}" failed. No retries permitted until 2025-10-02 11:07:17.546880528 +0000 UTC m=+952.489751472 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/841c7884-b0c7-45fc-9032-9d5c27cd862a-tls-key-pair") pod "nmstate-webhook-6cdbc54649-v9vf7" (UID: "841c7884-b0c7-45fc-9032-9d5c27cd862a") : secret "openshift-nmstate-webhook" not found
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.053966 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2"]
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.056009 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.058971 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.059903 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-5mcrl"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.061242 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.067472 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2"]
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.075177 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg98s\" (UniqueName: \"kubernetes.io/projected/5504968e-95f4-4664-bbd0-958eb8efb21e-kube-api-access-jg98s\") pod \"nmstate-metrics-fdff9cb8d-dbwjf\" (UID: \"5504968e-95f4-4664-bbd0-958eb8efb21e\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dbwjf"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.079704 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc8bq\" (UniqueName: \"kubernetes.io/projected/841c7884-b0c7-45fc-9032-9d5c27cd862a-kube-api-access-xc8bq\") pod \"nmstate-webhook-6cdbc54649-v9vf7\" (UID: \"841c7884-b0c7-45fc-9032-9d5c27cd862a\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.147065 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5834ff77-38e7-40e3-a6a0-ce908f1343f0-ovs-socket\") pod \"nmstate-handler-mg42f\" (UID: \"5834ff77-38e7-40e3-a6a0-ce908f1343f0\") " pod="openshift-nmstate/nmstate-handler-mg42f"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.147142 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5834ff77-38e7-40e3-a6a0-ce908f1343f0-nmstate-lock\") pod \"nmstate-handler-mg42f\" (UID: \"5834ff77-38e7-40e3-a6a0-ce908f1343f0\") " pod="openshift-nmstate/nmstate-handler-mg42f"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.147176 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/87c86148-6c6d-48a2-bd6c-4004f6d782e8-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-q5bx2\" (UID: \"87c86148-6c6d-48a2-bd6c-4004f6d782e8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.147202 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/87c86148-6c6d-48a2-bd6c-4004f6d782e8-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-q5bx2\" (UID: \"87c86148-6c6d-48a2-bd6c-4004f6d782e8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.147230 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5834ff77-38e7-40e3-a6a0-ce908f1343f0-dbus-socket\") pod \"nmstate-handler-mg42f\" (UID: \"5834ff77-38e7-40e3-a6a0-ce908f1343f0\") " pod="openshift-nmstate/nmstate-handler-mg42f"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.147255 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6nwr\" (UniqueName: \"kubernetes.io/projected/5834ff77-38e7-40e3-a6a0-ce908f1343f0-kube-api-access-v6nwr\") pod \"nmstate-handler-mg42f\" (UID: \"5834ff77-38e7-40e3-a6a0-ce908f1343f0\") " pod="openshift-nmstate/nmstate-handler-mg42f"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.147299 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncjpd\" (UniqueName: \"kubernetes.io/projected/87c86148-6c6d-48a2-bd6c-4004f6d782e8-kube-api-access-ncjpd\") pod \"nmstate-console-plugin-6b874cbd85-q5bx2\" (UID: \"87c86148-6c6d-48a2-bd6c-4004f6d782e8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.147363 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5834ff77-38e7-40e3-a6a0-ce908f1343f0-ovs-socket\") pod \"nmstate-handler-mg42f\" (UID: \"5834ff77-38e7-40e3-a6a0-ce908f1343f0\") " pod="openshift-nmstate/nmstate-handler-mg42f"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.147419 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5834ff77-38e7-40e3-a6a0-ce908f1343f0-nmstate-lock\") pod \"nmstate-handler-mg42f\" (UID: \"5834ff77-38e7-40e3-a6a0-ce908f1343f0\") " pod="openshift-nmstate/nmstate-handler-mg42f"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.147665 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5834ff77-38e7-40e3-a6a0-ce908f1343f0-dbus-socket\") pod \"nmstate-handler-mg42f\" (UID: \"5834ff77-38e7-40e3-a6a0-ce908f1343f0\") " pod="openshift-nmstate/nmstate-handler-mg42f"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.177169 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6nwr\" (UniqueName: \"kubernetes.io/projected/5834ff77-38e7-40e3-a6a0-ce908f1343f0-kube-api-access-v6nwr\") pod \"nmstate-handler-mg42f\" (UID: \"5834ff77-38e7-40e3-a6a0-ce908f1343f0\") " pod="openshift-nmstate/nmstate-handler-mg42f"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.248089 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncjpd\" (UniqueName: \"kubernetes.io/projected/87c86148-6c6d-48a2-bd6c-4004f6d782e8-kube-api-access-ncjpd\") pod \"nmstate-console-plugin-6b874cbd85-q5bx2\" (UID: \"87c86148-6c6d-48a2-bd6c-4004f6d782e8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.248479 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/87c86148-6c6d-48a2-bd6c-4004f6d782e8-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-q5bx2\" (UID: \"87c86148-6c6d-48a2-bd6c-4004f6d782e8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.248590 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/87c86148-6c6d-48a2-bd6c-4004f6d782e8-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-q5bx2\" (UID: \"87c86148-6c6d-48a2-bd6c-4004f6d782e8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2"
Oct 02 11:07:17 crc kubenswrapper[4766]: E1002 11:07:17.248760 4766 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Oct 02 11:07:17 crc kubenswrapper[4766]: E1002 11:07:17.248854 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87c86148-6c6d-48a2-bd6c-4004f6d782e8-plugin-serving-cert podName:87c86148-6c6d-48a2-bd6c-4004f6d782e8 nodeName:}" failed. No retries permitted until 2025-10-02 11:07:17.748831559 +0000 UTC m=+952.691702503 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/87c86148-6c6d-48a2-bd6c-4004f6d782e8-plugin-serving-cert") pod "nmstate-console-plugin-6b874cbd85-q5bx2" (UID: "87c86148-6c6d-48a2-bd6c-4004f6d782e8") : secret "plugin-serving-cert" not found
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.249304 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/87c86148-6c6d-48a2-bd6c-4004f6d782e8-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-q5bx2\" (UID: \"87c86148-6c6d-48a2-bd6c-4004f6d782e8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.271627 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-66579686d4-2bxjr"]
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.271683 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncjpd\" (UniqueName: \"kubernetes.io/projected/87c86148-6c6d-48a2-bd6c-4004f6d782e8-kube-api-access-ncjpd\") pod \"nmstate-console-plugin-6b874cbd85-q5bx2\" (UID: \"87c86148-6c6d-48a2-bd6c-4004f6d782e8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2"
Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.272288 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.291898 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66579686d4-2bxjr"] Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.349540 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/26c85d16-5fb3-443c-82cf-00db59fb5b00-oauth-serving-cert\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.349584 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/26c85d16-5fb3-443c-82cf-00db59fb5b00-console-serving-cert\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.349624 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/26c85d16-5fb3-443c-82cf-00db59fb5b00-service-ca\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.349643 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62bnk\" (UniqueName: \"kubernetes.io/projected/26c85d16-5fb3-443c-82cf-00db59fb5b00-kube-api-access-62bnk\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.349659 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/26c85d16-5fb3-443c-82cf-00db59fb5b00-console-oauth-config\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.349699 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26c85d16-5fb3-443c-82cf-00db59fb5b00-trusted-ca-bundle\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.349727 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/26c85d16-5fb3-443c-82cf-00db59fb5b00-console-config\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.450538 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/26c85d16-5fb3-443c-82cf-00db59fb5b00-service-ca\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc 
kubenswrapper[4766]: I1002 11:07:17.450892 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62bnk\" (UniqueName: \"kubernetes.io/projected/26c85d16-5fb3-443c-82cf-00db59fb5b00-kube-api-access-62bnk\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.451006 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/26c85d16-5fb3-443c-82cf-00db59fb5b00-console-oauth-config\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.451163 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26c85d16-5fb3-443c-82cf-00db59fb5b00-trusted-ca-bundle\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.451277 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/26c85d16-5fb3-443c-82cf-00db59fb5b00-console-config\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.451389 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/26c85d16-5fb3-443c-82cf-00db59fb5b00-service-ca\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.451486 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/26c85d16-5fb3-443c-82cf-00db59fb5b00-oauth-serving-cert\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.451635 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/26c85d16-5fb3-443c-82cf-00db59fb5b00-console-serving-cert\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.452035 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26c85d16-5fb3-443c-82cf-00db59fb5b00-trusted-ca-bundle\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.452064 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/26c85d16-5fb3-443c-82cf-00db59fb5b00-oauth-serving-cert\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: 
I1002 11:07:17.452528 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/26c85d16-5fb3-443c-82cf-00db59fb5b00-console-config\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.453935 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/26c85d16-5fb3-443c-82cf-00db59fb5b00-console-oauth-config\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.455132 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/26c85d16-5fb3-443c-82cf-00db59fb5b00-console-serving-cert\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.468834 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62bnk\" (UniqueName: \"kubernetes.io/projected/26c85d16-5fb3-443c-82cf-00db59fb5b00-kube-api-access-62bnk\") pod \"console-66579686d4-2bxjr\" (UID: \"26c85d16-5fb3-443c-82cf-00db59fb5b00\") " pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.552155 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/841c7884-b0c7-45fc-9032-9d5c27cd862a-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-v9vf7\" (UID: \"841c7884-b0c7-45fc-9032-9d5c27cd862a\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.555217 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/841c7884-b0c7-45fc-9032-9d5c27cd862a-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-v9vf7\" (UID: \"841c7884-b0c7-45fc-9032-9d5c27cd862a\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.600276 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.754835 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/87c86148-6c6d-48a2-bd6c-4004f6d782e8-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-q5bx2\" (UID: \"87c86148-6c6d-48a2-bd6c-4004f6d782e8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.758830 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-hwlwd" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.760613 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-mg42f" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.766354 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/87c86148-6c6d-48a2-bd6c-4004f6d782e8-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-q5bx2\" (UID: \"87c86148-6c6d-48a2-bd6c-4004f6d782e8\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.766433 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dbwjf" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.803007 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66579686d4-2bxjr"] Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.828689 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7" Oct 02 11:07:17 crc kubenswrapper[4766]: I1002 11:07:17.956277 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-dbwjf"] Oct 02 11:07:17 crc kubenswrapper[4766]: W1002 11:07:17.963483 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5504968e_95f4_4664_bbd0_958eb8efb21e.slice/crio-818af4c6dc2dd94554393f1acc8a3c64984846d99767f253d2cc329dd4b5705b WatchSource:0}: Error finding container 818af4c6dc2dd94554393f1acc8a3c64984846d99767f253d2cc329dd4b5705b: Status 404 returned error can't find the container with id 818af4c6dc2dd94554393f1acc8a3c64984846d99767f253d2cc329dd4b5705b Oct 02 11:07:18 crc kubenswrapper[4766]: I1002 11:07:18.012000 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2" Oct 02 11:07:18 crc kubenswrapper[4766]: I1002 11:07:18.031672 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66579686d4-2bxjr" event={"ID":"26c85d16-5fb3-443c-82cf-00db59fb5b00","Type":"ContainerStarted","Data":"5607878e1acec5a3526ad828241ec0b948362eb6e5c00bb2529696b9178b0045"} Oct 02 11:07:18 crc kubenswrapper[4766]: I1002 11:07:18.031721 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66579686d4-2bxjr" event={"ID":"26c85d16-5fb3-443c-82cf-00db59fb5b00","Type":"ContainerStarted","Data":"23045bbbcdd20995d1c592b916efc2ff102e0763f203b0963b75a31edd7e6296"} Oct 02 11:07:18 crc kubenswrapper[4766]: I1002 11:07:18.033933 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mg42f" event={"ID":"5834ff77-38e7-40e3-a6a0-ce908f1343f0","Type":"ContainerStarted","Data":"334da28b42c00babbf28360cb9930aaa54b0c2ed64392f046a2c23f2a54e5e6a"} Oct 02 11:07:18 crc kubenswrapper[4766]: I1002 11:07:18.036252 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dbwjf" event={"ID":"5504968e-95f4-4664-bbd0-958eb8efb21e","Type":"ContainerStarted","Data":"818af4c6dc2dd94554393f1acc8a3c64984846d99767f253d2cc329dd4b5705b"} Oct 02 11:07:18 crc kubenswrapper[4766]: W1002 11:07:18.038483 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod841c7884_b0c7_45fc_9032_9d5c27cd862a.slice/crio-e2c2eace3f5940d4ecfa84e5aca155978380c72b987b7a04dc5eb67405f13fd6 WatchSource:0}: Error finding container e2c2eace3f5940d4ecfa84e5aca155978380c72b987b7a04dc5eb67405f13fd6: Status 404 returned error can't find the container with id e2c2eace3f5940d4ecfa84e5aca155978380c72b987b7a04dc5eb67405f13fd6 Oct 02 11:07:18 crc kubenswrapper[4766]: I1002 11:07:18.040749 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7"] Oct 02 11:07:18 crc kubenswrapper[4766]: I1002 11:07:18.052581 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66579686d4-2bxjr" podStartSLOduration=1.052560813 podStartE2EDuration="1.052560813s" podCreationTimestamp="2025-10-02 11:07:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:07:18.051768258 +0000 UTC m=+952.994639202" watchObservedRunningTime="2025-10-02 11:07:18.052560813 +0000 UTC m=+952.995431757" Oct 02 11:07:18 crc kubenswrapper[4766]: I1002 11:07:18.216248 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2"] Oct 02 11:07:18 crc kubenswrapper[4766]: W1002 11:07:18.224698 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87c86148_6c6d_48a2_bd6c_4004f6d782e8.slice/crio-01188b2474dacf88e6b01b0b85ae380198eca75a5e24827fc60636365893db8d WatchSource:0}: Error finding container 01188b2474dacf88e6b01b0b85ae380198eca75a5e24827fc60636365893db8d: Status 404 returned error can't find the container with id 01188b2474dacf88e6b01b0b85ae380198eca75a5e24827fc60636365893db8d Oct 02 11:07:19 crc kubenswrapper[4766]: I1002 11:07:19.043730 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2" event={"ID":"87c86148-6c6d-48a2-bd6c-4004f6d782e8","Type":"ContainerStarted","Data":"01188b2474dacf88e6b01b0b85ae380198eca75a5e24827fc60636365893db8d"} Oct 02 11:07:19 crc kubenswrapper[4766]: I1002 11:07:19.045480 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7" event={"ID":"841c7884-b0c7-45fc-9032-9d5c27cd862a","Type":"ContainerStarted","Data":"e2c2eace3f5940d4ecfa84e5aca155978380c72b987b7a04dc5eb67405f13fd6"} Oct 02 11:07:21 crc kubenswrapper[4766]: I1002 11:07:21.056369 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2" event={"ID":"87c86148-6c6d-48a2-bd6c-4004f6d782e8","Type":"ContainerStarted","Data":"fdad52af9fdbd283c8b1fa9334b4997f1f69cb6fb94731838bf24076295f9ed6"} Oct 02 11:07:21 crc kubenswrapper[4766]: I1002 11:07:21.059399 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7" event={"ID":"841c7884-b0c7-45fc-9032-9d5c27cd862a","Type":"ContainerStarted","Data":"b6d2df5700a0dfc734d7e62ef4ddebb4c141a0620fb83487961e0e66e46b230a"} Oct 02 11:07:21 crc kubenswrapper[4766]: I1002 11:07:21.060015 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7" Oct 02 11:07:21 crc kubenswrapper[4766]: I1002 11:07:21.061451 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dbwjf" event={"ID":"5504968e-95f4-4664-bbd0-958eb8efb21e","Type":"ContainerStarted","Data":"3555c80b254b84a151f4b35864d4d0eab6e1f35c082b785e0aece3fe1464b09b"} Oct 02 11:07:21 crc kubenswrapper[4766]: I1002 11:07:21.073031 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-q5bx2" podStartSLOduration=1.483202903 podStartE2EDuration="4.073010248s" podCreationTimestamp="2025-10-02 11:07:17 +0000 UTC" firstStartedPulling="2025-10-02 11:07:18.226392734 +0000 UTC m=+953.169263688" lastFinishedPulling="2025-10-02 11:07:20.816200089 +0000 UTC m=+955.759071033" observedRunningTime="2025-10-02 11:07:21.071886841 +0000 UTC m=+956.014757795" watchObservedRunningTime="2025-10-02 11:07:21.073010248 +0000 UTC m=+956.015881192" Oct 02 11:07:21 crc kubenswrapper[4766]: I1002 11:07:21.091765 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7" podStartSLOduration=2.315713026 podStartE2EDuration="5.091747338s" podCreationTimestamp="2025-10-02 11:07:16 +0000 UTC" firstStartedPulling="2025-10-02 11:07:18.041054665 +0000 UTC m=+952.983925609" lastFinishedPulling="2025-10-02 11:07:20.817088977 +0000 UTC m=+955.759959921" observedRunningTime="2025-10-02 11:07:21.090327343 +0000 UTC m=+956.033198297" watchObservedRunningTime="2025-10-02 11:07:21.091747338 +0000 UTC m=+956.034618282" Oct 02 11:07:22 crc kubenswrapper[4766]: I1002 11:07:22.068292 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mg42f" event={"ID":"5834ff77-38e7-40e3-a6a0-ce908f1343f0","Type":"ContainerStarted","Data":"392968409efd7bfb56e95de64c88dd566a2bd6a077b672c972c761f00044263c"} Oct 02 11:07:22 crc kubenswrapper[4766]: I1002 11:07:22.087178 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-mg42f" podStartSLOduration=3.073147927 
podStartE2EDuration="6.087147154s" podCreationTimestamp="2025-10-02 11:07:16 +0000 UTC" firstStartedPulling="2025-10-02 11:07:17.801154758 +0000 UTC m=+952.744025702" lastFinishedPulling="2025-10-02 11:07:20.815153985 +0000 UTC m=+955.758024929" observedRunningTime="2025-10-02 11:07:22.083412124 +0000 UTC m=+957.026283078" watchObservedRunningTime="2025-10-02 11:07:22.087147154 +0000 UTC m=+957.030018138" Oct 02 11:07:22 crc kubenswrapper[4766]: I1002 11:07:22.760754 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-mg42f" Oct 02 11:07:24 crc kubenswrapper[4766]: I1002 11:07:24.080693 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dbwjf" event={"ID":"5504968e-95f4-4664-bbd0-958eb8efb21e","Type":"ContainerStarted","Data":"4a7df86351caaa6dc3b357235d296b2b382311b1054782ec06b60d54c4200fe5"} Oct 02 11:07:24 crc kubenswrapper[4766]: I1002 11:07:24.095219 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-dbwjf" podStartSLOduration=2.774784305 podStartE2EDuration="8.095199337s" podCreationTimestamp="2025-10-02 11:07:16 +0000 UTC" firstStartedPulling="2025-10-02 11:07:17.965852415 +0000 UTC m=+952.908723359" lastFinishedPulling="2025-10-02 11:07:23.286267447 +0000 UTC m=+958.229138391" observedRunningTime="2025-10-02 11:07:24.095182077 +0000 UTC m=+959.038053031" watchObservedRunningTime="2025-10-02 11:07:24.095199337 +0000 UTC m=+959.038070281" Oct 02 11:07:24 crc kubenswrapper[4766]: I1002 11:07:24.432494 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:07:24 crc kubenswrapper[4766]: I1002 11:07:24.432597 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:07:24 crc kubenswrapper[4766]: I1002 11:07:24.432648 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 11:07:24 crc kubenswrapper[4766]: I1002 11:07:24.433310 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c19749f939a14cc5cbc026d638c61fa14b50810b4586b8fd36f7ac6b16f32c80"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:07:24 crc kubenswrapper[4766]: I1002 11:07:24.433370 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://c19749f939a14cc5cbc026d638c61fa14b50810b4586b8fd36f7ac6b16f32c80" gracePeriod=600 Oct 02 11:07:25 crc kubenswrapper[4766]: I1002 11:07:25.091161 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" 
containerID="c19749f939a14cc5cbc026d638c61fa14b50810b4586b8fd36f7ac6b16f32c80" exitCode=0 Oct 02 11:07:25 crc kubenswrapper[4766]: I1002 11:07:25.091246 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"c19749f939a14cc5cbc026d638c61fa14b50810b4586b8fd36f7ac6b16f32c80"} Oct 02 11:07:25 crc kubenswrapper[4766]: I1002 11:07:25.091537 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"e9d8027960aa5ff2fdb64c8c9c88c1508201265b3f2ec5d57d7c673e50cbb5eb"} Oct 02 11:07:25 crc kubenswrapper[4766]: I1002 11:07:25.091556 4766 scope.go:117] "RemoveContainer" containerID="853a899410d52955e7ea02637bfd357fd7fb7ea7cb0e26bfedf87008f24173b5" Oct 02 11:07:27 crc kubenswrapper[4766]: I1002 11:07:27.601838 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:27 crc kubenswrapper[4766]: I1002 11:07:27.602466 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:27 crc kubenswrapper[4766]: I1002 11:07:27.607021 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:27 crc kubenswrapper[4766]: I1002 11:07:27.784039 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-mg42f" Oct 02 11:07:28 crc kubenswrapper[4766]: I1002 11:07:28.112829 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66579686d4-2bxjr" Oct 02 11:07:28 crc kubenswrapper[4766]: I1002 11:07:28.161234 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8z2x2"] Oct 02 11:07:37 crc kubenswrapper[4766]: I1002 11:07:37.835368 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-v9vf7" Oct 02 11:07:51 crc kubenswrapper[4766]: I1002 11:07:51.109839 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45"] Oct 02 11:07:51 crc kubenswrapper[4766]: I1002 11:07:51.111791 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" Oct 02 11:07:51 crc kubenswrapper[4766]: I1002 11:07:51.113812 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 11:07:51 crc kubenswrapper[4766]: I1002 11:07:51.119441 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45"] Oct 02 11:07:51 crc kubenswrapper[4766]: I1002 11:07:51.151450 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/148906ba-bbc3-498d-91e3-b542ebf88b0e-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45\" (UID: \"148906ba-bbc3-498d-91e3-b542ebf88b0e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" Oct 02 11:07:51 crc kubenswrapper[4766]: I1002 11:07:51.151524 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/148906ba-bbc3-498d-91e3-b542ebf88b0e-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45\" (UID: \"148906ba-bbc3-498d-91e3-b542ebf88b0e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" Oct 02 11:07:51 crc kubenswrapper[4766]: I1002 11:07:51.151708 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psbzt\" (UniqueName: \"kubernetes.io/projected/148906ba-bbc3-498d-91e3-b542ebf88b0e-kube-api-access-psbzt\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45\" (UID: \"148906ba-bbc3-498d-91e3-b542ebf88b0e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" Oct 02 11:07:51 crc kubenswrapper[4766]: I1002 11:07:51.252807 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/148906ba-bbc3-498d-91e3-b542ebf88b0e-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45\" (UID: \"148906ba-bbc3-498d-91e3-b542ebf88b0e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" Oct 02 11:07:51 crc kubenswrapper[4766]: I1002 11:07:51.252873 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/148906ba-bbc3-498d-91e3-b542ebf88b0e-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45\" (UID: \"148906ba-bbc3-498d-91e3-b542ebf88b0e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" Oct 02 11:07:51 crc kubenswrapper[4766]: I1002 11:07:51.252916 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psbzt\" (UniqueName: \"kubernetes.io/projected/148906ba-bbc3-498d-91e3-b542ebf88b0e-kube-api-access-psbzt\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45\" (UID: \"148906ba-bbc3-498d-91e3-b542ebf88b0e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" Oct 02 11:07:51 crc kubenswrapper[4766]: I1002 11:07:51.253414 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/148906ba-bbc3-498d-91e3-b542ebf88b0e-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45\" (UID: \"148906ba-bbc3-498d-91e3-b542ebf88b0e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" Oct 02 11:07:51 crc kubenswrapper[4766]: I1002 11:07:51.253488 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/148906ba-bbc3-498d-91e3-b542ebf88b0e-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45\" (UID: \"148906ba-bbc3-498d-91e3-b542ebf88b0e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" Oct 02 11:07:51 crc kubenswrapper[4766]: I1002 11:07:51.271414 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psbzt\" (UniqueName: \"kubernetes.io/projected/148906ba-bbc3-498d-91e3-b542ebf88b0e-kube-api-access-psbzt\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45\" (UID: \"148906ba-bbc3-498d-91e3-b542ebf88b0e\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" Oct 02 11:07:51 crc kubenswrapper[4766]: I1002 11:07:51.432230 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" Oct 02 11:07:51 crc kubenswrapper[4766]: I1002 11:07:51.616005 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45"] Oct 02 11:07:52 crc kubenswrapper[4766]: I1002 11:07:52.246357 4766 generic.go:334] "Generic (PLEG): container finished" podID="148906ba-bbc3-498d-91e3-b542ebf88b0e" containerID="13a9f41450f77600d83522be1f891ea65318d2887ac046272c9b9a6cee6c2c3a" exitCode=0 Oct 02 11:07:52 crc kubenswrapper[4766]: I1002 11:07:52.246460 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" event={"ID":"148906ba-bbc3-498d-91e3-b542ebf88b0e","Type":"ContainerDied","Data":"13a9f41450f77600d83522be1f891ea65318d2887ac046272c9b9a6cee6c2c3a"} Oct 02 11:07:52 crc kubenswrapper[4766]: I1002 11:07:52.246712 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" event={"ID":"148906ba-bbc3-498d-91e3-b542ebf88b0e","Type":"ContainerStarted","Data":"a109f7ac9f795e0a8f260444a637cb06c36f3c1fe38c91ad0c81bcae2a7c2c0b"} Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.204377 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-8z2x2" podUID="581ea4c4-072a-4bba-afc9-2f82918ac0c9" containerName="console" containerID="cri-o://346d18e5f3d1757e06ff483e7ccf11c59e3fe37ba77d78dcbcbf3627115e5afb" gracePeriod=15 Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.577132 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8z2x2_581ea4c4-072a-4bba-afc9-2f82918ac0c9/console/0.log" Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.577404 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.699252 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-oauth-serving-cert\") pod \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.699326 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-oauth-config\") pod \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.699352 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-service-ca\") pod \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.699383 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8h42\" (UniqueName: \"kubernetes.io/projected/581ea4c4-072a-4bba-afc9-2f82918ac0c9-kube-api-access-f8h42\") pod \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.699412 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-config\") pod \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.699427 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-serving-cert\") pod \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.699447 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-trusted-ca-bundle\") pod \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\" (UID: \"581ea4c4-072a-4bba-afc9-2f82918ac0c9\") " Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.700463 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-service-ca" (OuterVolumeSpecName: "service-ca") pod "581ea4c4-072a-4bba-afc9-2f82918ac0c9" (UID: "581ea4c4-072a-4bba-afc9-2f82918ac0c9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.700474 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "581ea4c4-072a-4bba-afc9-2f82918ac0c9" (UID: "581ea4c4-072a-4bba-afc9-2f82918ac0c9"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.701390 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-config" (OuterVolumeSpecName: "console-config") pod "581ea4c4-072a-4bba-afc9-2f82918ac0c9" (UID: "581ea4c4-072a-4bba-afc9-2f82918ac0c9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.704585 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "581ea4c4-072a-4bba-afc9-2f82918ac0c9" (UID: "581ea4c4-072a-4bba-afc9-2f82918ac0c9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.719174 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/581ea4c4-072a-4bba-afc9-2f82918ac0c9-kube-api-access-f8h42" (OuterVolumeSpecName: "kube-api-access-f8h42") pod "581ea4c4-072a-4bba-afc9-2f82918ac0c9" (UID: "581ea4c4-072a-4bba-afc9-2f82918ac0c9"). InnerVolumeSpecName "kube-api-access-f8h42". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.719854 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "581ea4c4-072a-4bba-afc9-2f82918ac0c9" (UID: "581ea4c4-072a-4bba-afc9-2f82918ac0c9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.720014 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "581ea4c4-072a-4bba-afc9-2f82918ac0c9" (UID: "581ea4c4-072a-4bba-afc9-2f82918ac0c9"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.800857 4766 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.800904 4766 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.800921 4766 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.800933 4766 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-service-ca\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.800945 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8h42\" (UniqueName: \"kubernetes.io/projected/581ea4c4-072a-4bba-afc9-2f82918ac0c9-kube-api-access-f8h42\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.800957 4766 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:53 crc kubenswrapper[4766]: I1002 11:07:53.800970 4766 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/581ea4c4-072a-4bba-afc9-2f82918ac0c9-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:54 crc kubenswrapper[4766]: I1002 11:07:54.259799 4766 generic.go:334] "Generic (PLEG): container finished" podID="148906ba-bbc3-498d-91e3-b542ebf88b0e" containerID="12680dbc0b7dd3b5efc35889141f1eeb11fcf97de6c2e1072e4ff5d68d7a2823" exitCode=0 Oct 02 11:07:54 crc kubenswrapper[4766]: I1002 11:07:54.259901 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" event={"ID":"148906ba-bbc3-498d-91e3-b542ebf88b0e","Type":"ContainerDied","Data":"12680dbc0b7dd3b5efc35889141f1eeb11fcf97de6c2e1072e4ff5d68d7a2823"} Oct 02 11:07:54 crc kubenswrapper[4766]: I1002 11:07:54.263340 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-8z2x2_581ea4c4-072a-4bba-afc9-2f82918ac0c9/console/0.log" Oct 02 11:07:54 crc kubenswrapper[4766]: I1002 11:07:54.263375 4766 generic.go:334] "Generic (PLEG): container finished" podID="581ea4c4-072a-4bba-afc9-2f82918ac0c9" containerID="346d18e5f3d1757e06ff483e7ccf11c59e3fe37ba77d78dcbcbf3627115e5afb" exitCode=2 Oct 02 11:07:54 crc kubenswrapper[4766]: I1002 11:07:54.263412 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-8z2x2" Oct 02 11:07:54 crc kubenswrapper[4766]: I1002 11:07:54.263412 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8z2x2" event={"ID":"581ea4c4-072a-4bba-afc9-2f82918ac0c9","Type":"ContainerDied","Data":"346d18e5f3d1757e06ff483e7ccf11c59e3fe37ba77d78dcbcbf3627115e5afb"} Oct 02 11:07:54 crc kubenswrapper[4766]: I1002 11:07:54.263556 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-8z2x2" event={"ID":"581ea4c4-072a-4bba-afc9-2f82918ac0c9","Type":"ContainerDied","Data":"4b5cdf54df5f023e04a3209ab1a317a8f42af073dc3cc27fdfa7c34c59f3f98d"} Oct 02 11:07:54 crc kubenswrapper[4766]: I1002 11:07:54.263580 4766 scope.go:117] "RemoveContainer" containerID="346d18e5f3d1757e06ff483e7ccf11c59e3fe37ba77d78dcbcbf3627115e5afb" Oct 02 11:07:54 crc kubenswrapper[4766]: I1002 11:07:54.289707 4766 scope.go:117] "RemoveContainer" containerID="346d18e5f3d1757e06ff483e7ccf11c59e3fe37ba77d78dcbcbf3627115e5afb" Oct 02 11:07:54 crc kubenswrapper[4766]: E1002 11:07:54.290314 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"346d18e5f3d1757e06ff483e7ccf11c59e3fe37ba77d78dcbcbf3627115e5afb\": container with ID starting with 346d18e5f3d1757e06ff483e7ccf11c59e3fe37ba77d78dcbcbf3627115e5afb not found: ID does not exist" containerID="346d18e5f3d1757e06ff483e7ccf11c59e3fe37ba77d78dcbcbf3627115e5afb" Oct 02 11:07:54 crc kubenswrapper[4766]: I1002 11:07:54.290355 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346d18e5f3d1757e06ff483e7ccf11c59e3fe37ba77d78dcbcbf3627115e5afb"} err="failed to get container status \"346d18e5f3d1757e06ff483e7ccf11c59e3fe37ba77d78dcbcbf3627115e5afb\": rpc error: code = NotFound desc = could not find container \"346d18e5f3d1757e06ff483e7ccf11c59e3fe37ba77d78dcbcbf3627115e5afb\": container with ID starting with 346d18e5f3d1757e06ff483e7ccf11c59e3fe37ba77d78dcbcbf3627115e5afb not found: ID does not exist" Oct 02 11:07:54 crc kubenswrapper[4766]: I1002 11:07:54.300211 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-8z2x2"] Oct 02 11:07:54 crc kubenswrapper[4766]: I1002 11:07:54.302991 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-8z2x2"] Oct 02 11:07:55 crc kubenswrapper[4766]: I1002 11:07:55.270366 4766 generic.go:334] "Generic (PLEG): container finished" podID="148906ba-bbc3-498d-91e3-b542ebf88b0e" containerID="c096c68abe124121407a47f0dc626667d0797e23088d44448d53fe659f359ae2" exitCode=0 Oct 02 11:07:55 crc kubenswrapper[4766]: I1002 11:07:55.270416 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" event={"ID":"148906ba-bbc3-498d-91e3-b542ebf88b0e","Type":"ContainerDied","Data":"c096c68abe124121407a47f0dc626667d0797e23088d44448d53fe659f359ae2"} Oct 02 11:07:55 crc kubenswrapper[4766]: I1002 11:07:55.888915 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="581ea4c4-072a-4bba-afc9-2f82918ac0c9" path="/var/lib/kubelet/pods/581ea4c4-072a-4bba-afc9-2f82918ac0c9/volumes" Oct 02 11:07:56 crc kubenswrapper[4766]: I1002 11:07:56.479984 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" Oct 02 11:07:56 crc kubenswrapper[4766]: I1002 11:07:56.536543 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/148906ba-bbc3-498d-91e3-b542ebf88b0e-bundle\") pod \"148906ba-bbc3-498d-91e3-b542ebf88b0e\" (UID: \"148906ba-bbc3-498d-91e3-b542ebf88b0e\") " Oct 02 11:07:56 crc kubenswrapper[4766]: I1002 11:07:56.536909 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psbzt\" (UniqueName: \"kubernetes.io/projected/148906ba-bbc3-498d-91e3-b542ebf88b0e-kube-api-access-psbzt\") pod \"148906ba-bbc3-498d-91e3-b542ebf88b0e\" (UID: \"148906ba-bbc3-498d-91e3-b542ebf88b0e\") " Oct 02 11:07:56 crc kubenswrapper[4766]: I1002 11:07:56.537014 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/148906ba-bbc3-498d-91e3-b542ebf88b0e-util\") pod \"148906ba-bbc3-498d-91e3-b542ebf88b0e\" (UID: \"148906ba-bbc3-498d-91e3-b542ebf88b0e\") " Oct 02 11:07:56 crc kubenswrapper[4766]: I1002 11:07:56.546308 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/148906ba-bbc3-498d-91e3-b542ebf88b0e-bundle" (OuterVolumeSpecName: "bundle") pod "148906ba-bbc3-498d-91e3-b542ebf88b0e" (UID: "148906ba-bbc3-498d-91e3-b542ebf88b0e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:07:56 crc kubenswrapper[4766]: I1002 11:07:56.546526 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/148906ba-bbc3-498d-91e3-b542ebf88b0e-kube-api-access-psbzt" (OuterVolumeSpecName: "kube-api-access-psbzt") pod "148906ba-bbc3-498d-91e3-b542ebf88b0e" (UID: "148906ba-bbc3-498d-91e3-b542ebf88b0e"). InnerVolumeSpecName "kube-api-access-psbzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:07:56 crc kubenswrapper[4766]: I1002 11:07:56.555356 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/148906ba-bbc3-498d-91e3-b542ebf88b0e-util" (OuterVolumeSpecName: "util") pod "148906ba-bbc3-498d-91e3-b542ebf88b0e" (UID: "148906ba-bbc3-498d-91e3-b542ebf88b0e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:07:56 crc kubenswrapper[4766]: I1002 11:07:56.638172 4766 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/148906ba-bbc3-498d-91e3-b542ebf88b0e-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:56 crc kubenswrapper[4766]: I1002 11:07:56.638206 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psbzt\" (UniqueName: \"kubernetes.io/projected/148906ba-bbc3-498d-91e3-b542ebf88b0e-kube-api-access-psbzt\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:56 crc kubenswrapper[4766]: I1002 11:07:56.638216 4766 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/148906ba-bbc3-498d-91e3-b542ebf88b0e-util\") on node \"crc\" DevicePath \"\"" Oct 02 11:07:57 crc kubenswrapper[4766]: I1002 11:07:57.284916 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" event={"ID":"148906ba-bbc3-498d-91e3-b542ebf88b0e","Type":"ContainerDied","Data":"a109f7ac9f795e0a8f260444a637cb06c36f3c1fe38c91ad0c81bcae2a7c2c0b"} Oct 02 11:07:57 crc kubenswrapper[4766]: I1002 11:07:57.284959 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a109f7ac9f795e0a8f260444a637cb06c36f3c1fe38c91ad0c81bcae2a7c2c0b" Oct 02 11:07:57 crc kubenswrapper[4766]: I1002 11:07:57.285020 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45" Oct 02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.884018 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-79dc498c69-l856r"] Oct 02 11:08:06 crc kubenswrapper[4766]: E1002 11:08:06.884842 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="581ea4c4-072a-4bba-afc9-2f82918ac0c9" containerName="console" Oct 02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.884859 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="581ea4c4-072a-4bba-afc9-2f82918ac0c9" containerName="console" Oct 02 11:08:06 crc kubenswrapper[4766]: E1002 11:08:06.884884 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148906ba-bbc3-498d-91e3-b542ebf88b0e" containerName="util" Oct 02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.884896 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="148906ba-bbc3-498d-91e3-b542ebf88b0e" containerName="util" Oct 02 11:08:06 crc kubenswrapper[4766]: E1002 11:08:06.884906 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148906ba-bbc3-498d-91e3-b542ebf88b0e" containerName="pull" Oct 02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.884913 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="148906ba-bbc3-498d-91e3-b542ebf88b0e" containerName="pull" Oct 02 11:08:06 crc kubenswrapper[4766]: E1002 11:08:06.884926 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="148906ba-bbc3-498d-91e3-b542ebf88b0e" containerName="extract" Oct 02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.884932 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="148906ba-bbc3-498d-91e3-b542ebf88b0e" containerName="extract" Oct 02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.885063 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="581ea4c4-072a-4bba-afc9-2f82918ac0c9" containerName="console" Oct 
02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.885078 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="148906ba-bbc3-498d-91e3-b542ebf88b0e" containerName="extract" Oct 02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.885480 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-79dc498c69-l856r" Oct 02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.892548 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.894554 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.894930 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.895248 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-gl2mq" Oct 02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.895004 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.920973 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-79dc498c69-l856r"] Oct 02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.970779 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a212e302-c57c-4d73-a1b3-94e720468352-apiservice-cert\") pod \"metallb-operator-controller-manager-79dc498c69-l856r\" (UID: \"a212e302-c57c-4d73-a1b3-94e720468352\") " pod="metallb-system/metallb-operator-controller-manager-79dc498c69-l856r" Oct 02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.971178 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a212e302-c57c-4d73-a1b3-94e720468352-webhook-cert\") pod \"metallb-operator-controller-manager-79dc498c69-l856r\" (UID: \"a212e302-c57c-4d73-a1b3-94e720468352\") " pod="metallb-system/metallb-operator-controller-manager-79dc498c69-l856r" Oct 02 11:08:06 crc kubenswrapper[4766]: I1002 11:08:06.971313 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc6zd\" (UniqueName: \"kubernetes.io/projected/a212e302-c57c-4d73-a1b3-94e720468352-kube-api-access-hc6zd\") pod \"metallb-operator-controller-manager-79dc498c69-l856r\" (UID: \"a212e302-c57c-4d73-a1b3-94e720468352\") " pod="metallb-system/metallb-operator-controller-manager-79dc498c69-l856r" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.072986 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a212e302-c57c-4d73-a1b3-94e720468352-apiservice-cert\") pod \"metallb-operator-controller-manager-79dc498c69-l856r\" (UID: \"a212e302-c57c-4d73-a1b3-94e720468352\") " pod="metallb-system/metallb-operator-controller-manager-79dc498c69-l856r" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.073053 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/a212e302-c57c-4d73-a1b3-94e720468352-webhook-cert\") pod \"metallb-operator-controller-manager-79dc498c69-l856r\" (UID: \"a212e302-c57c-4d73-a1b3-94e720468352\") " pod="metallb-system/metallb-operator-controller-manager-79dc498c69-l856r" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.073075 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc6zd\" (UniqueName: \"kubernetes.io/projected/a212e302-c57c-4d73-a1b3-94e720468352-kube-api-access-hc6zd\") pod \"metallb-operator-controller-manager-79dc498c69-l856r\" (UID: \"a212e302-c57c-4d73-a1b3-94e720468352\") " pod="metallb-system/metallb-operator-controller-manager-79dc498c69-l856r" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.079208 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a212e302-c57c-4d73-a1b3-94e720468352-apiservice-cert\") pod \"metallb-operator-controller-manager-79dc498c69-l856r\" (UID: \"a212e302-c57c-4d73-a1b3-94e720468352\") " pod="metallb-system/metallb-operator-controller-manager-79dc498c69-l856r" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.089605 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a212e302-c57c-4d73-a1b3-94e720468352-webhook-cert\") pod \"metallb-operator-controller-manager-79dc498c69-l856r\" (UID: \"a212e302-c57c-4d73-a1b3-94e720468352\") " pod="metallb-system/metallb-operator-controller-manager-79dc498c69-l856r" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.100407 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc6zd\" (UniqueName: \"kubernetes.io/projected/a212e302-c57c-4d73-a1b3-94e720468352-kube-api-access-hc6zd\") pod \"metallb-operator-controller-manager-79dc498c69-l856r\" (UID: \"a212e302-c57c-4d73-a1b3-94e720468352\") " pod="metallb-system/metallb-operator-controller-manager-79dc498c69-l856r" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.204858 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-79dc498c69-l856r" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.226263 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-596877795c-zts7d"] Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.226942 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-596877795c-zts7d" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.228323 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-9h65b" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.228542 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.228848 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.243766 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-596877795c-zts7d"] Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.275234 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/636126b8-3906-49f5-8434-c324ab667177-apiservice-cert\") pod \"metallb-operator-webhook-server-596877795c-zts7d\" (UID: \"636126b8-3906-49f5-8434-c324ab667177\") " pod="metallb-system/metallb-operator-webhook-server-596877795c-zts7d" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.275291 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x6wk\" (UniqueName: \"kubernetes.io/projected/636126b8-3906-49f5-8434-c324ab667177-kube-api-access-6x6wk\") pod \"metallb-operator-webhook-server-596877795c-zts7d\" (UID: \"636126b8-3906-49f5-8434-c324ab667177\") " pod="metallb-system/metallb-operator-webhook-server-596877795c-zts7d" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.275314 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/636126b8-3906-49f5-8434-c324ab667177-webhook-cert\") pod \"metallb-operator-webhook-server-596877795c-zts7d\" (UID: \"636126b8-3906-49f5-8434-c324ab667177\") " pod="metallb-system/metallb-operator-webhook-server-596877795c-zts7d" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.377168 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x6wk\" (UniqueName: \"kubernetes.io/projected/636126b8-3906-49f5-8434-c324ab667177-kube-api-access-6x6wk\") pod \"metallb-operator-webhook-server-596877795c-zts7d\" (UID: \"636126b8-3906-49f5-8434-c324ab667177\") " pod="metallb-system/metallb-operator-webhook-server-596877795c-zts7d" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.377729 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/636126b8-3906-49f5-8434-c324ab667177-webhook-cert\") pod \"metallb-operator-webhook-server-596877795c-zts7d\" (UID: \"636126b8-3906-49f5-8434-c324ab667177\") " pod="metallb-system/metallb-operator-webhook-server-596877795c-zts7d" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.377814 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/636126b8-3906-49f5-8434-c324ab667177-apiservice-cert\") pod \"metallb-operator-webhook-server-596877795c-zts7d\" (UID: \"636126b8-3906-49f5-8434-c324ab667177\") " pod="metallb-system/metallb-operator-webhook-server-596877795c-zts7d" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 
11:08:07.384036 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/636126b8-3906-49f5-8434-c324ab667177-apiservice-cert\") pod \"metallb-operator-webhook-server-596877795c-zts7d\" (UID: \"636126b8-3906-49f5-8434-c324ab667177\") " pod="metallb-system/metallb-operator-webhook-server-596877795c-zts7d" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.384105 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/636126b8-3906-49f5-8434-c324ab667177-webhook-cert\") pod \"metallb-operator-webhook-server-596877795c-zts7d\" (UID: \"636126b8-3906-49f5-8434-c324ab667177\") " pod="metallb-system/metallb-operator-webhook-server-596877795c-zts7d" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.406098 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x6wk\" (UniqueName: \"kubernetes.io/projected/636126b8-3906-49f5-8434-c324ab667177-kube-api-access-6x6wk\") pod \"metallb-operator-webhook-server-596877795c-zts7d\" (UID: \"636126b8-3906-49f5-8434-c324ab667177\") " pod="metallb-system/metallb-operator-webhook-server-596877795c-zts7d" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.492431 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-79dc498c69-l856r"] Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.577531 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-596877795c-zts7d" Oct 02 11:08:07 crc kubenswrapper[4766]: I1002 11:08:07.819750 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-596877795c-zts7d"] Oct 02 11:08:07 crc kubenswrapper[4766]: W1002 11:08:07.831743 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod636126b8_3906_49f5_8434_c324ab667177.slice/crio-8322a30226cf2d5da7b83271731bddf6d7618f364699a4c57a8375674bd70399 WatchSource:0}: Error finding container 8322a30226cf2d5da7b83271731bddf6d7618f364699a4c57a8375674bd70399: Status 404 returned error can't find the container with id 8322a30226cf2d5da7b83271731bddf6d7618f364699a4c57a8375674bd70399 Oct 02 11:08:08 crc kubenswrapper[4766]: I1002 11:08:08.338211 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-79dc498c69-l856r" event={"ID":"a212e302-c57c-4d73-a1b3-94e720468352","Type":"ContainerStarted","Data":"c817b18a2f091223ad0d4b9107a6c9bc6f1f9f89fea531b6a684fe531eb8278c"} Oct 02 11:08:08 crc kubenswrapper[4766]: I1002 11:08:08.341311 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-596877795c-zts7d" event={"ID":"636126b8-3906-49f5-8434-c324ab667177","Type":"ContainerStarted","Data":"8322a30226cf2d5da7b83271731bddf6d7618f364699a4c57a8375674bd70399"} Oct 02 11:08:13 crc kubenswrapper[4766]: I1002 11:08:13.380242 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-79dc498c69-l856r" event={"ID":"a212e302-c57c-4d73-a1b3-94e720468352","Type":"ContainerStarted","Data":"55df4e514cabc7af3fc15c18a1989778d9106a1073e62030376b21d273acba94"} Oct 02 11:08:13 crc kubenswrapper[4766]: I1002 11:08:13.381150 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-79dc498c69-l856r" Oct 02 11:08:13 crc kubenswrapper[4766]: I1002 11:08:13.381772 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-596877795c-zts7d" event={"ID":"636126b8-3906-49f5-8434-c324ab667177","Type":"ContainerStarted","Data":"a3446c707a12ec24e5817c3f7b23ba1d0e4e9a14a5b46666344c83358285d821"} Oct 02 11:08:13 crc kubenswrapper[4766]: I1002 11:08:13.381917 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-596877795c-zts7d" Oct 02 11:08:13 crc kubenswrapper[4766]: I1002 11:08:13.404767 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-79dc498c69-l856r" podStartSLOduration=2.416594497 podStartE2EDuration="7.404745992s" podCreationTimestamp="2025-10-02 11:08:06 +0000 UTC" firstStartedPulling="2025-10-02 11:08:07.50295332 +0000 UTC m=+1002.445824264" lastFinishedPulling="2025-10-02 11:08:12.491104815 +0000 UTC m=+1007.433975759" observedRunningTime="2025-10-02 11:08:13.401122815 +0000 UTC m=+1008.343993779" watchObservedRunningTime="2025-10-02 11:08:13.404745992 +0000 UTC m=+1008.347616946" Oct 02 11:08:13 crc kubenswrapper[4766]: I1002 11:08:13.423464 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-596877795c-zts7d" podStartSLOduration=1.748093769 podStartE2EDuration="6.42343081s" podCreationTimestamp="2025-10-02 11:08:07 +0000 UTC" firstStartedPulling="2025-10-02 11:08:07.835201217 +0000 UTC m=+1002.778072161" lastFinishedPulling="2025-10-02 11:08:12.510538258 +0000 UTC m=+1007.453409202" observedRunningTime="2025-10-02 11:08:13.420369413 +0000 UTC m=+1008.363240357" watchObservedRunningTime="2025-10-02 11:08:13.42343081 +0000 UTC m=+1008.366301754" Oct 02 11:08:27 crc kubenswrapper[4766]: I1002 11:08:27.582390 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-596877795c-zts7d" Oct 02 11:08:47 crc kubenswrapper[4766]: I1002 11:08:47.210128 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-79dc498c69-l856r" Oct 02 11:08:47 crc kubenswrapper[4766]: I1002 11:08:47.887847 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-52rmm"] Oct 02 11:08:47 crc kubenswrapper[4766]: I1002 11:08:47.888602 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-j49dz"] Oct 02 11:08:47 crc kubenswrapper[4766]: I1002 11:08:47.888830 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-52rmm" Oct 02 11:08:47 crc kubenswrapper[4766]: I1002 11:08:47.891630 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:47 crc kubenswrapper[4766]: I1002 11:08:47.891666 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 02 11:08:47 crc kubenswrapper[4766]: I1002 11:08:47.892262 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-fmqn7" Oct 02 11:08:47 crc kubenswrapper[4766]: I1002 11:08:47.893358 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 02 11:08:47 crc kubenswrapper[4766]: I1002 11:08:47.893968 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 02 11:08:47 crc kubenswrapper[4766]: I1002 11:08:47.899303 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-52rmm"] Oct 02 11:08:47 crc kubenswrapper[4766]: I1002 11:08:47.980564 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-qzdkz"] Oct 02 11:08:47 crc kubenswrapper[4766]: I1002 11:08:47.981693 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-qzdkz" Oct 02 11:08:47 crc kubenswrapper[4766]: I1002 11:08:47.985938 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 02 11:08:47 crc kubenswrapper[4766]: I1002 11:08:47.986103 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-tzm4w" Oct 02 11:08:47 crc kubenswrapper[4766]: I1002 11:08:47.985944 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:47.994546 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:47.997335 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-l68rk"] Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:47.999580 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-l68rk" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.009088 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.024315 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-l68rk"] Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.025112 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eadfcf0-faf8-455c-a0f7-f49298dffdee-metrics-certs\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.025252 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9402a3a2-7e5c-4d01-bd76-27ac148ca1cb-cert\") pod \"frr-k8s-webhook-server-64bf5d555-52rmm\" (UID: \"9402a3a2-7e5c-4d01-bd76-27ac148ca1cb\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-52rmm" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.025364 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nqgv\" (UniqueName: \"kubernetes.io/projected/9402a3a2-7e5c-4d01-bd76-27ac148ca1cb-kube-api-access-7nqgv\") pod \"frr-k8s-webhook-server-64bf5d555-52rmm\" (UID: \"9402a3a2-7e5c-4d01-bd76-27ac148ca1cb\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-52rmm" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.025491 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4eadfcf0-faf8-455c-a0f7-f49298dffdee-frr-startup\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.025614 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4eadfcf0-faf8-455c-a0f7-f49298dffdee-reloader\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.025711 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtsnj\" (UniqueName: \"kubernetes.io/projected/4eadfcf0-faf8-455c-a0f7-f49298dffdee-kube-api-access-wtsnj\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.025795 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4eadfcf0-faf8-455c-a0f7-f49298dffdee-frr-conf\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.025871 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4eadfcf0-faf8-455c-a0f7-f49298dffdee-metrics\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc 
kubenswrapper[4766]: I1002 11:08:48.025943 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4eadfcf0-faf8-455c-a0f7-f49298dffdee-frr-sockets\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.127057 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-memberlist\") pod \"speaker-qzdkz\" (UID: \"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd\") " pod="metallb-system/speaker-qzdkz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.127120 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b03add67-e52f-47b3-9936-b029f88e9f1b-cert\") pod \"controller-68d546b9d8-l68rk\" (UID: \"b03add67-e52f-47b3-9936-b029f88e9f1b\") " pod="metallb-system/controller-68d546b9d8-l68rk" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.127179 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-metallb-excludel2\") pod \"speaker-qzdkz\" (UID: \"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd\") " pod="metallb-system/speaker-qzdkz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.127217 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eadfcf0-faf8-455c-a0f7-f49298dffdee-metrics-certs\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.127243 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hczbz\" (UniqueName: \"kubernetes.io/projected/b03add67-e52f-47b3-9936-b029f88e9f1b-kube-api-access-hczbz\") pod \"controller-68d546b9d8-l68rk\" (UID: \"b03add67-e52f-47b3-9936-b029f88e9f1b\") " pod="metallb-system/controller-68d546b9d8-l68rk" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.127271 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9402a3a2-7e5c-4d01-bd76-27ac148ca1cb-cert\") pod \"frr-k8s-webhook-server-64bf5d555-52rmm\" (UID: \"9402a3a2-7e5c-4d01-bd76-27ac148ca1cb\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-52rmm" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.127299 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nqgv\" (UniqueName: \"kubernetes.io/projected/9402a3a2-7e5c-4d01-bd76-27ac148ca1cb-kube-api-access-7nqgv\") pod \"frr-k8s-webhook-server-64bf5d555-52rmm\" (UID: \"9402a3a2-7e5c-4d01-bd76-27ac148ca1cb\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-52rmm" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.127322 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-metrics-certs\") pod \"speaker-qzdkz\" (UID: \"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd\") " pod="metallb-system/speaker-qzdkz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.127338 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cldph\" (UniqueName: \"kubernetes.io/projected/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-kube-api-access-cldph\") pod \"speaker-qzdkz\" (UID: \"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd\") " pod="metallb-system/speaker-qzdkz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.127363 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4eadfcf0-faf8-455c-a0f7-f49298dffdee-frr-startup\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: E1002 11:08:48.127365 4766 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.127378 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4eadfcf0-faf8-455c-a0f7-f49298dffdee-reloader\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.127395 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtsnj\" (UniqueName: \"kubernetes.io/projected/4eadfcf0-faf8-455c-a0f7-f49298dffdee-kube-api-access-wtsnj\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: E1002 11:08:48.127415 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eadfcf0-faf8-455c-a0f7-f49298dffdee-metrics-certs podName:4eadfcf0-faf8-455c-a0f7-f49298dffdee nodeName:}" failed. No retries permitted until 2025-10-02 11:08:48.627397758 +0000 UTC m=+1043.570268702 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4eadfcf0-faf8-455c-a0f7-f49298dffdee-metrics-certs") pod "frr-k8s-j49dz" (UID: "4eadfcf0-faf8-455c-a0f7-f49298dffdee") : secret "frr-k8s-certs-secret" not found Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.128105 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b03add67-e52f-47b3-9936-b029f88e9f1b-metrics-certs\") pod \"controller-68d546b9d8-l68rk\" (UID: \"b03add67-e52f-47b3-9936-b029f88e9f1b\") " pod="metallb-system/controller-68d546b9d8-l68rk" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.128239 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4eadfcf0-faf8-455c-a0f7-f49298dffdee-frr-conf\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.128354 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4eadfcf0-faf8-455c-a0f7-f49298dffdee-metrics\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.128540 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4eadfcf0-faf8-455c-a0f7-f49298dffdee-frr-sockets\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.128690 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4eadfcf0-faf8-455c-a0f7-f49298dffdee-metrics\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.128245 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4eadfcf0-faf8-455c-a0f7-f49298dffdee-reloader\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.128617 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4eadfcf0-faf8-455c-a0f7-f49298dffdee-frr-startup\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.128562 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4eadfcf0-faf8-455c-a0f7-f49298dffdee-frr-conf\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.128867 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4eadfcf0-faf8-455c-a0f7-f49298dffdee-frr-sockets\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.133356 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9402a3a2-7e5c-4d01-bd76-27ac148ca1cb-cert\") pod \"frr-k8s-webhook-server-64bf5d555-52rmm\" (UID: \"9402a3a2-7e5c-4d01-bd76-27ac148ca1cb\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-52rmm" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.144398 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nqgv\" (UniqueName: \"kubernetes.io/projected/9402a3a2-7e5c-4d01-bd76-27ac148ca1cb-kube-api-access-7nqgv\") pod \"frr-k8s-webhook-server-64bf5d555-52rmm\" (UID: \"9402a3a2-7e5c-4d01-bd76-27ac148ca1cb\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-52rmm" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.144733 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtsnj\" (UniqueName: \"kubernetes.io/projected/4eadfcf0-faf8-455c-a0f7-f49298dffdee-kube-api-access-wtsnj\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.211209 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-52rmm" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.229837 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b03add67-e52f-47b3-9936-b029f88e9f1b-metrics-certs\") pod \"controller-68d546b9d8-l68rk\" (UID: \"b03add67-e52f-47b3-9936-b029f88e9f1b\") " pod="metallb-system/controller-68d546b9d8-l68rk" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.229923 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-memberlist\") pod \"speaker-qzdkz\" (UID: \"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd\") " pod="metallb-system/speaker-qzdkz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.229951 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b03add67-e52f-47b3-9936-b029f88e9f1b-cert\") pod \"controller-68d546b9d8-l68rk\" (UID: \"b03add67-e52f-47b3-9936-b029f88e9f1b\") " pod="metallb-system/controller-68d546b9d8-l68rk" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.229976 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-metallb-excludel2\") pod \"speaker-qzdkz\" (UID: \"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd\") " pod="metallb-system/speaker-qzdkz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.230009 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hczbz\" (UniqueName: \"kubernetes.io/projected/b03add67-e52f-47b3-9936-b029f88e9f1b-kube-api-access-hczbz\") pod \"controller-68d546b9d8-l68rk\" (UID: \"b03add67-e52f-47b3-9936-b029f88e9f1b\") " pod="metallb-system/controller-68d546b9d8-l68rk" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.230034 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-metrics-certs\") pod \"speaker-qzdkz\" (UID: \"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd\") " pod="metallb-system/speaker-qzdkz" Oct 02 11:08:48 crc 
kubenswrapper[4766]: I1002 11:08:48.230054 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cldph\" (UniqueName: \"kubernetes.io/projected/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-kube-api-access-cldph\") pod \"speaker-qzdkz\" (UID: \"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd\") " pod="metallb-system/speaker-qzdkz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.233319 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b03add67-e52f-47b3-9936-b029f88e9f1b-metrics-certs\") pod \"controller-68d546b9d8-l68rk\" (UID: \"b03add67-e52f-47b3-9936-b029f88e9f1b\") " pod="metallb-system/controller-68d546b9d8-l68rk" Oct 02 11:08:48 crc kubenswrapper[4766]: E1002 11:08:48.233406 4766 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 02 11:08:48 crc kubenswrapper[4766]: E1002 11:08:48.233450 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-memberlist podName:18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd nodeName:}" failed. No retries permitted until 2025-10-02 11:08:48.733435405 +0000 UTC m=+1043.676306349 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-memberlist") pod "speaker-qzdkz" (UID: "18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd") : secret "metallb-memberlist" not found Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.234540 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-metallb-excludel2\") pod \"speaker-qzdkz\" (UID: \"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd\") " pod="metallb-system/speaker-qzdkz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.238317 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-metrics-certs\") pod \"speaker-qzdkz\" (UID: \"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd\") " pod="metallb-system/speaker-qzdkz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.239215 4766 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.247907 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b03add67-e52f-47b3-9936-b029f88e9f1b-cert\") pod \"controller-68d546b9d8-l68rk\" (UID: \"b03add67-e52f-47b3-9936-b029f88e9f1b\") " pod="metallb-system/controller-68d546b9d8-l68rk" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.250061 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cldph\" (UniqueName: \"kubernetes.io/projected/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-kube-api-access-cldph\") pod \"speaker-qzdkz\" (UID: \"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd\") " pod="metallb-system/speaker-qzdkz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.252593 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hczbz\" (UniqueName: \"kubernetes.io/projected/b03add67-e52f-47b3-9936-b029f88e9f1b-kube-api-access-hczbz\") pod \"controller-68d546b9d8-l68rk\" (UID: \"b03add67-e52f-47b3-9936-b029f88e9f1b\") " pod="metallb-system/controller-68d546b9d8-l68rk" Oct 02 
11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.325317 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-68d546b9d8-l68rk" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.430224 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-52rmm"] Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.550678 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-l68rk"] Oct 02 11:08:48 crc kubenswrapper[4766]: W1002 11:08:48.563181 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03add67_e52f_47b3_9936_b029f88e9f1b.slice/crio-68645515eb539b8b4fc8bdecc9b01c4394dc3a9be790e1ed1561387c0cbe4e56 WatchSource:0}: Error finding container 68645515eb539b8b4fc8bdecc9b01c4394dc3a9be790e1ed1561387c0cbe4e56: Status 404 returned error can't find the container with id 68645515eb539b8b4fc8bdecc9b01c4394dc3a9be790e1ed1561387c0cbe4e56 Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.571920 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-l68rk" event={"ID":"b03add67-e52f-47b3-9936-b029f88e9f1b","Type":"ContainerStarted","Data":"68645515eb539b8b4fc8bdecc9b01c4394dc3a9be790e1ed1561387c0cbe4e56"} Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.572698 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-52rmm" event={"ID":"9402a3a2-7e5c-4d01-bd76-27ac148ca1cb","Type":"ContainerStarted","Data":"4c71f2cfb05631bede0bee4635d02a17f162abcbb34662e59b9d511bf747bc2d"} Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.636928 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eadfcf0-faf8-455c-a0f7-f49298dffdee-metrics-certs\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.642383 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4eadfcf0-faf8-455c-a0f7-f49298dffdee-metrics-certs\") pod \"frr-k8s-j49dz\" (UID: \"4eadfcf0-faf8-455c-a0f7-f49298dffdee\") " pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.738286 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-memberlist\") pod \"speaker-qzdkz\" (UID: \"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd\") " pod="metallb-system/speaker-qzdkz" Oct 02 11:08:48 crc kubenswrapper[4766]: E1002 11:08:48.738485 4766 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 02 11:08:48 crc kubenswrapper[4766]: E1002 11:08:48.738577 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-memberlist podName:18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd nodeName:}" failed. No retries permitted until 2025-10-02 11:08:49.738559671 +0000 UTC m=+1044.681430635 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-memberlist") pod "speaker-qzdkz" (UID: "18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd") : secret "metallb-memberlist" not found Oct 02 11:08:48 crc kubenswrapper[4766]: I1002 11:08:48.821190 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-j49dz" Oct 02 11:08:49 crc kubenswrapper[4766]: I1002 11:08:49.580151 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-l68rk" event={"ID":"b03add67-e52f-47b3-9936-b029f88e9f1b","Type":"ContainerStarted","Data":"eb8ef7838593e48c384da8616390b4749ce5afdea10ed513cfbeddcb86032ce9"} Oct 02 11:08:49 crc kubenswrapper[4766]: I1002 11:08:49.580205 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-l68rk" event={"ID":"b03add67-e52f-47b3-9936-b029f88e9f1b","Type":"ContainerStarted","Data":"04684c52d14cd0d29f8b5343482d478c8da68b4e10caaf97bdc5fca0f80d0b27"} Oct 02 11:08:49 crc kubenswrapper[4766]: I1002 11:08:49.580305 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-l68rk" Oct 02 11:08:49 crc kubenswrapper[4766]: I1002 11:08:49.581171 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j49dz" event={"ID":"4eadfcf0-faf8-455c-a0f7-f49298dffdee","Type":"ContainerStarted","Data":"86d618dab6ea6c972e623d607cf15f0f588bb5d5d71d578188b8736c0bfb83c2"} Oct 02 11:08:49 crc kubenswrapper[4766]: I1002 11:08:49.751624 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-memberlist\") pod \"speaker-qzdkz\" (UID: \"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd\") " pod="metallb-system/speaker-qzdkz" Oct 02 11:08:49 crc kubenswrapper[4766]: I1002 11:08:49.773637 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd-memberlist\") pod \"speaker-qzdkz\" (UID: \"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd\") " pod="metallb-system/speaker-qzdkz" Oct 02 11:08:49 crc kubenswrapper[4766]: I1002 11:08:49.808868 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-qzdkz" Oct 02 11:08:49 crc kubenswrapper[4766]: W1002 11:08:49.839111 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18cbb20b_0ac7_4b62_86f8_7dbcf7ae2afd.slice/crio-8a7ea623778461d5f6c78610a63a3755a0e8328802b8dbd18ad267404d06e2b1 WatchSource:0}: Error finding container 8a7ea623778461d5f6c78610a63a3755a0e8328802b8dbd18ad267404d06e2b1: Status 404 returned error can't find the container with id 8a7ea623778461d5f6c78610a63a3755a0e8328802b8dbd18ad267404d06e2b1 Oct 02 11:08:50 crc kubenswrapper[4766]: I1002 11:08:50.589419 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qzdkz" event={"ID":"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd","Type":"ContainerStarted","Data":"bf3ac908d73d4eeacbff979975c7de5488589a8cd69f3742002d81d1e265cca4"} Oct 02 11:08:50 crc kubenswrapper[4766]: I1002 11:08:50.589823 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qzdkz" event={"ID":"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd","Type":"ContainerStarted","Data":"9ad257e313306220c6ab7c8c43ac5edd447602391e228fc14b1ae2a80bb541b0"} Oct 02 11:08:50 crc kubenswrapper[4766]: I1002 11:08:50.589836 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qzdkz" event={"ID":"18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd","Type":"ContainerStarted","Data":"8a7ea623778461d5f6c78610a63a3755a0e8328802b8dbd18ad267404d06e2b1"} Oct 02 11:08:50 crc kubenswrapper[4766]: I1002 11:08:50.589985 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-qzdkz" Oct 02 11:08:50 crc kubenswrapper[4766]: I1002 11:08:50.609179 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-qzdkz" podStartSLOduration=3.609141021 podStartE2EDuration="3.609141021s" podCreationTimestamp="2025-10-02 11:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:08:50.608134708 +0000 UTC m=+1045.551005672" watchObservedRunningTime="2025-10-02 11:08:50.609141021 +0000 UTC m=+1045.552011965" Oct 02 11:08:50 crc kubenswrapper[4766]: I1002 11:08:50.611148 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-68d546b9d8-l68rk" podStartSLOduration=3.611138244 podStartE2EDuration="3.611138244s" podCreationTimestamp="2025-10-02 11:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:08:49.598468466 +0000 UTC m=+1044.541339420" watchObservedRunningTime="2025-10-02 11:08:50.611138244 +0000 UTC m=+1045.554009188" Oct 02 11:08:56 crc kubenswrapper[4766]: I1002 11:08:56.652356 4766 generic.go:334] "Generic (PLEG): container finished" podID="4eadfcf0-faf8-455c-a0f7-f49298dffdee" containerID="0cfe9d2a3c2be9f1df4c2872ec8b507642c34471c1baf20b3a26b120ba492f71" exitCode=0 Oct 02 11:08:56 crc kubenswrapper[4766]: I1002 11:08:56.652447 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j49dz" event={"ID":"4eadfcf0-faf8-455c-a0f7-f49298dffdee","Type":"ContainerDied","Data":"0cfe9d2a3c2be9f1df4c2872ec8b507642c34471c1baf20b3a26b120ba492f71"} Oct 02 11:08:56 crc kubenswrapper[4766]: I1002 11:08:56.655624 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-52rmm" 
event={"ID":"9402a3a2-7e5c-4d01-bd76-27ac148ca1cb","Type":"ContainerStarted","Data":"5aae18e4aaf9303559500d68a6933c1ba95ad10620bfc688b8c2f9e324897d7b"} Oct 02 11:08:56 crc kubenswrapper[4766]: I1002 11:08:56.655851 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-52rmm" Oct 02 11:08:56 crc kubenswrapper[4766]: I1002 11:08:56.694557 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-52rmm" podStartSLOduration=2.488535883 podStartE2EDuration="9.694536325s" podCreationTimestamp="2025-10-02 11:08:47 +0000 UTC" firstStartedPulling="2025-10-02 11:08:48.441750401 +0000 UTC m=+1043.384621345" lastFinishedPulling="2025-10-02 11:08:55.647750843 +0000 UTC m=+1050.590621787" observedRunningTime="2025-10-02 11:08:56.693101629 +0000 UTC m=+1051.635972583" watchObservedRunningTime="2025-10-02 11:08:56.694536325 +0000 UTC m=+1051.637407269" Oct 02 11:08:57 crc kubenswrapper[4766]: I1002 11:08:57.662731 4766 generic.go:334] "Generic (PLEG): container finished" podID="4eadfcf0-faf8-455c-a0f7-f49298dffdee" containerID="37580efca21b192f6710b0c86c3d9c4a2bdcefdbe7e68112f338a8add41dd152" exitCode=0 Oct 02 11:08:57 crc kubenswrapper[4766]: I1002 11:08:57.662793 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j49dz" event={"ID":"4eadfcf0-faf8-455c-a0f7-f49298dffdee","Type":"ContainerDied","Data":"37580efca21b192f6710b0c86c3d9c4a2bdcefdbe7e68112f338a8add41dd152"} Oct 02 11:08:58 crc kubenswrapper[4766]: I1002 11:08:58.329875 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-l68rk" Oct 02 11:08:58 crc kubenswrapper[4766]: I1002 11:08:58.677636 4766 generic.go:334] "Generic (PLEG): container finished" podID="4eadfcf0-faf8-455c-a0f7-f49298dffdee" containerID="60642329b606fb68fedddf9eb7b287272ed76477ef4d315fdb1a9f5fe909b78d" exitCode=0 Oct 02 11:08:58 crc kubenswrapper[4766]: I1002 11:08:58.677698 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j49dz" event={"ID":"4eadfcf0-faf8-455c-a0f7-f49298dffdee","Type":"ContainerDied","Data":"60642329b606fb68fedddf9eb7b287272ed76477ef4d315fdb1a9f5fe909b78d"} Oct 02 11:08:59 crc kubenswrapper[4766]: I1002 11:08:59.689480 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j49dz" event={"ID":"4eadfcf0-faf8-455c-a0f7-f49298dffdee","Type":"ContainerStarted","Data":"8b6275bfab853e7d7ddf200e0539fa6fa44c9b15d59b95ffcb05dd30b522450a"} Oct 02 11:08:59 crc kubenswrapper[4766]: I1002 11:08:59.690037 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j49dz" event={"ID":"4eadfcf0-faf8-455c-a0f7-f49298dffdee","Type":"ContainerStarted","Data":"338c22f90089f4624b21bebfdd0bdaa1a9d4733b4099a64bc7502c5aa1a40907"} Oct 02 11:08:59 crc kubenswrapper[4766]: I1002 11:08:59.690053 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j49dz" event={"ID":"4eadfcf0-faf8-455c-a0f7-f49298dffdee","Type":"ContainerStarted","Data":"a3514e2716a2f0dbb8010dab95c4e66ad61faec284d753f8ca1cac6c3ad63d54"} Oct 02 11:08:59 crc kubenswrapper[4766]: I1002 11:08:59.690065 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j49dz" event={"ID":"4eadfcf0-faf8-455c-a0f7-f49298dffdee","Type":"ContainerStarted","Data":"7ad6004d9a0df9c83e4a67eb32a6470bebc7ebd1d89c6eb6feae8efdd44d63d2"} Oct 02 11:08:59 crc kubenswrapper[4766]: I1002 
11:08:59.690076 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j49dz" event={"ID":"4eadfcf0-faf8-455c-a0f7-f49298dffdee","Type":"ContainerStarted","Data":"ab6975174b348a6e9d4f22ecbb41796f3683c934324fab5120f8286bf65d3723"} Oct 02 11:09:00 crc kubenswrapper[4766]: I1002 11:09:00.700970 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-j49dz" event={"ID":"4eadfcf0-faf8-455c-a0f7-f49298dffdee","Type":"ContainerStarted","Data":"3ba4cc5e08b68100acc93b74cff19d48fe46554ccee5b01575759998c3ed0aae"} Oct 02 11:09:00 crc kubenswrapper[4766]: I1002 11:09:00.701238 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-j49dz" Oct 02 11:09:00 crc kubenswrapper[4766]: I1002 11:09:00.725805 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-j49dz" podStartSLOduration=6.952685639 podStartE2EDuration="13.725785549s" podCreationTimestamp="2025-10-02 11:08:47 +0000 UTC" firstStartedPulling="2025-10-02 11:08:48.904353454 +0000 UTC m=+1043.847224398" lastFinishedPulling="2025-10-02 11:08:55.677453364 +0000 UTC m=+1050.620324308" observedRunningTime="2025-10-02 11:09:00.724349083 +0000 UTC m=+1055.667220047" watchObservedRunningTime="2025-10-02 11:09:00.725785549 +0000 UTC m=+1055.668656503" Oct 02 11:09:03 crc kubenswrapper[4766]: I1002 11:09:03.822707 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-j49dz" Oct 02 11:09:03 crc kubenswrapper[4766]: I1002 11:09:03.858960 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-j49dz" Oct 02 11:09:08 crc kubenswrapper[4766]: I1002 11:09:08.220601 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-52rmm" Oct 02 11:09:08 crc kubenswrapper[4766]: I1002 11:09:08.827121 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-j49dz" Oct 02 11:09:09 crc kubenswrapper[4766]: I1002 11:09:09.814148 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-qzdkz" Oct 02 11:09:11 crc kubenswrapper[4766]: I1002 11:09:11.221951 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx"] Oct 02 11:09:11 crc kubenswrapper[4766]: I1002 11:09:11.223041 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" Oct 02 11:09:11 crc kubenswrapper[4766]: I1002 11:09:11.225250 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 11:09:11 crc kubenswrapper[4766]: I1002 11:09:11.244744 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx"] Oct 02 11:09:11 crc kubenswrapper[4766]: I1002 11:09:11.301905 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1afef370-5e0c-402e-972b-6375f5c7a86e-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx\" (UID: \"1afef370-5e0c-402e-972b-6375f5c7a86e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" Oct 02 11:09:11 crc kubenswrapper[4766]: I1002 11:09:11.302298 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwtg9\" (UniqueName: \"kubernetes.io/projected/1afef370-5e0c-402e-972b-6375f5c7a86e-kube-api-access-gwtg9\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx\" (UID: \"1afef370-5e0c-402e-972b-6375f5c7a86e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" Oct 02 11:09:11 crc kubenswrapper[4766]: I1002 11:09:11.302590 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1afef370-5e0c-402e-972b-6375f5c7a86e-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx\" (UID: \"1afef370-5e0c-402e-972b-6375f5c7a86e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" Oct 02 11:09:11 crc kubenswrapper[4766]: I1002 11:09:11.404010 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1afef370-5e0c-402e-972b-6375f5c7a86e-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx\" (UID: \"1afef370-5e0c-402e-972b-6375f5c7a86e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" Oct 02 11:09:11 crc kubenswrapper[4766]: I1002 11:09:11.404096 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwtg9\" (UniqueName: \"kubernetes.io/projected/1afef370-5e0c-402e-972b-6375f5c7a86e-kube-api-access-gwtg9\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx\" (UID: \"1afef370-5e0c-402e-972b-6375f5c7a86e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" Oct 02 11:09:11 crc kubenswrapper[4766]: I1002 11:09:11.404175 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1afef370-5e0c-402e-972b-6375f5c7a86e-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx\" (UID: \"1afef370-5e0c-402e-972b-6375f5c7a86e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" Oct 02 11:09:11 crc kubenswrapper[4766]: I1002 11:09:11.404641 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/1afef370-5e0c-402e-972b-6375f5c7a86e-bundle\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx\" (UID: \"1afef370-5e0c-402e-972b-6375f5c7a86e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" Oct 02 11:09:11 crc kubenswrapper[4766]: I1002 11:09:11.404764 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1afef370-5e0c-402e-972b-6375f5c7a86e-util\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx\" (UID: \"1afef370-5e0c-402e-972b-6375f5c7a86e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" Oct 02 11:09:11 crc kubenswrapper[4766]: I1002 11:09:11.431469 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwtg9\" (UniqueName: \"kubernetes.io/projected/1afef370-5e0c-402e-972b-6375f5c7a86e-kube-api-access-gwtg9\") pod \"695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx\" (UID: \"1afef370-5e0c-402e-972b-6375f5c7a86e\") " pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" Oct 02 11:09:11 crc kubenswrapper[4766]: I1002 11:09:11.540409 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" Oct 02 11:09:11 crc kubenswrapper[4766]: I1002 11:09:11.785738 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx"] Oct 02 11:09:12 crc kubenswrapper[4766]: I1002 11:09:12.776905 4766 generic.go:334] "Generic (PLEG): container finished" podID="1afef370-5e0c-402e-972b-6375f5c7a86e" containerID="1b60c0c2d97f1ea5c9fece98f5394482fcd70dfd0db5883a6660e61930a07a69" exitCode=0 Oct 02 11:09:12 crc kubenswrapper[4766]: I1002 11:09:12.777038 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" event={"ID":"1afef370-5e0c-402e-972b-6375f5c7a86e","Type":"ContainerDied","Data":"1b60c0c2d97f1ea5c9fece98f5394482fcd70dfd0db5883a6660e61930a07a69"} Oct 02 11:09:12 crc kubenswrapper[4766]: I1002 11:09:12.777073 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" event={"ID":"1afef370-5e0c-402e-972b-6375f5c7a86e","Type":"ContainerStarted","Data":"ff1c62f23788ad98855ea196c6f37a3eeb8ebb8a0f1c7a7feccb305969f4ad48"} Oct 02 11:09:16 crc kubenswrapper[4766]: I1002 11:09:16.815537 4766 generic.go:334] "Generic (PLEG): container finished" podID="1afef370-5e0c-402e-972b-6375f5c7a86e" containerID="281477f5439b14ba09d2399cf1dba11a6f4707d3982ede64ee73e420004df845" exitCode=0 Oct 02 11:09:16 crc kubenswrapper[4766]: I1002 11:09:16.815639 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" event={"ID":"1afef370-5e0c-402e-972b-6375f5c7a86e","Type":"ContainerDied","Data":"281477f5439b14ba09d2399cf1dba11a6f4707d3982ede64ee73e420004df845"} Oct 02 11:09:17 crc kubenswrapper[4766]: I1002 11:09:17.825096 4766 generic.go:334] "Generic (PLEG): container finished" podID="1afef370-5e0c-402e-972b-6375f5c7a86e" containerID="dc0f54164605fb45f11daeb6a1b1541ef6a91dfca0607549592743c102206e08" exitCode=0 Oct 02 11:09:17 crc kubenswrapper[4766]: I1002 
11:09:17.825140 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" event={"ID":"1afef370-5e0c-402e-972b-6375f5c7a86e","Type":"ContainerDied","Data":"dc0f54164605fb45f11daeb6a1b1541ef6a91dfca0607549592743c102206e08"} Oct 02 11:09:19 crc kubenswrapper[4766]: I1002 11:09:19.050253 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" Oct 02 11:09:19 crc kubenswrapper[4766]: I1002 11:09:19.111125 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwtg9\" (UniqueName: \"kubernetes.io/projected/1afef370-5e0c-402e-972b-6375f5c7a86e-kube-api-access-gwtg9\") pod \"1afef370-5e0c-402e-972b-6375f5c7a86e\" (UID: \"1afef370-5e0c-402e-972b-6375f5c7a86e\") " Oct 02 11:09:19 crc kubenswrapper[4766]: I1002 11:09:19.111229 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1afef370-5e0c-402e-972b-6375f5c7a86e-util\") pod \"1afef370-5e0c-402e-972b-6375f5c7a86e\" (UID: \"1afef370-5e0c-402e-972b-6375f5c7a86e\") " Oct 02 11:09:19 crc kubenswrapper[4766]: I1002 11:09:19.111278 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1afef370-5e0c-402e-972b-6375f5c7a86e-bundle\") pod \"1afef370-5e0c-402e-972b-6375f5c7a86e\" (UID: \"1afef370-5e0c-402e-972b-6375f5c7a86e\") " Oct 02 11:09:19 crc kubenswrapper[4766]: I1002 11:09:19.112211 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1afef370-5e0c-402e-972b-6375f5c7a86e-bundle" (OuterVolumeSpecName: "bundle") pod "1afef370-5e0c-402e-972b-6375f5c7a86e" (UID: "1afef370-5e0c-402e-972b-6375f5c7a86e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:09:19 crc kubenswrapper[4766]: I1002 11:09:19.117747 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1afef370-5e0c-402e-972b-6375f5c7a86e-kube-api-access-gwtg9" (OuterVolumeSpecName: "kube-api-access-gwtg9") pod "1afef370-5e0c-402e-972b-6375f5c7a86e" (UID: "1afef370-5e0c-402e-972b-6375f5c7a86e"). InnerVolumeSpecName "kube-api-access-gwtg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:09:19 crc kubenswrapper[4766]: I1002 11:09:19.121122 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1afef370-5e0c-402e-972b-6375f5c7a86e-util" (OuterVolumeSpecName: "util") pod "1afef370-5e0c-402e-972b-6375f5c7a86e" (UID: "1afef370-5e0c-402e-972b-6375f5c7a86e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:09:19 crc kubenswrapper[4766]: I1002 11:09:19.213160 4766 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1afef370-5e0c-402e-972b-6375f5c7a86e-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:09:19 crc kubenswrapper[4766]: I1002 11:09:19.213198 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwtg9\" (UniqueName: \"kubernetes.io/projected/1afef370-5e0c-402e-972b-6375f5c7a86e-kube-api-access-gwtg9\") on node \"crc\" DevicePath \"\"" Oct 02 11:09:19 crc kubenswrapper[4766]: I1002 11:09:19.213210 4766 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1afef370-5e0c-402e-972b-6375f5c7a86e-util\") on node \"crc\" DevicePath \"\"" Oct 02 11:09:19 crc kubenswrapper[4766]: I1002 11:09:19.838573 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" event={"ID":"1afef370-5e0c-402e-972b-6375f5c7a86e","Type":"ContainerDied","Data":"ff1c62f23788ad98855ea196c6f37a3eeb8ebb8a0f1c7a7feccb305969f4ad48"} Oct 02 11:09:19 crc kubenswrapper[4766]: I1002 11:09:19.838618 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff1c62f23788ad98855ea196c6f37a3eeb8ebb8a0f1c7a7feccb305969f4ad48" Oct 02 11:09:19 crc kubenswrapper[4766]: I1002 11:09:19.838632 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx" Oct 02 11:09:24 crc kubenswrapper[4766]: I1002 11:09:24.431830 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:09:24 crc kubenswrapper[4766]: I1002 11:09:24.432253 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:09:32 crc kubenswrapper[4766]: I1002 11:09:32.675891 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdf5l"] Oct 02 11:09:32 crc kubenswrapper[4766]: E1002 11:09:32.677342 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1afef370-5e0c-402e-972b-6375f5c7a86e" containerName="pull" Oct 02 11:09:32 crc kubenswrapper[4766]: I1002 11:09:32.677418 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1afef370-5e0c-402e-972b-6375f5c7a86e" containerName="pull" Oct 02 11:09:32 crc kubenswrapper[4766]: E1002 11:09:32.677460 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1afef370-5e0c-402e-972b-6375f5c7a86e" containerName="util" Oct 02 11:09:32 crc kubenswrapper[4766]: I1002 11:09:32.677466 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1afef370-5e0c-402e-972b-6375f5c7a86e" containerName="util" Oct 02 11:09:32 crc kubenswrapper[4766]: E1002 11:09:32.677473 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1afef370-5e0c-402e-972b-6375f5c7a86e" containerName="extract" Oct 02 
11:09:32 crc kubenswrapper[4766]: I1002 11:09:32.677479 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1afef370-5e0c-402e-972b-6375f5c7a86e" containerName="extract" Oct 02 11:09:32 crc kubenswrapper[4766]: I1002 11:09:32.677855 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1afef370-5e0c-402e-972b-6375f5c7a86e" containerName="extract" Oct 02 11:09:32 crc kubenswrapper[4766]: I1002 11:09:32.678525 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdf5l" Oct 02 11:09:32 crc kubenswrapper[4766]: I1002 11:09:32.681872 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Oct 02 11:09:32 crc kubenswrapper[4766]: I1002 11:09:32.682140 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Oct 02 11:09:32 crc kubenswrapper[4766]: I1002 11:09:32.682572 4766 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-wnwp8" Oct 02 11:09:32 crc kubenswrapper[4766]: I1002 11:09:32.688806 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdf5l"] Oct 02 11:09:32 crc kubenswrapper[4766]: I1002 11:09:32.811116 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgjxw\" (UniqueName: \"kubernetes.io/projected/2bcaafff-feb2-41a3-8946-9e66952f15e8-kube-api-access-sgjxw\") pod \"cert-manager-operator-controller-manager-57cd46d6d-bdf5l\" (UID: \"2bcaafff-feb2-41a3-8946-9e66952f15e8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdf5l" Oct 02 11:09:32 crc kubenswrapper[4766]: I1002 11:09:32.912211 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgjxw\" (UniqueName: \"kubernetes.io/projected/2bcaafff-feb2-41a3-8946-9e66952f15e8-kube-api-access-sgjxw\") pod \"cert-manager-operator-controller-manager-57cd46d6d-bdf5l\" (UID: \"2bcaafff-feb2-41a3-8946-9e66952f15e8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdf5l" Oct 02 11:09:32 crc kubenswrapper[4766]: I1002 11:09:32.935398 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgjxw\" (UniqueName: \"kubernetes.io/projected/2bcaafff-feb2-41a3-8946-9e66952f15e8-kube-api-access-sgjxw\") pod \"cert-manager-operator-controller-manager-57cd46d6d-bdf5l\" (UID: \"2bcaafff-feb2-41a3-8946-9e66952f15e8\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdf5l" Oct 02 11:09:33 crc kubenswrapper[4766]: I1002 11:09:33.000372 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdf5l" Oct 02 11:09:33 crc kubenswrapper[4766]: I1002 11:09:33.410922 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdf5l"] Oct 02 11:09:33 crc kubenswrapper[4766]: I1002 11:09:33.925811 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdf5l" event={"ID":"2bcaafff-feb2-41a3-8946-9e66952f15e8","Type":"ContainerStarted","Data":"4ee2abb4ab45d95518c17c15ef4bf3aeacb10229f4b0eab2d7ba7b9104538347"} Oct 02 11:09:39 crc kubenswrapper[4766]: I1002 11:09:39.962639 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdf5l" event={"ID":"2bcaafff-feb2-41a3-8946-9e66952f15e8","Type":"ContainerStarted","Data":"26f45476f8bb6348b909556ceb4cabd127dd638891678a08db888a65f2c2f21b"} Oct 02 11:09:43 crc kubenswrapper[4766]: I1002 11:09:43.056231 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-57cd46d6d-bdf5l" podStartSLOduration=4.777382216 podStartE2EDuration="11.056210669s" podCreationTimestamp="2025-10-02 11:09:32 +0000 UTC" firstStartedPulling="2025-10-02 11:09:33.429893113 +0000 UTC m=+1088.372764057" lastFinishedPulling="2025-10-02 11:09:39.708721566 +0000 UTC m=+1094.651592510" observedRunningTime="2025-10-02 11:09:39.988294275 +0000 UTC m=+1094.931165219" watchObservedRunningTime="2025-10-02 11:09:43.056210669 +0000 UTC m=+1097.999081613" Oct 02 11:09:43 crc kubenswrapper[4766]: I1002 11:09:43.058653 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-ggcgs"] Oct 02 11:09:43 crc kubenswrapper[4766]: I1002 11:09:43.059365 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-ggcgs" Oct 02 11:09:43 crc kubenswrapper[4766]: I1002 11:09:43.062688 4766 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-zprpv" Oct 02 11:09:43 crc kubenswrapper[4766]: I1002 11:09:43.063338 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 02 11:09:43 crc kubenswrapper[4766]: I1002 11:09:43.064097 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 02 11:09:43 crc kubenswrapper[4766]: I1002 11:09:43.066337 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-ggcgs"] Oct 02 11:09:43 crc kubenswrapper[4766]: I1002 11:09:43.148681 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/64245672-eb6a-4b99-9550-ac59d359dddf-bound-sa-token\") pod \"cert-manager-webhook-d969966f-ggcgs\" (UID: \"64245672-eb6a-4b99-9550-ac59d359dddf\") " pod="cert-manager/cert-manager-webhook-d969966f-ggcgs" Oct 02 11:09:43 crc kubenswrapper[4766]: I1002 11:09:43.148878 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mnh7\" (UniqueName: \"kubernetes.io/projected/64245672-eb6a-4b99-9550-ac59d359dddf-kube-api-access-6mnh7\") pod \"cert-manager-webhook-d969966f-ggcgs\" (UID: \"64245672-eb6a-4b99-9550-ac59d359dddf\") " pod="cert-manager/cert-manager-webhook-d969966f-ggcgs" Oct 02 11:09:43 crc kubenswrapper[4766]: I1002 11:09:43.250486 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mnh7\" (UniqueName: \"kubernetes.io/projected/64245672-eb6a-4b99-9550-ac59d359dddf-kube-api-access-6mnh7\") pod \"cert-manager-webhook-d969966f-ggcgs\" (UID: \"64245672-eb6a-4b99-9550-ac59d359dddf\") " pod="cert-manager/cert-manager-webhook-d969966f-ggcgs" Oct 02 11:09:43 crc kubenswrapper[4766]: I1002 11:09:43.250599 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/64245672-eb6a-4b99-9550-ac59d359dddf-bound-sa-token\") pod \"cert-manager-webhook-d969966f-ggcgs\" (UID: \"64245672-eb6a-4b99-9550-ac59d359dddf\") " pod="cert-manager/cert-manager-webhook-d969966f-ggcgs" Oct 02 11:09:43 crc kubenswrapper[4766]: I1002 11:09:43.271089 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/64245672-eb6a-4b99-9550-ac59d359dddf-bound-sa-token\") pod \"cert-manager-webhook-d969966f-ggcgs\" (UID: \"64245672-eb6a-4b99-9550-ac59d359dddf\") " pod="cert-manager/cert-manager-webhook-d969966f-ggcgs" Oct 02 11:09:43 crc kubenswrapper[4766]: I1002 11:09:43.272665 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mnh7\" (UniqueName: \"kubernetes.io/projected/64245672-eb6a-4b99-9550-ac59d359dddf-kube-api-access-6mnh7\") pod \"cert-manager-webhook-d969966f-ggcgs\" (UID: \"64245672-eb6a-4b99-9550-ac59d359dddf\") " pod="cert-manager/cert-manager-webhook-d969966f-ggcgs" Oct 02 11:09:43 crc kubenswrapper[4766]: I1002 11:09:43.375077 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-d969966f-ggcgs" Oct 02 11:09:43 crc kubenswrapper[4766]: I1002 11:09:43.623946 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-d969966f-ggcgs"] Oct 02 11:09:43 crc kubenswrapper[4766]: W1002 11:09:43.632664 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64245672_eb6a_4b99_9550_ac59d359dddf.slice/crio-7c86d52ab5ccfdffd2b1aa4d44b924ffd019358554ee3a13803be87f2e2e80b5 WatchSource:0}: Error finding container 7c86d52ab5ccfdffd2b1aa4d44b924ffd019358554ee3a13803be87f2e2e80b5: Status 404 returned error can't find the container with id 7c86d52ab5ccfdffd2b1aa4d44b924ffd019358554ee3a13803be87f2e2e80b5 Oct 02 11:09:43 crc kubenswrapper[4766]: I1002 11:09:43.986916 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-ggcgs" event={"ID":"64245672-eb6a-4b99-9550-ac59d359dddf","Type":"ContainerStarted","Data":"7c86d52ab5ccfdffd2b1aa4d44b924ffd019358554ee3a13803be87f2e2e80b5"} Oct 02 11:09:46 crc kubenswrapper[4766]: I1002 11:09:46.005004 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-rfgbm"] Oct 02 11:09:46 crc kubenswrapper[4766]: I1002 11:09:46.008763 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-rfgbm" Oct 02 11:09:46 crc kubenswrapper[4766]: I1002 11:09:46.015151 4766 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-dd964" Oct 02 11:09:46 crc kubenswrapper[4766]: I1002 11:09:46.016721 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-rfgbm"] Oct 02 11:09:46 crc kubenswrapper[4766]: I1002 11:09:46.097185 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c13af787-6251-4bbe-88b2-e47927aabd14-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-rfgbm\" (UID: \"c13af787-6251-4bbe-88b2-e47927aabd14\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-rfgbm" Oct 02 11:09:46 crc kubenswrapper[4766]: I1002 11:09:46.097385 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pf4r\" (UniqueName: \"kubernetes.io/projected/c13af787-6251-4bbe-88b2-e47927aabd14-kube-api-access-5pf4r\") pod \"cert-manager-cainjector-7d9f95dbf-rfgbm\" (UID: \"c13af787-6251-4bbe-88b2-e47927aabd14\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-rfgbm" Oct 02 11:09:46 crc kubenswrapper[4766]: I1002 11:09:46.199359 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pf4r\" (UniqueName: \"kubernetes.io/projected/c13af787-6251-4bbe-88b2-e47927aabd14-kube-api-access-5pf4r\") pod \"cert-manager-cainjector-7d9f95dbf-rfgbm\" (UID: \"c13af787-6251-4bbe-88b2-e47927aabd14\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-rfgbm" Oct 02 11:09:46 crc kubenswrapper[4766]: I1002 11:09:46.199439 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c13af787-6251-4bbe-88b2-e47927aabd14-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-rfgbm\" (UID: \"c13af787-6251-4bbe-88b2-e47927aabd14\") " 
pod="cert-manager/cert-manager-cainjector-7d9f95dbf-rfgbm" Oct 02 11:09:46 crc kubenswrapper[4766]: I1002 11:09:46.218768 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pf4r\" (UniqueName: \"kubernetes.io/projected/c13af787-6251-4bbe-88b2-e47927aabd14-kube-api-access-5pf4r\") pod \"cert-manager-cainjector-7d9f95dbf-rfgbm\" (UID: \"c13af787-6251-4bbe-88b2-e47927aabd14\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-rfgbm" Oct 02 11:09:46 crc kubenswrapper[4766]: I1002 11:09:46.230530 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c13af787-6251-4bbe-88b2-e47927aabd14-bound-sa-token\") pod \"cert-manager-cainjector-7d9f95dbf-rfgbm\" (UID: \"c13af787-6251-4bbe-88b2-e47927aabd14\") " pod="cert-manager/cert-manager-cainjector-7d9f95dbf-rfgbm" Oct 02 11:09:46 crc kubenswrapper[4766]: I1002 11:09:46.342483 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-rfgbm" Oct 02 11:09:46 crc kubenswrapper[4766]: I1002 11:09:46.864976 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7d9f95dbf-rfgbm"] Oct 02 11:09:47 crc kubenswrapper[4766]: I1002 11:09:47.010259 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-rfgbm" event={"ID":"c13af787-6251-4bbe-88b2-e47927aabd14","Type":"ContainerStarted","Data":"bbb76194cac3d39a7fb3cc8ed5aea5e3d5805947924267d1042290e61d533bd1"} Oct 02 11:09:49 crc kubenswrapper[4766]: I1002 11:09:49.021690 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-rfgbm" event={"ID":"c13af787-6251-4bbe-88b2-e47927aabd14","Type":"ContainerStarted","Data":"4e81b9e988986278fc59e08967d79a35778044f0e0c658f4f315e9163047abf3"} Oct 02 11:09:49 crc kubenswrapper[4766]: I1002 11:09:49.023799 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-d969966f-ggcgs" event={"ID":"64245672-eb6a-4b99-9550-ac59d359dddf","Type":"ContainerStarted","Data":"62a73b85375770c94e24cd77723034d987a73525e6e7d590ddd29b7d187e50cf"} Oct 02 11:09:49 crc kubenswrapper[4766]: I1002 11:09:49.024153 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-d969966f-ggcgs" Oct 02 11:09:49 crc kubenswrapper[4766]: I1002 11:09:49.035255 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7d9f95dbf-rfgbm" podStartSLOduration=2.318586469 podStartE2EDuration="4.03524051s" podCreationTimestamp="2025-10-02 11:09:45 +0000 UTC" firstStartedPulling="2025-10-02 11:09:46.873753434 +0000 UTC m=+1101.816624378" lastFinishedPulling="2025-10-02 11:09:48.590407475 +0000 UTC m=+1103.533278419" observedRunningTime="2025-10-02 11:09:49.035145447 +0000 UTC m=+1103.978016391" watchObservedRunningTime="2025-10-02 11:09:49.03524051 +0000 UTC m=+1103.978111454" Oct 02 11:09:49 crc kubenswrapper[4766]: I1002 11:09:49.057854 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-d969966f-ggcgs" podStartSLOduration=1.095203161 podStartE2EDuration="6.057833172s" podCreationTimestamp="2025-10-02 11:09:43 +0000 UTC" firstStartedPulling="2025-10-02 11:09:43.634456618 +0000 UTC m=+1098.577327562" lastFinishedPulling="2025-10-02 11:09:48.597086629 +0000 UTC m=+1103.539957573" 
observedRunningTime="2025-10-02 11:09:49.053992329 +0000 UTC m=+1103.996863263" watchObservedRunningTime="2025-10-02 11:09:49.057833172 +0000 UTC m=+1104.000704126" Oct 02 11:09:53 crc kubenswrapper[4766]: I1002 11:09:53.378394 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-d969966f-ggcgs" Oct 02 11:09:54 crc kubenswrapper[4766]: I1002 11:09:54.432093 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:09:54 crc kubenswrapper[4766]: I1002 11:09:54.432148 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:10:02 crc kubenswrapper[4766]: I1002 11:10:02.144541 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-krw9f"] Oct 02 11:10:02 crc kubenswrapper[4766]: I1002 11:10:02.146145 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-krw9f" Oct 02 11:10:02 crc kubenswrapper[4766]: I1002 11:10:02.152703 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-krw9f"] Oct 02 11:10:02 crc kubenswrapper[4766]: I1002 11:10:02.196448 4766 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-wt2gj" Oct 02 11:10:02 crc kubenswrapper[4766]: I1002 11:10:02.213032 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrpcc\" (UniqueName: \"kubernetes.io/projected/e6b26aea-d5dc-4599-baee-0d3046b6f822-kube-api-access-jrpcc\") pod \"cert-manager-7d4cc89fcb-krw9f\" (UID: \"e6b26aea-d5dc-4599-baee-0d3046b6f822\") " pod="cert-manager/cert-manager-7d4cc89fcb-krw9f" Oct 02 11:10:02 crc kubenswrapper[4766]: I1002 11:10:02.213110 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6b26aea-d5dc-4599-baee-0d3046b6f822-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-krw9f\" (UID: \"e6b26aea-d5dc-4599-baee-0d3046b6f822\") " pod="cert-manager/cert-manager-7d4cc89fcb-krw9f" Oct 02 11:10:02 crc kubenswrapper[4766]: I1002 11:10:02.314051 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6b26aea-d5dc-4599-baee-0d3046b6f822-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-krw9f\" (UID: \"e6b26aea-d5dc-4599-baee-0d3046b6f822\") " pod="cert-manager/cert-manager-7d4cc89fcb-krw9f" Oct 02 11:10:02 crc kubenswrapper[4766]: I1002 11:10:02.314177 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrpcc\" (UniqueName: \"kubernetes.io/projected/e6b26aea-d5dc-4599-baee-0d3046b6f822-kube-api-access-jrpcc\") pod \"cert-manager-7d4cc89fcb-krw9f\" (UID: \"e6b26aea-d5dc-4599-baee-0d3046b6f822\") " pod="cert-manager/cert-manager-7d4cc89fcb-krw9f" Oct 02 11:10:02 crc kubenswrapper[4766]: I1002 11:10:02.332097 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e6b26aea-d5dc-4599-baee-0d3046b6f822-bound-sa-token\") pod \"cert-manager-7d4cc89fcb-krw9f\" (UID: \"e6b26aea-d5dc-4599-baee-0d3046b6f822\") " pod="cert-manager/cert-manager-7d4cc89fcb-krw9f" Oct 02 11:10:02 crc kubenswrapper[4766]: I1002 11:10:02.339007 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrpcc\" (UniqueName: \"kubernetes.io/projected/e6b26aea-d5dc-4599-baee-0d3046b6f822-kube-api-access-jrpcc\") pod \"cert-manager-7d4cc89fcb-krw9f\" (UID: \"e6b26aea-d5dc-4599-baee-0d3046b6f822\") " pod="cert-manager/cert-manager-7d4cc89fcb-krw9f" Oct 02 11:10:02 crc kubenswrapper[4766]: I1002 11:10:02.511975 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-7d4cc89fcb-krw9f" Oct 02 11:10:02 crc kubenswrapper[4766]: I1002 11:10:02.898990 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-7d4cc89fcb-krw9f"] Oct 02 11:10:02 crc kubenswrapper[4766]: W1002 11:10:02.906428 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6b26aea_d5dc_4599_baee_0d3046b6f822.slice/crio-cdd09d54aa4469fa00fb40d7ee3b1f55300203e96e2e1b0f753a7ce829294dda WatchSource:0}: Error finding container cdd09d54aa4469fa00fb40d7ee3b1f55300203e96e2e1b0f753a7ce829294dda: Status 404 returned error can't find the container with id cdd09d54aa4469fa00fb40d7ee3b1f55300203e96e2e1b0f753a7ce829294dda Oct 02 11:10:03 crc kubenswrapper[4766]: I1002 11:10:03.118966 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-krw9f" event={"ID":"e6b26aea-d5dc-4599-baee-0d3046b6f822","Type":"ContainerStarted","Data":"3e2c44656a50f87ff4cc5412d97ae237009ff1768c09962ab93afbdcb80212d4"} Oct 02 11:10:03 crc kubenswrapper[4766]: I1002 11:10:03.119372 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-7d4cc89fcb-krw9f" event={"ID":"e6b26aea-d5dc-4599-baee-0d3046b6f822","Type":"ContainerStarted","Data":"cdd09d54aa4469fa00fb40d7ee3b1f55300203e96e2e1b0f753a7ce829294dda"} Oct 02 11:10:03 crc kubenswrapper[4766]: I1002 11:10:03.148141 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-7d4cc89fcb-krw9f" podStartSLOduration=1.148110003 podStartE2EDuration="1.148110003s" podCreationTimestamp="2025-10-02 11:10:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:10:03.13798908 +0000 UTC m=+1118.080860094" watchObservedRunningTime="2025-10-02 11:10:03.148110003 +0000 UTC m=+1118.090980997" Oct 02 11:10:06 crc kubenswrapper[4766]: I1002 11:10:06.559493 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qtqxb"] Oct 02 11:10:06 crc kubenswrapper[4766]: I1002 11:10:06.560785 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qtqxb" Oct 02 11:10:06 crc kubenswrapper[4766]: I1002 11:10:06.568605 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-87n4b" Oct 02 11:10:06 crc kubenswrapper[4766]: I1002 11:10:06.568635 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 02 11:10:06 crc kubenswrapper[4766]: I1002 11:10:06.570940 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 02 11:10:06 crc kubenswrapper[4766]: I1002 11:10:06.578607 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qtqxb"] Oct 02 11:10:06 crc kubenswrapper[4766]: I1002 11:10:06.665975 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2shg\" (UniqueName: \"kubernetes.io/projected/e1bba5e2-7ffc-4790-9588-881f7c7dd1ea-kube-api-access-p2shg\") pod \"openstack-operator-index-qtqxb\" (UID: \"e1bba5e2-7ffc-4790-9588-881f7c7dd1ea\") " pod="openstack-operators/openstack-operator-index-qtqxb" Oct 02 11:10:06 crc kubenswrapper[4766]: I1002 11:10:06.767361 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2shg\" (UniqueName: \"kubernetes.io/projected/e1bba5e2-7ffc-4790-9588-881f7c7dd1ea-kube-api-access-p2shg\") pod \"openstack-operator-index-qtqxb\" (UID: \"e1bba5e2-7ffc-4790-9588-881f7c7dd1ea\") " pod="openstack-operators/openstack-operator-index-qtqxb" Oct 02 11:10:06 crc kubenswrapper[4766]: I1002 11:10:06.785584 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2shg\" (UniqueName: \"kubernetes.io/projected/e1bba5e2-7ffc-4790-9588-881f7c7dd1ea-kube-api-access-p2shg\") pod \"openstack-operator-index-qtqxb\" (UID: \"e1bba5e2-7ffc-4790-9588-881f7c7dd1ea\") " pod="openstack-operators/openstack-operator-index-qtqxb" Oct 02 11:10:06 crc kubenswrapper[4766]: I1002 11:10:06.880646 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qtqxb" Oct 02 11:10:07 crc kubenswrapper[4766]: I1002 11:10:07.083177 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qtqxb"] Oct 02 11:10:07 crc kubenswrapper[4766]: W1002 11:10:07.093410 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1bba5e2_7ffc_4790_9588_881f7c7dd1ea.slice/crio-aadb7aceae6cf6af953ece45db5b30b79295a9acab25c372a3d599c9a0410e4b WatchSource:0}: Error finding container aadb7aceae6cf6af953ece45db5b30b79295a9acab25c372a3d599c9a0410e4b: Status 404 returned error can't find the container with id aadb7aceae6cf6af953ece45db5b30b79295a9acab25c372a3d599c9a0410e4b Oct 02 11:10:07 crc kubenswrapper[4766]: I1002 11:10:07.139555 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qtqxb" event={"ID":"e1bba5e2-7ffc-4790-9588-881f7c7dd1ea","Type":"ContainerStarted","Data":"aadb7aceae6cf6af953ece45db5b30b79295a9acab25c372a3d599c9a0410e4b"} Oct 02 11:10:09 crc kubenswrapper[4766]: I1002 11:10:09.943881 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-qtqxb"] Oct 02 11:10:10 crc kubenswrapper[4766]: I1002 11:10:10.549419 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-llggj"] Oct 02 11:10:10 crc kubenswrapper[4766]: I1002 11:10:10.550130 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-llggj" Oct 02 11:10:10 crc kubenswrapper[4766]: I1002 11:10:10.564535 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-llggj"] Oct 02 11:10:10 crc kubenswrapper[4766]: I1002 11:10:10.620664 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fn57\" (UniqueName: \"kubernetes.io/projected/1727e62e-9173-45f1-b7dc-f4721872708a-kube-api-access-5fn57\") pod \"openstack-operator-index-llggj\" (UID: \"1727e62e-9173-45f1-b7dc-f4721872708a\") " pod="openstack-operators/openstack-operator-index-llggj" Oct 02 11:10:10 crc kubenswrapper[4766]: I1002 11:10:10.721792 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fn57\" (UniqueName: \"kubernetes.io/projected/1727e62e-9173-45f1-b7dc-f4721872708a-kube-api-access-5fn57\") pod \"openstack-operator-index-llggj\" (UID: \"1727e62e-9173-45f1-b7dc-f4721872708a\") " pod="openstack-operators/openstack-operator-index-llggj" Oct 02 11:10:10 crc kubenswrapper[4766]: I1002 11:10:10.742379 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fn57\" (UniqueName: \"kubernetes.io/projected/1727e62e-9173-45f1-b7dc-f4721872708a-kube-api-access-5fn57\") pod \"openstack-operator-index-llggj\" (UID: \"1727e62e-9173-45f1-b7dc-f4721872708a\") " pod="openstack-operators/openstack-operator-index-llggj" Oct 02 11:10:10 crc kubenswrapper[4766]: I1002 11:10:10.868691 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-llggj" Oct 02 11:10:13 crc kubenswrapper[4766]: I1002 11:10:13.614098 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-llggj"] Oct 02 11:10:14 crc kubenswrapper[4766]: I1002 11:10:14.189120 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qtqxb" event={"ID":"e1bba5e2-7ffc-4790-9588-881f7c7dd1ea","Type":"ContainerStarted","Data":"3b930c95a5e6679b422158fbaa0e7a54a98cade25b8187f0bda10e11d991edbd"} Oct 02 11:10:14 crc kubenswrapper[4766]: I1002 11:10:14.189158 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-qtqxb" podUID="e1bba5e2-7ffc-4790-9588-881f7c7dd1ea" containerName="registry-server" containerID="cri-o://3b930c95a5e6679b422158fbaa0e7a54a98cade25b8187f0bda10e11d991edbd" gracePeriod=2 Oct 02 11:10:14 crc kubenswrapper[4766]: I1002 11:10:14.191185 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-llggj" event={"ID":"1727e62e-9173-45f1-b7dc-f4721872708a","Type":"ContainerStarted","Data":"046879d95498b9de714d947e3bac30ee1dab4436bcf27ecef1d32f5c00040199"} Oct 02 11:10:14 crc kubenswrapper[4766]: I1002 11:10:14.191262 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-llggj" event={"ID":"1727e62e-9173-45f1-b7dc-f4721872708a","Type":"ContainerStarted","Data":"54300f157a6684a3dbf78d8720e316a2ede6720f78d93db620a06a5b90c3d925"} Oct 02 11:10:14 crc kubenswrapper[4766]: I1002 11:10:14.214272 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qtqxb" podStartSLOduration=1.5937591420000001 podStartE2EDuration="8.214236122s" podCreationTimestamp="2025-10-02 11:10:06 +0000 UTC" firstStartedPulling="2025-10-02 11:10:07.095864341 +0000 UTC m=+1122.038735285" lastFinishedPulling="2025-10-02 11:10:13.716341321 +0000 UTC m=+1128.659212265" observedRunningTime="2025-10-02 11:10:14.208148407 +0000 UTC m=+1129.151019361" watchObservedRunningTime="2025-10-02 11:10:14.214236122 +0000 UTC m=+1129.157107076" Oct 02 11:10:14 crc kubenswrapper[4766]: I1002 11:10:14.225575 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-llggj" podStartSLOduration=4.00893193 podStartE2EDuration="4.225547333s" podCreationTimestamp="2025-10-02 11:10:10 +0000 UTC" firstStartedPulling="2025-10-02 11:10:13.70474446 +0000 UTC m=+1128.647615404" lastFinishedPulling="2025-10-02 11:10:13.921359863 +0000 UTC m=+1128.864230807" observedRunningTime="2025-10-02 11:10:14.22293377 +0000 UTC m=+1129.165804724" watchObservedRunningTime="2025-10-02 11:10:14.225547333 +0000 UTC m=+1129.168418277" Oct 02 11:10:14 crc kubenswrapper[4766]: I1002 11:10:14.613847 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-qtqxb" Oct 02 11:10:14 crc kubenswrapper[4766]: I1002 11:10:14.685207 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2shg\" (UniqueName: \"kubernetes.io/projected/e1bba5e2-7ffc-4790-9588-881f7c7dd1ea-kube-api-access-p2shg\") pod \"e1bba5e2-7ffc-4790-9588-881f7c7dd1ea\" (UID: \"e1bba5e2-7ffc-4790-9588-881f7c7dd1ea\") " Oct 02 11:10:14 crc kubenswrapper[4766]: I1002 11:10:14.692415 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1bba5e2-7ffc-4790-9588-881f7c7dd1ea-kube-api-access-p2shg" (OuterVolumeSpecName: "kube-api-access-p2shg") pod "e1bba5e2-7ffc-4790-9588-881f7c7dd1ea" (UID: "e1bba5e2-7ffc-4790-9588-881f7c7dd1ea"). InnerVolumeSpecName "kube-api-access-p2shg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:14 crc kubenswrapper[4766]: I1002 11:10:14.789231 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2shg\" (UniqueName: \"kubernetes.io/projected/e1bba5e2-7ffc-4790-9588-881f7c7dd1ea-kube-api-access-p2shg\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:15 crc kubenswrapper[4766]: I1002 11:10:15.198292 4766 generic.go:334] "Generic (PLEG): container finished" podID="e1bba5e2-7ffc-4790-9588-881f7c7dd1ea" containerID="3b930c95a5e6679b422158fbaa0e7a54a98cade25b8187f0bda10e11d991edbd" exitCode=0 Oct 02 11:10:15 crc kubenswrapper[4766]: I1002 11:10:15.198357 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qtqxb" Oct 02 11:10:15 crc kubenswrapper[4766]: I1002 11:10:15.198381 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qtqxb" event={"ID":"e1bba5e2-7ffc-4790-9588-881f7c7dd1ea","Type":"ContainerDied","Data":"3b930c95a5e6679b422158fbaa0e7a54a98cade25b8187f0bda10e11d991edbd"} Oct 02 11:10:15 crc kubenswrapper[4766]: I1002 11:10:15.198963 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qtqxb" event={"ID":"e1bba5e2-7ffc-4790-9588-881f7c7dd1ea","Type":"ContainerDied","Data":"aadb7aceae6cf6af953ece45db5b30b79295a9acab25c372a3d599c9a0410e4b"} Oct 02 11:10:15 crc kubenswrapper[4766]: I1002 11:10:15.199007 4766 scope.go:117] "RemoveContainer" containerID="3b930c95a5e6679b422158fbaa0e7a54a98cade25b8187f0bda10e11d991edbd" Oct 02 11:10:15 crc kubenswrapper[4766]: I1002 11:10:15.215763 4766 scope.go:117] "RemoveContainer" containerID="3b930c95a5e6679b422158fbaa0e7a54a98cade25b8187f0bda10e11d991edbd" Oct 02 11:10:15 crc kubenswrapper[4766]: E1002 11:10:15.216254 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b930c95a5e6679b422158fbaa0e7a54a98cade25b8187f0bda10e11d991edbd\": container with ID starting with 3b930c95a5e6679b422158fbaa0e7a54a98cade25b8187f0bda10e11d991edbd not found: ID does not exist" containerID="3b930c95a5e6679b422158fbaa0e7a54a98cade25b8187f0bda10e11d991edbd" Oct 02 11:10:15 crc kubenswrapper[4766]: I1002 11:10:15.216300 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b930c95a5e6679b422158fbaa0e7a54a98cade25b8187f0bda10e11d991edbd"} err="failed to get container status \"3b930c95a5e6679b422158fbaa0e7a54a98cade25b8187f0bda10e11d991edbd\": rpc error: code = NotFound desc = could not find container 
\"3b930c95a5e6679b422158fbaa0e7a54a98cade25b8187f0bda10e11d991edbd\": container with ID starting with 3b930c95a5e6679b422158fbaa0e7a54a98cade25b8187f0bda10e11d991edbd not found: ID does not exist" Oct 02 11:10:15 crc kubenswrapper[4766]: I1002 11:10:15.226797 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-qtqxb"] Oct 02 11:10:15 crc kubenswrapper[4766]: I1002 11:10:15.229970 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-qtqxb"] Oct 02 11:10:15 crc kubenswrapper[4766]: I1002 11:10:15.889826 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1bba5e2-7ffc-4790-9588-881f7c7dd1ea" path="/var/lib/kubelet/pods/e1bba5e2-7ffc-4790-9588-881f7c7dd1ea/volumes" Oct 02 11:10:20 crc kubenswrapper[4766]: I1002 11:10:20.869607 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-llggj" Oct 02 11:10:20 crc kubenswrapper[4766]: I1002 11:10:20.869951 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-llggj" Oct 02 11:10:20 crc kubenswrapper[4766]: I1002 11:10:20.892353 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-llggj" Oct 02 11:10:21 crc kubenswrapper[4766]: I1002 11:10:21.250193 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-llggj" Oct 02 11:10:24 crc kubenswrapper[4766]: I1002 11:10:24.432341 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:10:24 crc kubenswrapper[4766]: I1002 11:10:24.432772 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:10:24 crc kubenswrapper[4766]: I1002 11:10:24.432824 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 11:10:24 crc kubenswrapper[4766]: I1002 11:10:24.433560 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e9d8027960aa5ff2fdb64c8c9c88c1508201265b3f2ec5d57d7c673e50cbb5eb"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:10:24 crc kubenswrapper[4766]: I1002 11:10:24.433620 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://e9d8027960aa5ff2fdb64c8c9c88c1508201265b3f2ec5d57d7c673e50cbb5eb" gracePeriod=600 Oct 02 11:10:25 crc kubenswrapper[4766]: I1002 11:10:25.256211 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" 
containerID="e9d8027960aa5ff2fdb64c8c9c88c1508201265b3f2ec5d57d7c673e50cbb5eb" exitCode=0 Oct 02 11:10:25 crc kubenswrapper[4766]: I1002 11:10:25.256285 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"e9d8027960aa5ff2fdb64c8c9c88c1508201265b3f2ec5d57d7c673e50cbb5eb"} Oct 02 11:10:25 crc kubenswrapper[4766]: I1002 11:10:25.256903 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"586f742ea27e273779868792840bda390cd263c60dd6b64b6d933d49d83569e4"} Oct 02 11:10:25 crc kubenswrapper[4766]: I1002 11:10:25.256937 4766 scope.go:117] "RemoveContainer" containerID="c19749f939a14cc5cbc026d638c61fa14b50810b4586b8fd36f7ac6b16f32c80" Oct 02 11:10:26 crc kubenswrapper[4766]: I1002 11:10:26.809690 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb"] Oct 02 11:10:26 crc kubenswrapper[4766]: E1002 11:10:26.810244 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1bba5e2-7ffc-4790-9588-881f7c7dd1ea" containerName="registry-server" Oct 02 11:10:26 crc kubenswrapper[4766]: I1002 11:10:26.810258 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1bba5e2-7ffc-4790-9588-881f7c7dd1ea" containerName="registry-server" Oct 02 11:10:26 crc kubenswrapper[4766]: I1002 11:10:26.810384 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1bba5e2-7ffc-4790-9588-881f7c7dd1ea" containerName="registry-server" Oct 02 11:10:26 crc kubenswrapper[4766]: I1002 11:10:26.811223 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" Oct 02 11:10:26 crc kubenswrapper[4766]: I1002 11:10:26.813043 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-dvhhq" Oct 02 11:10:26 crc kubenswrapper[4766]: I1002 11:10:26.823718 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb"] Oct 02 11:10:26 crc kubenswrapper[4766]: I1002 11:10:26.848971 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88981c56-312c-4225-b3c2-7fb698637653-bundle\") pod \"157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb\" (UID: \"88981c56-312c-4225-b3c2-7fb698637653\") " pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" Oct 02 11:10:26 crc kubenswrapper[4766]: I1002 11:10:26.849093 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88981c56-312c-4225-b3c2-7fb698637653-util\") pod \"157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb\" (UID: \"88981c56-312c-4225-b3c2-7fb698637653\") " pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" Oct 02 11:10:26 crc kubenswrapper[4766]: I1002 11:10:26.849143 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfpj7\" (UniqueName: \"kubernetes.io/projected/88981c56-312c-4225-b3c2-7fb698637653-kube-api-access-qfpj7\") pod \"157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb\" (UID: \"88981c56-312c-4225-b3c2-7fb698637653\") " pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" Oct 02 11:10:26 crc kubenswrapper[4766]: I1002 11:10:26.950192 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfpj7\" (UniqueName: \"kubernetes.io/projected/88981c56-312c-4225-b3c2-7fb698637653-kube-api-access-qfpj7\") pod \"157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb\" (UID: \"88981c56-312c-4225-b3c2-7fb698637653\") " pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" Oct 02 11:10:26 crc kubenswrapper[4766]: I1002 11:10:26.950308 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88981c56-312c-4225-b3c2-7fb698637653-bundle\") pod \"157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb\" (UID: \"88981c56-312c-4225-b3c2-7fb698637653\") " pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" Oct 02 11:10:26 crc kubenswrapper[4766]: I1002 11:10:26.950561 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88981c56-312c-4225-b3c2-7fb698637653-util\") pod \"157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb\" (UID: \"88981c56-312c-4225-b3c2-7fb698637653\") " pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" Oct 02 11:10:26 crc kubenswrapper[4766]: I1002 11:10:26.950785 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/88981c56-312c-4225-b3c2-7fb698637653-bundle\") pod \"157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb\" (UID: \"88981c56-312c-4225-b3c2-7fb698637653\") " pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" Oct 02 11:10:26 crc kubenswrapper[4766]: I1002 11:10:26.951101 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88981c56-312c-4225-b3c2-7fb698637653-util\") pod \"157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb\" (UID: \"88981c56-312c-4225-b3c2-7fb698637653\") " pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" Oct 02 11:10:26 crc kubenswrapper[4766]: I1002 11:10:26.972461 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfpj7\" (UniqueName: \"kubernetes.io/projected/88981c56-312c-4225-b3c2-7fb698637653-kube-api-access-qfpj7\") pod \"157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb\" (UID: \"88981c56-312c-4225-b3c2-7fb698637653\") " pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" Oct 02 11:10:27 crc kubenswrapper[4766]: I1002 11:10:27.128376 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" Oct 02 11:10:27 crc kubenswrapper[4766]: I1002 11:10:27.512518 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb"] Oct 02 11:10:28 crc kubenswrapper[4766]: I1002 11:10:28.285257 4766 generic.go:334] "Generic (PLEG): container finished" podID="88981c56-312c-4225-b3c2-7fb698637653" containerID="b2043d3a4c13bd302b40338f2b9e5dc11e973707061fd7139335f06f6feec939" exitCode=0 Oct 02 11:10:28 crc kubenswrapper[4766]: I1002 11:10:28.285380 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" event={"ID":"88981c56-312c-4225-b3c2-7fb698637653","Type":"ContainerDied","Data":"b2043d3a4c13bd302b40338f2b9e5dc11e973707061fd7139335f06f6feec939"} Oct 02 11:10:28 crc kubenswrapper[4766]: I1002 11:10:28.285657 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" event={"ID":"88981c56-312c-4225-b3c2-7fb698637653","Type":"ContainerStarted","Data":"a005e6779ab27a37ec539b200bb8efce5d15fac8481d9fbe02841319b12911e8"} Oct 02 11:10:28 crc kubenswrapper[4766]: I1002 11:10:28.286987 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:10:31 crc kubenswrapper[4766]: I1002 11:10:31.316463 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" event={"ID":"88981c56-312c-4225-b3c2-7fb698637653","Type":"ContainerStarted","Data":"55fc99aeeb0446e285e358597053b301fce86821e250f656931eb02a2fcd3134"} Oct 02 11:10:32 crc kubenswrapper[4766]: I1002 11:10:32.325354 4766 generic.go:334] "Generic (PLEG): container finished" podID="88981c56-312c-4225-b3c2-7fb698637653" containerID="55fc99aeeb0446e285e358597053b301fce86821e250f656931eb02a2fcd3134" exitCode=0 Oct 02 11:10:32 crc kubenswrapper[4766]: I1002 11:10:32.325472 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" event={"ID":"88981c56-312c-4225-b3c2-7fb698637653","Type":"ContainerDied","Data":"55fc99aeeb0446e285e358597053b301fce86821e250f656931eb02a2fcd3134"} Oct 02 11:10:33 crc kubenswrapper[4766]: I1002 11:10:33.335718 4766 generic.go:334] "Generic (PLEG): container finished" podID="88981c56-312c-4225-b3c2-7fb698637653" containerID="1372717e412fd161269c010c3c12cfff80f61b39f0cdb5865e57596104a52508" exitCode=0 Oct 02 11:10:33 crc kubenswrapper[4766]: I1002 11:10:33.335789 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" event={"ID":"88981c56-312c-4225-b3c2-7fb698637653","Type":"ContainerDied","Data":"1372717e412fd161269c010c3c12cfff80f61b39f0cdb5865e57596104a52508"} Oct 02 11:10:34 crc kubenswrapper[4766]: I1002 11:10:34.606238 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" Oct 02 11:10:34 crc kubenswrapper[4766]: I1002 11:10:34.654924 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88981c56-312c-4225-b3c2-7fb698637653-util\") pod \"88981c56-312c-4225-b3c2-7fb698637653\" (UID: \"88981c56-312c-4225-b3c2-7fb698637653\") " Oct 02 11:10:34 crc kubenswrapper[4766]: I1002 11:10:34.655238 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88981c56-312c-4225-b3c2-7fb698637653-bundle\") pod \"88981c56-312c-4225-b3c2-7fb698637653\" (UID: \"88981c56-312c-4225-b3c2-7fb698637653\") " Oct 02 11:10:34 crc kubenswrapper[4766]: I1002 11:10:34.655320 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfpj7\" (UniqueName: \"kubernetes.io/projected/88981c56-312c-4225-b3c2-7fb698637653-kube-api-access-qfpj7\") pod \"88981c56-312c-4225-b3c2-7fb698637653\" (UID: \"88981c56-312c-4225-b3c2-7fb698637653\") " Oct 02 11:10:34 crc kubenswrapper[4766]: I1002 11:10:34.656817 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88981c56-312c-4225-b3c2-7fb698637653-bundle" (OuterVolumeSpecName: "bundle") pod "88981c56-312c-4225-b3c2-7fb698637653" (UID: "88981c56-312c-4225-b3c2-7fb698637653"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:34 crc kubenswrapper[4766]: I1002 11:10:34.661270 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88981c56-312c-4225-b3c2-7fb698637653-kube-api-access-qfpj7" (OuterVolumeSpecName: "kube-api-access-qfpj7") pod "88981c56-312c-4225-b3c2-7fb698637653" (UID: "88981c56-312c-4225-b3c2-7fb698637653"). InnerVolumeSpecName "kube-api-access-qfpj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:10:34 crc kubenswrapper[4766]: I1002 11:10:34.666626 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88981c56-312c-4225-b3c2-7fb698637653-util" (OuterVolumeSpecName: "util") pod "88981c56-312c-4225-b3c2-7fb698637653" (UID: "88981c56-312c-4225-b3c2-7fb698637653"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:10:34 crc kubenswrapper[4766]: I1002 11:10:34.756629 4766 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88981c56-312c-4225-b3c2-7fb698637653-util\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:34 crc kubenswrapper[4766]: I1002 11:10:34.756662 4766 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88981c56-312c-4225-b3c2-7fb698637653-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:34 crc kubenswrapper[4766]: I1002 11:10:34.756671 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfpj7\" (UniqueName: \"kubernetes.io/projected/88981c56-312c-4225-b3c2-7fb698637653-kube-api-access-qfpj7\") on node \"crc\" DevicePath \"\"" Oct 02 11:10:35 crc kubenswrapper[4766]: I1002 11:10:35.357199 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" event={"ID":"88981c56-312c-4225-b3c2-7fb698637653","Type":"ContainerDied","Data":"a005e6779ab27a37ec539b200bb8efce5d15fac8481d9fbe02841319b12911e8"} Oct 02 11:10:35 crc kubenswrapper[4766]: I1002 11:10:35.357262 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a005e6779ab27a37ec539b200bb8efce5d15fac8481d9fbe02841319b12911e8" Oct 02 11:10:35 crc kubenswrapper[4766]: I1002 11:10:35.357324 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb" Oct 02 11:10:39 crc kubenswrapper[4766]: I1002 11:10:39.546119 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-cc764bd77-pvh4g"] Oct 02 11:10:39 crc kubenswrapper[4766]: E1002 11:10:39.546919 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88981c56-312c-4225-b3c2-7fb698637653" containerName="util" Oct 02 11:10:39 crc kubenswrapper[4766]: I1002 11:10:39.546933 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="88981c56-312c-4225-b3c2-7fb698637653" containerName="util" Oct 02 11:10:39 crc kubenswrapper[4766]: E1002 11:10:39.546955 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88981c56-312c-4225-b3c2-7fb698637653" containerName="pull" Oct 02 11:10:39 crc kubenswrapper[4766]: I1002 11:10:39.546961 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="88981c56-312c-4225-b3c2-7fb698637653" containerName="pull" Oct 02 11:10:39 crc kubenswrapper[4766]: E1002 11:10:39.546972 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88981c56-312c-4225-b3c2-7fb698637653" containerName="extract" Oct 02 11:10:39 crc kubenswrapper[4766]: I1002 11:10:39.546978 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="88981c56-312c-4225-b3c2-7fb698637653" containerName="extract" Oct 02 11:10:39 crc kubenswrapper[4766]: I1002 11:10:39.547079 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="88981c56-312c-4225-b3c2-7fb698637653" containerName="extract" Oct 02 11:10:39 crc kubenswrapper[4766]: I1002 11:10:39.547622 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-cc764bd77-pvh4g" Oct 02 11:10:39 crc kubenswrapper[4766]: I1002 11:10:39.550169 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-cfvmt" Oct 02 11:10:39 crc kubenswrapper[4766]: I1002 11:10:39.577776 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-cc764bd77-pvh4g"] Oct 02 11:10:39 crc kubenswrapper[4766]: I1002 11:10:39.733293 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpl4r\" (UniqueName: \"kubernetes.io/projected/d0a78ee4-d3b9-48b0-941a-cf1c73d8c3b1-kube-api-access-qpl4r\") pod \"openstack-operator-controller-operator-cc764bd77-pvh4g\" (UID: \"d0a78ee4-d3b9-48b0-941a-cf1c73d8c3b1\") " pod="openstack-operators/openstack-operator-controller-operator-cc764bd77-pvh4g" Oct 02 11:10:39 crc kubenswrapper[4766]: I1002 11:10:39.834956 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpl4r\" (UniqueName: \"kubernetes.io/projected/d0a78ee4-d3b9-48b0-941a-cf1c73d8c3b1-kube-api-access-qpl4r\") pod \"openstack-operator-controller-operator-cc764bd77-pvh4g\" (UID: \"d0a78ee4-d3b9-48b0-941a-cf1c73d8c3b1\") " pod="openstack-operators/openstack-operator-controller-operator-cc764bd77-pvh4g" Oct 02 11:10:39 crc kubenswrapper[4766]: I1002 11:10:39.864901 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpl4r\" (UniqueName: \"kubernetes.io/projected/d0a78ee4-d3b9-48b0-941a-cf1c73d8c3b1-kube-api-access-qpl4r\") pod \"openstack-operator-controller-operator-cc764bd77-pvh4g\" (UID: \"d0a78ee4-d3b9-48b0-941a-cf1c73d8c3b1\") " pod="openstack-operators/openstack-operator-controller-operator-cc764bd77-pvh4g" Oct 02 11:10:39 crc kubenswrapper[4766]: I1002 11:10:39.867674 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-cc764bd77-pvh4g" Oct 02 11:10:40 crc kubenswrapper[4766]: I1002 11:10:40.389896 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-cc764bd77-pvh4g"] Oct 02 11:10:40 crc kubenswrapper[4766]: I1002 11:10:40.412025 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-cc764bd77-pvh4g" event={"ID":"d0a78ee4-d3b9-48b0-941a-cf1c73d8c3b1","Type":"ContainerStarted","Data":"31932f38876e6b7a17a28693ff7cba34a19ccdebb9af5efee2abc4cd60a50318"} Oct 02 11:10:44 crc kubenswrapper[4766]: I1002 11:10:44.458051 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-cc764bd77-pvh4g" event={"ID":"d0a78ee4-d3b9-48b0-941a-cf1c73d8c3b1","Type":"ContainerStarted","Data":"290f65749143515d08a389a1cadb601fb0f0e99ac6200e82b7572cc425210d63"} Oct 02 11:10:46 crc kubenswrapper[4766]: I1002 11:10:46.471386 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-cc764bd77-pvh4g" event={"ID":"d0a78ee4-d3b9-48b0-941a-cf1c73d8c3b1","Type":"ContainerStarted","Data":"9982a4d125795bca53c22896b5053d5c336c392131c11d2a2a9d8c971c1c69af"} Oct 02 11:10:46 crc kubenswrapper[4766]: I1002 11:10:46.471749 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-cc764bd77-pvh4g" Oct 02 11:10:46 crc kubenswrapper[4766]: I1002 11:10:46.502905 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-cc764bd77-pvh4g" podStartSLOduration=1.804704431 podStartE2EDuration="7.502887967s" podCreationTimestamp="2025-10-02 11:10:39 +0000 UTC" firstStartedPulling="2025-10-02 11:10:40.403653294 +0000 UTC m=+1155.346524238" lastFinishedPulling="2025-10-02 11:10:46.10183683 +0000 UTC m=+1161.044707774" observedRunningTime="2025-10-02 11:10:46.50175437 +0000 UTC m=+1161.444625334" watchObservedRunningTime="2025-10-02 11:10:46.502887967 +0000 UTC m=+1161.445758911" Oct 02 11:10:49 crc kubenswrapper[4766]: I1002 11:10:49.870491 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-cc764bd77-pvh4g" Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.777122 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-n84bq"] Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.784558 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n84bq" Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.798273 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-dpbr8" Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.814933 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-7wbpc"] Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.816891 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7wbpc" Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.820479 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-n84bq"] Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.823531 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5272s" Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.848608 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-7wbpc"] Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.852608 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-8jlf2"] Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.855248 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8jlf2" Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.866242 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-td2cr" Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.887243 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bdfh\" (UniqueName: \"kubernetes.io/projected/7ef42077-e956-405d-8e5e-ee28586502dd-kube-api-access-2bdfh\") pod \"barbican-operator-controller-manager-6ff8b75857-n84bq\" (UID: \"7ef42077-e956-405d-8e5e-ee28586502dd\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n84bq" Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.904185 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-4jnlb"] Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.912901 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-8jlf2"] Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.913089 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-4jnlb" Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.925396 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-92jqr" Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.948283 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-4jnlb"] Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.963107 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-jc4jh"] Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.964222 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jc4jh" Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.971448 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-p6l2g" Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.989141 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqrs9\" (UniqueName: \"kubernetes.io/projected/eaa7722d-af7b-44aa-992b-9304ab1a56c3-kube-api-access-hqrs9\") pod \"cinder-operator-controller-manager-644bddb6d8-7wbpc\" (UID: \"eaa7722d-af7b-44aa-992b-9304ab1a56c3\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7wbpc" Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.989250 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bdfh\" (UniqueName: \"kubernetes.io/projected/7ef42077-e956-405d-8e5e-ee28586502dd-kube-api-access-2bdfh\") pod \"barbican-operator-controller-manager-6ff8b75857-n84bq\" (UID: \"7ef42077-e956-405d-8e5e-ee28586502dd\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n84bq" Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.989289 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9xpr\" (UniqueName: \"kubernetes.io/projected/712078f7-0205-4259-843b-10ca0a292fcb-kube-api-access-m9xpr\") pod \"glance-operator-controller-manager-84958c4d49-4jnlb\" (UID: \"712078f7-0205-4259-843b-10ca0a292fcb\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-4jnlb" Oct 02 11:11:05 crc kubenswrapper[4766]: I1002 11:11:05.989366 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsnv9\" (UniqueName: \"kubernetes.io/projected/fd0148cc-8cbc-4204-9c03-b6d446ec4b13-kube-api-access-bsnv9\") pod \"designate-operator-controller-manager-84f4f7b77b-8jlf2\" (UID: \"fd0148cc-8cbc-4204-9c03-b6d446ec4b13\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8jlf2" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.009068 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-xjhnl"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.010153 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-xjhnl" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.019982 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-jc4jh"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.041796 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-xjhnl"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.045902 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-86tqw" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.082582 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.083958 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.082678 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bdfh\" (UniqueName: \"kubernetes.io/projected/7ef42077-e956-405d-8e5e-ee28586502dd-kube-api-access-2bdfh\") pod \"barbican-operator-controller-manager-6ff8b75857-n84bq\" (UID: \"7ef42077-e956-405d-8e5e-ee28586502dd\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n84bq" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.090748 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsnv9\" (UniqueName: \"kubernetes.io/projected/fd0148cc-8cbc-4204-9c03-b6d446ec4b13-kube-api-access-bsnv9\") pod \"designate-operator-controller-manager-84f4f7b77b-8jlf2\" (UID: \"fd0148cc-8cbc-4204-9c03-b6d446ec4b13\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8jlf2" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.090845 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffmwz\" (UniqueName: \"kubernetes.io/projected/53ef1cad-2b60-4a0d-896c-958c59652c91-kube-api-access-ffmwz\") pod \"horizon-operator-controller-manager-9f4696d94-xjhnl\" (UID: \"53ef1cad-2b60-4a0d-896c-958c59652c91\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-xjhnl" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.090934 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqrs9\" (UniqueName: \"kubernetes.io/projected/eaa7722d-af7b-44aa-992b-9304ab1a56c3-kube-api-access-hqrs9\") pod \"cinder-operator-controller-manager-644bddb6d8-7wbpc\" (UID: \"eaa7722d-af7b-44aa-992b-9304ab1a56c3\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7wbpc" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.090988 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9xpr\" (UniqueName: \"kubernetes.io/projected/712078f7-0205-4259-843b-10ca0a292fcb-kube-api-access-m9xpr\") pod \"glance-operator-controller-manager-84958c4d49-4jnlb\" (UID: \"712078f7-0205-4259-843b-10ca0a292fcb\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-4jnlb" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.091017 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvs2p\" (UniqueName: \"kubernetes.io/projected/9df7b61f-82c3-4c2f-af77-b152b69666d7-kube-api-access-cvs2p\") pod \"heat-operator-controller-manager-5d889d78cf-jc4jh\" (UID: \"9df7b61f-82c3-4c2f-af77-b152b69666d7\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jc4jh" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.093389 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-94bh8" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.093848 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.120245 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-2sg8s"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.121865 4766 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-2sg8s" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.135984 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.139808 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n84bq" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.140198 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-js7lm" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.144095 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9xpr\" (UniqueName: \"kubernetes.io/projected/712078f7-0205-4259-843b-10ca0a292fcb-kube-api-access-m9xpr\") pod \"glance-operator-controller-manager-84958c4d49-4jnlb\" (UID: \"712078f7-0205-4259-843b-10ca0a292fcb\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-4jnlb" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.161883 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsnv9\" (UniqueName: \"kubernetes.io/projected/fd0148cc-8cbc-4204-9c03-b6d446ec4b13-kube-api-access-bsnv9\") pod \"designate-operator-controller-manager-84f4f7b77b-8jlf2\" (UID: \"fd0148cc-8cbc-4204-9c03-b6d446ec4b13\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8jlf2" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.177548 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8jlf2" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.179244 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqrs9\" (UniqueName: \"kubernetes.io/projected/eaa7722d-af7b-44aa-992b-9304ab1a56c3-kube-api-access-hqrs9\") pod \"cinder-operator-controller-manager-644bddb6d8-7wbpc\" (UID: \"eaa7722d-af7b-44aa-992b-9304ab1a56c3\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7wbpc" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.196312 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15ef5082-eda7-4994-8631-8f896fd8a456-cert\") pod \"infra-operator-controller-manager-9d6c5db85-x4rnr\" (UID: \"15ef5082-eda7-4994-8631-8f896fd8a456\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.196370 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffmwz\" (UniqueName: \"kubernetes.io/projected/53ef1cad-2b60-4a0d-896c-958c59652c91-kube-api-access-ffmwz\") pod \"horizon-operator-controller-manager-9f4696d94-xjhnl\" (UID: \"53ef1cad-2b60-4a0d-896c-958c59652c91\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-xjhnl" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.196407 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wwsl\" (UniqueName: \"kubernetes.io/projected/1e727ede-3058-4edc-8631-a3c12bfa0b32-kube-api-access-4wwsl\") pod \"ironic-operator-controller-manager-5cd4858477-2sg8s\" (UID: \"1e727ede-3058-4edc-8631-a3c12bfa0b32\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-2sg8s" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.196463 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ftbt\" (UniqueName: \"kubernetes.io/projected/15ef5082-eda7-4994-8631-8f896fd8a456-kube-api-access-9ftbt\") pod \"infra-operator-controller-manager-9d6c5db85-x4rnr\" (UID: \"15ef5082-eda7-4994-8631-8f896fd8a456\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.196550 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvs2p\" (UniqueName: \"kubernetes.io/projected/9df7b61f-82c3-4c2f-af77-b152b69666d7-kube-api-access-cvs2p\") pod \"heat-operator-controller-manager-5d889d78cf-jc4jh\" (UID: \"9df7b61f-82c3-4c2f-af77-b152b69666d7\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jc4jh" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.197087 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-2sg8s"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.241200 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-4jnlb" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.244769 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvs2p\" (UniqueName: \"kubernetes.io/projected/9df7b61f-82c3-4c2f-af77-b152b69666d7-kube-api-access-cvs2p\") pod \"heat-operator-controller-manager-5d889d78cf-jc4jh\" (UID: \"9df7b61f-82c3-4c2f-af77-b152b69666d7\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jc4jh" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.247583 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffmwz\" (UniqueName: \"kubernetes.io/projected/53ef1cad-2b60-4a0d-896c-958c59652c91-kube-api-access-ffmwz\") pod \"horizon-operator-controller-manager-9f4696d94-xjhnl\" (UID: \"53ef1cad-2b60-4a0d-896c-958c59652c91\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-xjhnl" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.270148 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-85bcm"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.271774 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-85bcm" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.281793 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-hsxb5" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.292532 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jc4jh" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.299903 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15ef5082-eda7-4994-8631-8f896fd8a456-cert\") pod \"infra-operator-controller-manager-9d6c5db85-x4rnr\" (UID: \"15ef5082-eda7-4994-8631-8f896fd8a456\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.299971 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wwsl\" (UniqueName: \"kubernetes.io/projected/1e727ede-3058-4edc-8631-a3c12bfa0b32-kube-api-access-4wwsl\") pod \"ironic-operator-controller-manager-5cd4858477-2sg8s\" (UID: \"1e727ede-3058-4edc-8631-a3c12bfa0b32\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-2sg8s" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.300055 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ftbt\" (UniqueName: \"kubernetes.io/projected/15ef5082-eda7-4994-8631-8f896fd8a456-kube-api-access-9ftbt\") pod \"infra-operator-controller-manager-9d6c5db85-x4rnr\" (UID: \"15ef5082-eda7-4994-8631-8f896fd8a456\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" Oct 02 11:11:06 crc kubenswrapper[4766]: E1002 11:11:06.300612 4766 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 02 11:11:06 crc kubenswrapper[4766]: E1002 11:11:06.300669 4766 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/15ef5082-eda7-4994-8631-8f896fd8a456-cert podName:15ef5082-eda7-4994-8631-8f896fd8a456 nodeName:}" failed. No retries permitted until 2025-10-02 11:11:06.800648462 +0000 UTC m=+1181.743519406 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/15ef5082-eda7-4994-8631-8f896fd8a456-cert") pod "infra-operator-controller-manager-9d6c5db85-x4rnr" (UID: "15ef5082-eda7-4994-8631-8f896fd8a456") : secret "infra-operator-webhook-server-cert" not found Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.310201 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-g4js2"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.311707 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-85bcm"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.311860 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-g4js2" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.318134 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-lt4cm" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.324409 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-g4js2"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.342824 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-xjhnl" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.344904 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.345767 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ftbt\" (UniqueName: \"kubernetes.io/projected/15ef5082-eda7-4994-8631-8f896fd8a456-kube-api-access-9ftbt\") pod \"infra-operator-controller-manager-9d6c5db85-x4rnr\" (UID: \"15ef5082-eda7-4994-8631-8f896fd8a456\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.346263 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.349254 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wwsl\" (UniqueName: \"kubernetes.io/projected/1e727ede-3058-4edc-8631-a3c12bfa0b32-kube-api-access-4wwsl\") pod \"ironic-operator-controller-manager-5cd4858477-2sg8s\" (UID: \"1e727ede-3058-4edc-8631-a3c12bfa0b32\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-2sg8s" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.368996 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-b28pm"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.384425 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-b28pm" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.387169 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-lwrmn" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.387375 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-g9wkv" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.405005 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd8zg\" (UniqueName: \"kubernetes.io/projected/25d1f804-fe78-4cc5-85a4-584ba18bf566-kube-api-access-nd8zg\") pod \"keystone-operator-controller-manager-5bd55b4bff-85bcm\" (UID: \"25d1f804-fe78-4cc5-85a4-584ba18bf566\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-85bcm" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.405101 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgg7g\" (UniqueName: \"kubernetes.io/projected/0b227a91-0adf-4131-bb61-e11c995527ca-kube-api-access-lgg7g\") pod \"manila-operator-controller-manager-6d68dbc695-g4js2\" (UID: \"0b227a91-0adf-4131-bb61-e11c995527ca\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-g4js2" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.420816 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.450415 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7wbpc" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.515085 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-d546t"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.538067 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgg7g\" (UniqueName: \"kubernetes.io/projected/0b227a91-0adf-4131-bb61-e11c995527ca-kube-api-access-lgg7g\") pod \"manila-operator-controller-manager-6d68dbc695-g4js2\" (UID: \"0b227a91-0adf-4131-bb61-e11c995527ca\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-g4js2" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.538256 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjjdq\" (UniqueName: \"kubernetes.io/projected/2dbff594-01b2-495a-af08-2c23c0d986de-kube-api-access-pjjdq\") pod \"neutron-operator-controller-manager-849d5b9b84-b28pm\" (UID: \"2dbff594-01b2-495a-af08-2c23c0d986de\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-b28pm" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.538730 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwprt\" (UniqueName: \"kubernetes.io/projected/2ba5311d-1e3c-4bf2-890e-836a7dda4335-kube-api-access-vwprt\") pod \"mariadb-operator-controller-manager-88c7-wtdmk\" (UID: \"2ba5311d-1e3c-4bf2-890e-836a7dda4335\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.539194 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd8zg\" (UniqueName: \"kubernetes.io/projected/25d1f804-fe78-4cc5-85a4-584ba18bf566-kube-api-access-nd8zg\") pod \"keystone-operator-controller-manager-5bd55b4bff-85bcm\" (UID: \"25d1f804-fe78-4cc5-85a4-584ba18bf566\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-85bcm" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.556211 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-d546t" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.570163 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-lpzdx" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.589550 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-2sg8s" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.632909 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgg7g\" (UniqueName: \"kubernetes.io/projected/0b227a91-0adf-4131-bb61-e11c995527ca-kube-api-access-lgg7g\") pod \"manila-operator-controller-manager-6d68dbc695-g4js2\" (UID: \"0b227a91-0adf-4131-bb61-e11c995527ca\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-g4js2" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.635933 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-b28pm"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.636653 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd8zg\" (UniqueName: \"kubernetes.io/projected/25d1f804-fe78-4cc5-85a4-584ba18bf566-kube-api-access-nd8zg\") pod \"keystone-operator-controller-manager-5bd55b4bff-85bcm\" (UID: \"25d1f804-fe78-4cc5-85a4-584ba18bf566\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-85bcm" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.675133 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-khvnh"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.677090 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-khvnh" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.685477 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjjdq\" (UniqueName: \"kubernetes.io/projected/2dbff594-01b2-495a-af08-2c23c0d986de-kube-api-access-pjjdq\") pod \"neutron-operator-controller-manager-849d5b9b84-b28pm\" (UID: \"2dbff594-01b2-495a-af08-2c23c0d986de\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-b28pm" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.686667 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27gn8\" (UniqueName: \"kubernetes.io/projected/133e16f7-a3d0-4920-827f-8da5e5d81d98-kube-api-access-27gn8\") pod \"nova-operator-controller-manager-64cd67b5cb-d546t\" (UID: \"133e16f7-a3d0-4920-827f-8da5e5d81d98\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-d546t" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.686793 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwprt\" (UniqueName: \"kubernetes.io/projected/2ba5311d-1e3c-4bf2-890e-836a7dda4335-kube-api-access-vwprt\") pod \"mariadb-operator-controller-manager-88c7-wtdmk\" (UID: \"2ba5311d-1e3c-4bf2-890e-836a7dda4335\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.687161 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-r9l74" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.687748 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-d546t"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.692854 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-khvnh"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.719735 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-g4js2" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.733983 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.735631 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.747549 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.756892 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwprt\" (UniqueName: \"kubernetes.io/projected/2ba5311d-1e3c-4bf2-890e-836a7dda4335-kube-api-access-vwprt\") pod \"mariadb-operator-controller-manager-88c7-wtdmk\" (UID: \"2ba5311d-1e3c-4bf2-890e-836a7dda4335\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.763861 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjjdq\" (UniqueName: \"kubernetes.io/projected/2dbff594-01b2-495a-af08-2c23c0d986de-kube-api-access-pjjdq\") pod \"neutron-operator-controller-manager-849d5b9b84-b28pm\" (UID: \"2dbff594-01b2-495a-af08-2c23c0d986de\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-b28pm" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.764631 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.765373 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-r64dt" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.769106 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.771326 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.789663 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.797442 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2xqz\" (UniqueName: \"kubernetes.io/projected/7c68c9c4-8848-48df-a28a-830a547f469a-kube-api-access-g2xqz\") pod \"octavia-operator-controller-manager-7b787867f4-khvnh\" (UID: \"7c68c9c4-8848-48df-a28a-830a547f469a\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-khvnh" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.797528 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27gn8\" (UniqueName: \"kubernetes.io/projected/133e16f7-a3d0-4920-827f-8da5e5d81d98-kube-api-access-27gn8\") pod \"nova-operator-controller-manager-64cd67b5cb-d546t\" (UID: \"133e16f7-a3d0-4920-827f-8da5e5d81d98\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-d546t" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.801596 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.801886 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-mq6rn" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.828315 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-b28pm" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.829630 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27gn8\" (UniqueName: \"kubernetes.io/projected/133e16f7-a3d0-4920-827f-8da5e5d81d98-kube-api-access-27gn8\") pod \"nova-operator-controller-manager-64cd67b5cb-d546t\" (UID: \"133e16f7-a3d0-4920-827f-8da5e5d81d98\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-d546t" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.905988 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btx2h\" (UniqueName: \"kubernetes.io/projected/2615c22d-ad24-47ec-bfb1-f0227eb91300-kube-api-access-btx2h\") pod \"ovn-operator-controller-manager-9976ff44c-km85r\" (UID: \"2615c22d-ad24-47ec-bfb1-f0227eb91300\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.906072 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15ef5082-eda7-4994-8631-8f896fd8a456-cert\") pod \"infra-operator-controller-manager-9d6c5db85-x4rnr\" (UID: \"15ef5082-eda7-4994-8631-8f896fd8a456\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.906295 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-d546t" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.906868 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2986de4-8b91-42e2-b0a4-4032b1ce7ae5-cert\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-gbnjj\" (UID: \"c2986de4-8b91-42e2-b0a4-4032b1ce7ae5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.906903 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnr8n\" (UniqueName: \"kubernetes.io/projected/c2986de4-8b91-42e2-b0a4-4032b1ce7ae5-kube-api-access-wnr8n\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-gbnjj\" (UID: \"c2986de4-8b91-42e2-b0a4-4032b1ce7ae5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.906949 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2xqz\" (UniqueName: \"kubernetes.io/projected/7c68c9c4-8848-48df-a28a-830a547f469a-kube-api-access-g2xqz\") pod \"octavia-operator-controller-manager-7b787867f4-khvnh\" (UID: \"7c68c9c4-8848-48df-a28a-830a547f469a\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-khvnh" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.910986 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-85bcm" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.919312 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/15ef5082-eda7-4994-8631-8f896fd8a456-cert\") pod \"infra-operator-controller-manager-9d6c5db85-x4rnr\" (UID: \"15ef5082-eda7-4994-8631-8f896fd8a456\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.924839 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-62s67"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.927418 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62s67" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.933377 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-kkj6z" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.937178 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-k5ms6"] Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.937713 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2xqz\" (UniqueName: \"kubernetes.io/projected/7c68c9c4-8848-48df-a28a-830a547f469a-kube-api-access-g2xqz\") pod \"octavia-operator-controller-manager-7b787867f4-khvnh\" (UID: \"7c68c9c4-8848-48df-a28a-830a547f469a\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-khvnh" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.938462 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-k5ms6" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.942586 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-nq8c9" Oct 02 11:11:06 crc kubenswrapper[4766]: I1002 11:11:06.972200 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-62s67"] Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.016302 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2986de4-8b91-42e2-b0a4-4032b1ce7ae5-cert\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-gbnjj\" (UID: \"c2986de4-8b91-42e2-b0a4-4032b1ce7ae5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.016379 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnr8n\" (UniqueName: \"kubernetes.io/projected/c2986de4-8b91-42e2-b0a4-4032b1ce7ae5-kube-api-access-wnr8n\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-gbnjj\" (UID: \"c2986de4-8b91-42e2-b0a4-4032b1ce7ae5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.016552 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm8bj\" (UniqueName: \"kubernetes.io/projected/1ffa0b32-f1ea-4273-baf6-67b9217803b3-kube-api-access-vm8bj\") pod \"swift-operator-controller-manager-84d6b4b759-62s67\" (UID: \"1ffa0b32-f1ea-4273-baf6-67b9217803b3\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62s67" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.016617 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btx2h\" (UniqueName: \"kubernetes.io/projected/2615c22d-ad24-47ec-bfb1-f0227eb91300-kube-api-access-btx2h\") pod \"ovn-operator-controller-manager-9976ff44c-km85r\" (UID: \"2615c22d-ad24-47ec-bfb1-f0227eb91300\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.021967 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-khvnh" Oct 02 11:11:07 crc kubenswrapper[4766]: E1002 11:11:07.022853 4766 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 02 11:11:07 crc kubenswrapper[4766]: E1002 11:11:07.023571 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2986de4-8b91-42e2-b0a4-4032b1ce7ae5-cert podName:c2986de4-8b91-42e2-b0a4-4032b1ce7ae5 nodeName:}" failed. No retries permitted until 2025-10-02 11:11:07.522904413 +0000 UTC m=+1182.465775347 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2986de4-8b91-42e2-b0a4-4032b1ce7ae5-cert") pod "openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" (UID: "c2986de4-8b91-42e2-b0a4-4032b1ce7ae5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.033630 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-k5ms6"] Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.041016 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.049652 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btx2h\" (UniqueName: \"kubernetes.io/projected/2615c22d-ad24-47ec-bfb1-f0227eb91300-kube-api-access-btx2h\") pod \"ovn-operator-controller-manager-9976ff44c-km85r\" (UID: \"2615c22d-ad24-47ec-bfb1-f0227eb91300\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.067091 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnr8n\" (UniqueName: \"kubernetes.io/projected/c2986de4-8b91-42e2-b0a4-4032b1ce7ae5-kube-api-access-wnr8n\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-gbnjj\" (UID: \"c2986de4-8b91-42e2-b0a4-4032b1ce7ae5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.072628 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-kchvm"] Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.073764 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-kchvm" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.077913 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-hzsdb" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.083566 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-kchvm"] Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.105656 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-mn5ks"] Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.106712 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-mn5ks" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.108822 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-t4r98"] Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.109977 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-7zzc8" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.110123 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-t4r98" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.112738 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-gl89f" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.118556 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.118870 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm8bj\" (UniqueName: \"kubernetes.io/projected/1ffa0b32-f1ea-4273-baf6-67b9217803b3-kube-api-access-vm8bj\") pod \"swift-operator-controller-manager-84d6b4b759-62s67\" (UID: \"1ffa0b32-f1ea-4273-baf6-67b9217803b3\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62s67" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.118936 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxzrt\" (UniqueName: \"kubernetes.io/projected/43a5f26a-7b51-4514-afc3-15048f9acec9-kube-api-access-wxzrt\") pod \"placement-operator-controller-manager-589c58c6c-k5ms6\" (UID: \"43a5f26a-7b51-4514-afc3-15048f9acec9\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-k5ms6" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.124895 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-mn5ks"] Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.139754 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm8bj\" (UniqueName: \"kubernetes.io/projected/1ffa0b32-f1ea-4273-baf6-67b9217803b3-kube-api-access-vm8bj\") pod \"swift-operator-controller-manager-84d6b4b759-62s67\" (UID: \"1ffa0b32-f1ea-4273-baf6-67b9217803b3\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62s67" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.143163 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-t4r98"] Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.187209 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz"] Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.198334 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.201039 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-bwt78" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.201059 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.220677 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz"] Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.236380 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-8jlf2"] Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.240486 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xbqj\" (UniqueName: \"kubernetes.io/projected/02c6432b-aae3-4392-9d39-edbbf8b5e48a-kube-api-access-7xbqj\") pod \"test-operator-controller-manager-85777745bb-t4r98\" (UID: \"02c6432b-aae3-4392-9d39-edbbf8b5e48a\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-t4r98" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.240572 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d01c25f-6e83-4e83-8193-203e990ffd70-cert\") pod \"openstack-operator-controller-manager-5f7d749dc7-n4nqz\" (UID: \"7d01c25f-6e83-4e83-8193-203e990ffd70\") " pod="openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.240636 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb9b7\" (UniqueName: \"kubernetes.io/projected/cc498066-7f28-4345-8cd9-b3168f10fe32-kube-api-access-hb9b7\") pod \"telemetry-operator-controller-manager-b8d54b5d7-kchvm\" (UID: \"cc498066-7f28-4345-8cd9-b3168f10fe32\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-kchvm" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.240685 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7g4q\" (UniqueName: \"kubernetes.io/projected/a2658564-9624-44ee-b9ce-1579493d044f-kube-api-access-h7g4q\") pod \"watcher-operator-controller-manager-6b9957f54f-mn5ks\" (UID: \"a2658564-9624-44ee-b9ce-1579493d044f\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-mn5ks" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.240735 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxzrt\" (UniqueName: \"kubernetes.io/projected/43a5f26a-7b51-4514-afc3-15048f9acec9-kube-api-access-wxzrt\") pod \"placement-operator-controller-manager-589c58c6c-k5ms6\" (UID: \"43a5f26a-7b51-4514-afc3-15048f9acec9\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-k5ms6" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.240833 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m54h\" (UniqueName: \"kubernetes.io/projected/7d01c25f-6e83-4e83-8193-203e990ffd70-kube-api-access-7m54h\") pod 
\"openstack-operator-controller-manager-5f7d749dc7-n4nqz\" (UID: \"7d01c25f-6e83-4e83-8193-203e990ffd70\") " pod="openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.253781 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-n84bq"] Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.269896 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62s67" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.275751 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxzrt\" (UniqueName: \"kubernetes.io/projected/43a5f26a-7b51-4514-afc3-15048f9acec9-kube-api-access-wxzrt\") pod \"placement-operator-controller-manager-589c58c6c-k5ms6\" (UID: \"43a5f26a-7b51-4514-afc3-15048f9acec9\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-k5ms6" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.277691 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8"] Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.278807 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.282204 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-5dcf8" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.285948 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8"] Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.365256 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m54h\" (UniqueName: \"kubernetes.io/projected/7d01c25f-6e83-4e83-8193-203e990ffd70-kube-api-access-7m54h\") pod \"openstack-operator-controller-manager-5f7d749dc7-n4nqz\" (UID: \"7d01c25f-6e83-4e83-8193-203e990ffd70\") " pod="openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.365350 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xbqj\" (UniqueName: \"kubernetes.io/projected/02c6432b-aae3-4392-9d39-edbbf8b5e48a-kube-api-access-7xbqj\") pod \"test-operator-controller-manager-85777745bb-t4r98\" (UID: \"02c6432b-aae3-4392-9d39-edbbf8b5e48a\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-t4r98" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.365378 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d01c25f-6e83-4e83-8193-203e990ffd70-cert\") pod \"openstack-operator-controller-manager-5f7d749dc7-n4nqz\" (UID: \"7d01c25f-6e83-4e83-8193-203e990ffd70\") " pod="openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.365420 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb9b7\" (UniqueName: \"kubernetes.io/projected/cc498066-7f28-4345-8cd9-b3168f10fe32-kube-api-access-hb9b7\") pod 
\"telemetry-operator-controller-manager-b8d54b5d7-kchvm\" (UID: \"cc498066-7f28-4345-8cd9-b3168f10fe32\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-kchvm" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.365459 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7g4q\" (UniqueName: \"kubernetes.io/projected/a2658564-9624-44ee-b9ce-1579493d044f-kube-api-access-h7g4q\") pod \"watcher-operator-controller-manager-6b9957f54f-mn5ks\" (UID: \"a2658564-9624-44ee-b9ce-1579493d044f\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-mn5ks" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.368687 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-k5ms6" Oct 02 11:11:07 crc kubenswrapper[4766]: E1002 11:11:07.370208 4766 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 02 11:11:07 crc kubenswrapper[4766]: E1002 11:11:07.370271 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d01c25f-6e83-4e83-8193-203e990ffd70-cert podName:7d01c25f-6e83-4e83-8193-203e990ffd70 nodeName:}" failed. No retries permitted until 2025-10-02 11:11:07.870247163 +0000 UTC m=+1182.813118297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7d01c25f-6e83-4e83-8193-203e990ffd70-cert") pod "openstack-operator-controller-manager-5f7d749dc7-n4nqz" (UID: "7d01c25f-6e83-4e83-8193-203e990ffd70") : secret "webhook-server-cert" not found Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.397549 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m54h\" (UniqueName: \"kubernetes.io/projected/7d01c25f-6e83-4e83-8193-203e990ffd70-kube-api-access-7m54h\") pod \"openstack-operator-controller-manager-5f7d749dc7-n4nqz\" (UID: \"7d01c25f-6e83-4e83-8193-203e990ffd70\") " pod="openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.401329 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xbqj\" (UniqueName: \"kubernetes.io/projected/02c6432b-aae3-4392-9d39-edbbf8b5e48a-kube-api-access-7xbqj\") pod \"test-operator-controller-manager-85777745bb-t4r98\" (UID: \"02c6432b-aae3-4392-9d39-edbbf8b5e48a\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-t4r98" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.405175 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb9b7\" (UniqueName: \"kubernetes.io/projected/cc498066-7f28-4345-8cd9-b3168f10fe32-kube-api-access-hb9b7\") pod \"telemetry-operator-controller-manager-b8d54b5d7-kchvm\" (UID: \"cc498066-7f28-4345-8cd9-b3168f10fe32\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-kchvm" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.406770 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7g4q\" (UniqueName: \"kubernetes.io/projected/a2658564-9624-44ee-b9ce-1579493d044f-kube-api-access-h7g4q\") pod \"watcher-operator-controller-manager-6b9957f54f-mn5ks\" (UID: \"a2658564-9624-44ee-b9ce-1579493d044f\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-mn5ks" Oct 02 11:11:07 crc kubenswrapper[4766]: 
I1002 11:11:07.433563 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-kchvm" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.446980 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-mn5ks" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.460852 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-t4r98" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.479685 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfvrx\" (UniqueName: \"kubernetes.io/projected/9b6bf2a3-2784-4940-8a10-a42a0f876577-kube-api-access-vfvrx\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8\" (UID: \"9b6bf2a3-2784-4940-8a10-a42a0f876577\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.580237 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfvrx\" (UniqueName: \"kubernetes.io/projected/9b6bf2a3-2784-4940-8a10-a42a0f876577-kube-api-access-vfvrx\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8\" (UID: \"9b6bf2a3-2784-4940-8a10-a42a0f876577\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.580328 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2986de4-8b91-42e2-b0a4-4032b1ce7ae5-cert\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-gbnjj\" (UID: \"c2986de4-8b91-42e2-b0a4-4032b1ce7ae5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.598919 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2986de4-8b91-42e2-b0a4-4032b1ce7ae5-cert\") pod \"openstack-baremetal-operator-controller-manager-5869cb545-gbnjj\" (UID: \"c2986de4-8b91-42e2-b0a4-4032b1ce7ae5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.619100 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfvrx\" (UniqueName: \"kubernetes.io/projected/9b6bf2a3-2784-4940-8a10-a42a0f876577-kube-api-access-vfvrx\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8\" (UID: \"9b6bf2a3-2784-4940-8a10-a42a0f876577\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.674043 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n84bq" event={"ID":"7ef42077-e956-405d-8e5e-ee28586502dd","Type":"ContainerStarted","Data":"0a2dbf006cce0e5fffe95133a279147ecc784b6bdb15af205db779a1c8dadaaa"} Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.676127 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8jlf2" 
event={"ID":"fd0148cc-8cbc-4204-9c03-b6d446ec4b13","Type":"ContainerStarted","Data":"4749952761db419f330c331913bfdfc30f6cc2cd3e99da825f3c59954eaa57f6"} Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.754456 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-xjhnl"] Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.764396 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-4jnlb"] Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.787141 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" Oct 02 11:11:07 crc kubenswrapper[4766]: W1002 11:11:07.803086 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53ef1cad_2b60_4a0d_896c_958c59652c91.slice/crio-22354638313c5210b7d1439d6a61cef9dc96136bbf27d5aca117b653fa89f6d0 WatchSource:0}: Error finding container 22354638313c5210b7d1439d6a61cef9dc96136bbf27d5aca117b653fa89f6d0: Status 404 returned error can't find the container with id 22354638313c5210b7d1439d6a61cef9dc96136bbf27d5aca117b653fa89f6d0 Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.822317 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.886753 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d01c25f-6e83-4e83-8193-203e990ffd70-cert\") pod \"openstack-operator-controller-manager-5f7d749dc7-n4nqz\" (UID: \"7d01c25f-6e83-4e83-8193-203e990ffd70\") " pod="openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz" Oct 02 11:11:07 crc kubenswrapper[4766]: I1002 11:11:07.900815 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d01c25f-6e83-4e83-8193-203e990ffd70-cert\") pod \"openstack-operator-controller-manager-5f7d749dc7-n4nqz\" (UID: \"7d01c25f-6e83-4e83-8193-203e990ffd70\") " pod="openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz" Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.065311 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-jc4jh"] Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.083877 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-7wbpc"] Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.087761 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-2sg8s"] Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.093701 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz" Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.098117 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-g4js2"] Oct 02 11:11:08 crc kubenswrapper[4766]: W1002 11:11:08.105367 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e727ede_3058_4edc_8631_a3c12bfa0b32.slice/crio-4636dfdc9a94d2a62a303c04ed3cd5415b9f57775cc08c8c14f3d9a13bca9d61 WatchSource:0}: Error finding container 4636dfdc9a94d2a62a303c04ed3cd5415b9f57775cc08c8c14f3d9a13bca9d61: Status 404 returned error can't find the container with id 4636dfdc9a94d2a62a303c04ed3cd5415b9f57775cc08c8c14f3d9a13bca9d61 Oct 02 11:11:08 crc kubenswrapper[4766]: W1002 11:11:08.117578 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b227a91_0adf_4131_bb61_e11c995527ca.slice/crio-99ac0b593771704e3af6b56d9bea928779891e4d9c15bf8b60f979c657d0f43a WatchSource:0}: Error finding container 99ac0b593771704e3af6b56d9bea928779891e4d9c15bf8b60f979c657d0f43a: Status 404 returned error can't find the container with id 99ac0b593771704e3af6b56d9bea928779891e4d9c15bf8b60f979c657d0f43a Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.379101 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-62s67"] Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.390797 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-d546t"] Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.401976 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-b28pm"] Oct 02 11:11:08 crc kubenswrapper[4766]: W1002 11:11:08.425695 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod133e16f7_a3d0_4920_827f_8da5e5d81d98.slice/crio-f33118043caf1b6e135ae2be518317ea00bcdcaf7e07548fb07c636e4cc271fb WatchSource:0}: Error finding container f33118043caf1b6e135ae2be518317ea00bcdcaf7e07548fb07c636e4cc271fb: Status 404 returned error can't find the container with id f33118043caf1b6e135ae2be518317ea00bcdcaf7e07548fb07c636e4cc271fb Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.428314 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-mn5ks"] Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.452603 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-khvnh"] Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.466660 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr"] Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.482974 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-k5ms6"] Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.504988 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-kchvm"] Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 
11:11:08.529770 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-85bcm"] Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.538041 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r"] Oct 02 11:11:08 crc kubenswrapper[4766]: W1002 11:11:08.543569 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25d1f804_fe78_4cc5_85a4_584ba18bf566.slice/crio-0763bab2a3562d22d731a093032ad05a4b54cbe81891e459a155933fd3002fdd WatchSource:0}: Error finding container 0763bab2a3562d22d731a093032ad05a4b54cbe81891e459a155933fd3002fdd: Status 404 returned error can't find the container with id 0763bab2a3562d22d731a093032ad05a4b54cbe81891e459a155933fd3002fdd Oct 02 11:11:08 crc kubenswrapper[4766]: W1002 11:11:08.544065 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c68c9c4_8848_48df_a28a_830a547f469a.slice/crio-1d7328c464bf3093db82905024c3aa8612e8eeac032f3614cc2678a3f5422de1 WatchSource:0}: Error finding container 1d7328c464bf3093db82905024c3aa8612e8eeac032f3614cc2678a3f5422de1: Status 404 returned error can't find the container with id 1d7328c464bf3093db82905024c3aa8612e8eeac032f3614cc2678a3f5422de1 Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.544093 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk"] Oct 02 11:11:08 crc kubenswrapper[4766]: E1002 11:11:08.549321 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9ftbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-9d6c5db85-x4rnr_openstack-operators(15ef5082-eda7-4994-8631-8f896fd8a456): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:11:08 crc kubenswrapper[4766]: W1002 11:11:08.551005 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2615c22d_ad24_47ec_bfb1_f0227eb91300.slice/crio-10badb8aaf6e5e41b7f1716026e17be38507ae9b87058a8e7bd88a391408f026 WatchSource:0}: Error finding container 10badb8aaf6e5e41b7f1716026e17be38507ae9b87058a8e7bd88a391408f026: Status 404 returned error can't find the container with id 10badb8aaf6e5e41b7f1716026e17be38507ae9b87058a8e7bd88a391408f026 Oct 02 11:11:08 crc kubenswrapper[4766]: E1002 11:11:08.551121 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vwprt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-88c7-wtdmk_openstack-operators(2ba5311d-1e3c-4bf2-890e-836a7dda4335): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:11:08 crc kubenswrapper[4766]: E1002 11:11:08.554277 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-btx2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-9976ff44c-km85r_openstack-operators(2615c22d-ad24-47ec-bfb1-f0227eb91300): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.666791 4766 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-t4r98"] Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.696064 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj"] Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.698974 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-khvnh" event={"ID":"7c68c9c4-8848-48df-a28a-830a547f469a","Type":"ContainerStarted","Data":"1d7328c464bf3093db82905024c3aa8612e8eeac032f3614cc2678a3f5422de1"} Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.700000 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-kchvm" event={"ID":"cc498066-7f28-4345-8cd9-b3168f10fe32","Type":"ContainerStarted","Data":"55100684387df88c6d4b9723b49df45edbc3bc02f40a61a1a37c13f12dbc1e4d"} Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.700662 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk" event={"ID":"2ba5311d-1e3c-4bf2-890e-836a7dda4335","Type":"ContainerStarted","Data":"10f818aa957fd7195831618db9353ea004d6a9ee16120db41fb3829f83e1e67e"} Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.723272 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-85bcm" event={"ID":"25d1f804-fe78-4cc5-85a4-584ba18bf566","Type":"ContainerStarted","Data":"0763bab2a3562d22d731a093032ad05a4b54cbe81891e459a155933fd3002fdd"} Oct 02 11:11:08 crc kubenswrapper[4766]: E1002 11:11:08.727279 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7xbqj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-85777745bb-t4r98_openstack-operators(02c6432b-aae3-4392-9d39-edbbf8b5e48a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.742776 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" event={"ID":"15ef5082-eda7-4994-8631-8f896fd8a456","Type":"ContainerStarted","Data":"839e977f922bc8bc9f5bacf2bda437c6a088ae3c907316b7f437acc5c60747d0"} Oct 02 11:11:08 crc kubenswrapper[4766]: E1002 11:11:08.747267 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vfvrx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8_openstack-operators(9b6bf2a3-2784-4940-8a10-a42a0f876577): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:11:08 crc kubenswrapper[4766]: E1002 11:11:08.748784 4766 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8" podUID="9b6bf2a3-2784-4940-8a10-a42a0f876577" Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.748853 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-mn5ks" event={"ID":"a2658564-9624-44ee-b9ce-1579493d044f","Type":"ContainerStarted","Data":"571f962d16b38a1a254dfaeb362f5d4baa9bc553724e99fd0de61d4987dbb33c"} Oct 02 11:11:08 crc kubenswrapper[4766]: E1002 11:11:08.752221 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name
:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,
ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antel
ope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFA
ULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wnr8n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-5869cb545-gbnjj_openstack-operators(c2986de4-8b91-42e2-b0a4-4032b1ce7ae5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.754548 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-g4js2" event={"ID":"0b227a91-0adf-4131-bb61-e11c995527ca","Type":"ContainerStarted","Data":"99ac0b593771704e3af6b56d9bea928779891e4d9c15bf8b60f979c657d0f43a"} Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.757123 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8"] Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.763184 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-xjhnl" 
event={"ID":"53ef1cad-2b60-4a0d-896c-958c59652c91","Type":"ContainerStarted","Data":"22354638313c5210b7d1439d6a61cef9dc96136bbf27d5aca117b653fa89f6d0"} Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.766303 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-2sg8s" event={"ID":"1e727ede-3058-4edc-8631-a3c12bfa0b32","Type":"ContainerStarted","Data":"4636dfdc9a94d2a62a303c04ed3cd5415b9f57775cc08c8c14f3d9a13bca9d61"} Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.775438 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-d546t" event={"ID":"133e16f7-a3d0-4920-827f-8da5e5d81d98","Type":"ContainerStarted","Data":"f33118043caf1b6e135ae2be518317ea00bcdcaf7e07548fb07c636e4cc271fb"} Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.793728 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7wbpc" event={"ID":"eaa7722d-af7b-44aa-992b-9304ab1a56c3","Type":"ContainerStarted","Data":"8ce1824c3014da53480a75a6aacf4f3b27b97d70bd29f5ea9dbb021f3d837e15"} Oct 02 11:11:08 crc kubenswrapper[4766]: E1002 11:11:08.795463 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" podUID="15ef5082-eda7-4994-8631-8f896fd8a456" Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.796165 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62s67" event={"ID":"1ffa0b32-f1ea-4273-baf6-67b9217803b3","Type":"ContainerStarted","Data":"a9ef538eaf45cb574d7d92449ccb96f2f968850ef6ce5321b320b615f2431412"} Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.797576 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-k5ms6" event={"ID":"43a5f26a-7b51-4514-afc3-15048f9acec9","Type":"ContainerStarted","Data":"a64a48d56c71695d7f1745b2ecd960d6eb8011fff2bc7512d870690baade18fa"} Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.806488 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz"] Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.834260 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-b28pm" event={"ID":"2dbff594-01b2-495a-af08-2c23c0d986de","Type":"ContainerStarted","Data":"eed71a48b10ad2a51637df243e424812f4f73b51e7a3b5141e6c142c3f2a68ce"} Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.849985 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-4jnlb" event={"ID":"712078f7-0205-4259-843b-10ca0a292fcb","Type":"ContainerStarted","Data":"cd6cdfd8babc1362b5f2aa78087d76c0ba3e9e6a566be573ceb9689d72989668"} Oct 02 11:11:08 crc kubenswrapper[4766]: W1002 11:11:08.851283 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d01c25f_6e83_4e83_8193_203e990ffd70.slice/crio-e0f08998ba0653905d5196dce6790bc01d84f81beb87b718310bac740913094b WatchSource:0}: Error finding container e0f08998ba0653905d5196dce6790bc01d84f81beb87b718310bac740913094b: Status 404 
returned error can't find the container with id e0f08998ba0653905d5196dce6790bc01d84f81beb87b718310bac740913094b Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.851448 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r" event={"ID":"2615c22d-ad24-47ec-bfb1-f0227eb91300","Type":"ContainerStarted","Data":"10badb8aaf6e5e41b7f1716026e17be38507ae9b87058a8e7bd88a391408f026"} Oct 02 11:11:08 crc kubenswrapper[4766]: I1002 11:11:08.853581 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jc4jh" event={"ID":"9df7b61f-82c3-4c2f-af77-b152b69666d7","Type":"ContainerStarted","Data":"8af38317820841eb07a0a06ff1102d669f6289fb0fd6e88e5bccdbde59618f78"} Oct 02 11:11:08 crc kubenswrapper[4766]: E1002 11:11:08.891467 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r" podUID="2615c22d-ad24-47ec-bfb1-f0227eb91300" Oct 02 11:11:08 crc kubenswrapper[4766]: E1002 11:11:08.895636 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk" podUID="2ba5311d-1e3c-4bf2-890e-836a7dda4335" Oct 02 11:11:09 crc kubenswrapper[4766]: E1002 11:11:09.004813 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-t4r98" podUID="02c6432b-aae3-4392-9d39-edbbf8b5e48a" Oct 02 11:11:09 crc kubenswrapper[4766]: E1002 11:11:09.054984 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" podUID="c2986de4-8b91-42e2-b0a4-4032b1ce7ae5" Oct 02 11:11:09 crc kubenswrapper[4766]: E1002 11:11:09.908873 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" podUID="15ef5082-eda7-4994-8631-8f896fd8a456" Oct 02 11:11:09 crc kubenswrapper[4766]: I1002 11:11:09.910590 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" event={"ID":"15ef5082-eda7-4994-8631-8f896fd8a456","Type":"ContainerStarted","Data":"5956e8e8b7a13c1298cd6aa01170bce7550a297f047826dce21b37d6a15b542f"} Oct 02 11:11:09 crc kubenswrapper[4766]: I1002 11:11:09.910620 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r" event={"ID":"2615c22d-ad24-47ec-bfb1-f0227eb91300","Type":"ContainerStarted","Data":"447c6e77c58c485efcf270f3d158a78193eb6611c4803a57717b410d56dc6ebf"} Oct 02 11:11:09 crc kubenswrapper[4766]: E1002 11:11:09.917670 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r" podUID="2615c22d-ad24-47ec-bfb1-f0227eb91300" Oct 02 11:11:09 crc kubenswrapper[4766]: I1002 11:11:09.927343 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk" event={"ID":"2ba5311d-1e3c-4bf2-890e-836a7dda4335","Type":"ContainerStarted","Data":"db7d493d81b07c605f8f2399c35a8dcb88b69db34b3f4b7514c4a5e3299f0f43"} Oct 02 11:11:09 crc kubenswrapper[4766]: E1002 11:11:09.929336 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk" podUID="2ba5311d-1e3c-4bf2-890e-836a7dda4335" Oct 02 11:11:09 crc kubenswrapper[4766]: I1002 11:11:09.931421 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8" event={"ID":"9b6bf2a3-2784-4940-8a10-a42a0f876577","Type":"ContainerStarted","Data":"827298d8b99f35f0fc38b6b03a854f5990f0408c1ea899802f4ddde2d1d28333"} Oct 02 11:11:09 crc kubenswrapper[4766]: E1002 11:11:09.933189 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8" podUID="9b6bf2a3-2784-4940-8a10-a42a0f876577" Oct 02 11:11:09 crc kubenswrapper[4766]: I1002 11:11:09.947697 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz" event={"ID":"7d01c25f-6e83-4e83-8193-203e990ffd70","Type":"ContainerStarted","Data":"b16d59e48d656dda9bb9161ef0a66aa441b3f1f8483ed8106f04dd90a74c42cd"} Oct 02 11:11:09 crc kubenswrapper[4766]: I1002 11:11:09.947749 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz" event={"ID":"7d01c25f-6e83-4e83-8193-203e990ffd70","Type":"ContainerStarted","Data":"a428bdc1a5bc9963014d3071df084f404c1c482e7f4fa41df7fb436a593ef7b9"} Oct 02 11:11:09 crc kubenswrapper[4766]: I1002 11:11:09.947759 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz" event={"ID":"7d01c25f-6e83-4e83-8193-203e990ffd70","Type":"ContainerStarted","Data":"e0f08998ba0653905d5196dce6790bc01d84f81beb87b718310bac740913094b"} Oct 02 11:11:09 crc kubenswrapper[4766]: I1002 11:11:09.949162 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz" Oct 02 11:11:09 crc kubenswrapper[4766]: I1002 11:11:09.963450 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" event={"ID":"c2986de4-8b91-42e2-b0a4-4032b1ce7ae5","Type":"ContainerStarted","Data":"1b595816c5f028cfd174f41439a94f1d2a78be7b544663039b0dee191f3f9d62"} Oct 02 11:11:09 crc kubenswrapper[4766]: 
I1002 11:11:09.963523 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" event={"ID":"c2986de4-8b91-42e2-b0a4-4032b1ce7ae5","Type":"ContainerStarted","Data":"b9cc4bd627736385ff5e2a517a283fe374a286825d4e6aa26a8b3f6820bb2c1a"} Oct 02 11:11:09 crc kubenswrapper[4766]: E1002 11:11:09.968224 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" podUID="c2986de4-8b91-42e2-b0a4-4032b1ce7ae5" Oct 02 11:11:09 crc kubenswrapper[4766]: I1002 11:11:09.978147 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-t4r98" event={"ID":"02c6432b-aae3-4392-9d39-edbbf8b5e48a","Type":"ContainerStarted","Data":"c4ecb1fad934f081248f799581a4cadd73740b692603ba21ca9732fd39f50983"} Oct 02 11:11:09 crc kubenswrapper[4766]: I1002 11:11:09.978197 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-t4r98" event={"ID":"02c6432b-aae3-4392-9d39-edbbf8b5e48a","Type":"ContainerStarted","Data":"b7f9f02140260d18a089d3029746c280497dcff39e078c9e8484fcbbb3a25e03"} Oct 02 11:11:09 crc kubenswrapper[4766]: E1002 11:11:09.982291 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-t4r98" podUID="02c6432b-aae3-4392-9d39-edbbf8b5e48a" Oct 02 11:11:10 crc kubenswrapper[4766]: I1002 11:11:10.054129 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz" podStartSLOduration=4.054108531 podStartE2EDuration="4.054108531s" podCreationTimestamp="2025-10-02 11:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:11:10.034160604 +0000 UTC m=+1184.977031548" watchObservedRunningTime="2025-10-02 11:11:10.054108531 +0000 UTC m=+1184.996979475" Oct 02 11:11:10 crc kubenswrapper[4766]: E1002 11:11:10.996951 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r" podUID="2615c22d-ad24-47ec-bfb1-f0227eb91300" Oct 02 11:11:10 crc kubenswrapper[4766]: E1002 11:11:10.997323 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" podUID="c2986de4-8b91-42e2-b0a4-4032b1ce7ae5" Oct 02 
11:11:11 crc kubenswrapper[4766]: E1002 11:11:11.008763 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:15d7b5a365350a831ca59d984df67fadeccf89d599e487a7597b105afb82ce4a\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk" podUID="2ba5311d-1e3c-4bf2-890e-836a7dda4335"
Oct 02 11:11:11 crc kubenswrapper[4766]: E1002 11:11:11.008860 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" podUID="15ef5082-eda7-4994-8631-8f896fd8a456"
Oct 02 11:11:11 crc kubenswrapper[4766]: E1002 11:11:11.008999 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8" podUID="9b6bf2a3-2784-4940-8a10-a42a0f876577"
Oct 02 11:11:11 crc kubenswrapper[4766]: E1002 11:11:10.998682 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-t4r98" podUID="02c6432b-aae3-4392-9d39-edbbf8b5e48a"
Oct 02 11:11:18 crc kubenswrapper[4766]: I1002 11:11:18.100748 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5f7d749dc7-n4nqz"
Oct 02 11:11:21 crc kubenswrapper[4766]: I1002 11:11:21.056971 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8jlf2" event={"ID":"fd0148cc-8cbc-4204-9c03-b6d446ec4b13","Type":"ContainerStarted","Data":"29ac92af6d7f8c384c016a282a8ebe857282f7616a06e983cdb6884d3d95fa38"}
Oct 02 11:11:21 crc kubenswrapper[4766]: I1002 11:11:21.058628 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-d546t" event={"ID":"133e16f7-a3d0-4920-827f-8da5e5d81d98","Type":"ContainerStarted","Data":"b3da9bbbc0915e83d4785766c9c7350a9fa183d43adad954af1b38e0cb22105f"}
Oct 02 11:11:21 crc kubenswrapper[4766]: I1002 11:11:21.060918 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-k5ms6" event={"ID":"43a5f26a-7b51-4514-afc3-15048f9acec9","Type":"ContainerStarted","Data":"45d9fdd7a5de5261561622e86544beee5d1764f75e1aa31431d73250d5ce7d7a"}
Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.075142 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-xjhnl" event={"ID":"53ef1cad-2b60-4a0d-896c-958c59652c91","Type":"ContainerStarted","Data":"ad70bb244e53c257eead0b649ae4b742ceb31ec229014c4f40b498cab1f3aa73"}
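Once a pull has failed, the errors switch from ErrImagePull to ImagePullBackOff: the kubelet parks the container while a per-image backoff window expires, roughly doubling the delay on each failure up to a ceiling. The commonly cited kubelet defaults are a 10s initial delay doubling to a 300s cap; treat those constants as an assumption here, since they are not visible in this log. A sketch of that delay schedule:

    package main

    import (
        "fmt"
        "time"
    )

    // backoffDelays lists successive ImagePullBackOff waits, assuming
    // a 10s initial delay that doubles up to a 300s cap (kubelet-style
    // defaults; the real values can differ by build and configuration).
    func backoffDelays(initial, max time.Duration, attempts int) []time.Duration {
        delays := make([]time.Duration, 0, attempts)
        d := initial
        for i := 0; i < attempts; i++ {
            delays = append(delays, d)
            d *= 2
            if d > max {
                d = max
            }
        }
        return delays
    }

    func main() {
        // Prints [10s 20s 40s 1m20s 2m40s 5m0s 5m0s 5m0s].
        fmt.Println(backoffDelays(10*time.Second, 300*time.Second, 8))
    }

That timing is consistent with what follows: the pods throttled around 11:11:09-11:11:11 retry, finish pulling by roughly 11:11:20-11:11:28, and start normally.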
Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.078426 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-85bcm" event={"ID":"25d1f804-fe78-4cc5-85a4-584ba18bf566","Type":"ContainerStarted","Data":"2e4e5bf5e749d8906a3f12842d6241e8d78f1503adbeb864e6bf4f8ca226a483"}
Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.083382 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-4jnlb" event={"ID":"712078f7-0205-4259-843b-10ca0a292fcb","Type":"ContainerStarted","Data":"ae7ba28107cf921f1fe8a32529c4a742decad95a612b3599d86df08481a19567"}
Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.087142 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-khvnh" event={"ID":"7c68c9c4-8848-48df-a28a-830a547f469a","Type":"ContainerStarted","Data":"d467f0f9edd3a1fb20840fdefbe93a94c3efb35c103c629037afd19e71ee34ab"}
Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.089153 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62s67" event={"ID":"1ffa0b32-f1ea-4273-baf6-67b9217803b3","Type":"ContainerStarted","Data":"535d26ce5b05a35680f6ae2b4b052505bff449cc3740125f0066217b5e4ce657"}
Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.089178 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62s67" event={"ID":"1ffa0b32-f1ea-4273-baf6-67b9217803b3","Type":"ContainerStarted","Data":"631051951fbef59301f5143fa6791eec9aa9fa4d58657f0163a6e9b09dcfa423"}
Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.090017 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62s67"
Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.110225 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-k5ms6" event={"ID":"43a5f26a-7b51-4514-afc3-15048f9acec9","Type":"ContainerStarted","Data":"4dcd3fbec499141c11a238400ef2ae5804d8afc1428579e018a8f44b1f02b03c"}
Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.110432 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-k5ms6"
Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.114855 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62s67" podStartSLOduration=3.982992918 podStartE2EDuration="16.114844544s" podCreationTimestamp="2025-10-02 11:11:06 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.43281473 +0000 UTC m=+1183.375685674" lastFinishedPulling="2025-10-02 11:11:20.564666356 +0000 UTC m=+1195.507537300" observedRunningTime="2025-10-02 11:11:22.111638782 +0000 UTC m=+1197.054509736" watchObservedRunningTime="2025-10-02 11:11:22.114844544 +0000 UTC m=+1197.057715488"
Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.117539 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-kchvm" event={"ID":"cc498066-7f28-4345-8cd9-b3168f10fe32","Type":"ContainerStarted","Data":"eacd977f3ac9c6e83d77b7f7659c1d023ec97a64065ea26d854bfa4033413f13"}
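The pod_startup_latency_tracker line above reports two figures: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that same interval minus the image-pull window (lastFinishedPulling minus firstStartedPulling), because the startup SLO deliberately excludes time spent pulling images. The swift-operator entry checks out exactly; a worked Go version of the arithmetic, with the timestamps copied from that entry:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Figures from the swift-operator-controller-manager entry above.
        created, _ := time.Parse(time.RFC3339Nano, "2025-10-02T11:11:06Z")
        running, _ := time.Parse(time.RFC3339Nano, "2025-10-02T11:11:22.114844544Z")
        firstPull, _ := time.Parse(time.RFC3339Nano, "2025-10-02T11:11:08.432814730Z")
        lastPull, _ := time.Parse(time.RFC3339Nano, "2025-10-02T11:11:20.564666356Z")

        e2e := running.Sub(created)          // 16.114844544s
        slo := e2e - lastPull.Sub(firstPull) // 16.114844544s - 12.131851626s
        fmt.Println(e2e, slo)                // prints: 16.114844544s 3.982992918s
    }

The printed SLO duration, 3.982992918s, matches the logged podStartSLOduration digit for digit. The earlier openstack-operator entry had identical SLO and E2E durations because its pull timestamps were zero, meaning no pull window was recorded.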
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-2sg8s" event={"ID":"1e727ede-3058-4edc-8631-a3c12bfa0b32","Type":"ContainerStarted","Data":"1146f7a25c6b963b136a7595a9a9bf1f4aeb65918cf7fc8ddd853f0efd8548dd"} Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.139111 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8jlf2" event={"ID":"fd0148cc-8cbc-4204-9c03-b6d446ec4b13","Type":"ContainerStarted","Data":"74045de05d25baa70eb18198cf57c236c21a7e81f87389b81b94b43dd0e5d01b"} Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.139766 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8jlf2" Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.140839 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-k5ms6" podStartSLOduration=4.098916493 podStartE2EDuration="16.140827505s" podCreationTimestamp="2025-10-02 11:11:06 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.537805085 +0000 UTC m=+1183.480676029" lastFinishedPulling="2025-10-02 11:11:20.579716097 +0000 UTC m=+1195.522587041" observedRunningTime="2025-10-02 11:11:22.13785273 +0000 UTC m=+1197.080723674" watchObservedRunningTime="2025-10-02 11:11:22.140827505 +0000 UTC m=+1197.083698449" Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.147967 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jc4jh" event={"ID":"9df7b61f-82c3-4c2f-af77-b152b69666d7","Type":"ContainerStarted","Data":"94907ffde829bc189c9dae6b82d9051cc4f4bc1341e14669e5a890f47cd963a3"} Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.156969 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n84bq" event={"ID":"7ef42077-e956-405d-8e5e-ee28586502dd","Type":"ContainerStarted","Data":"4b0f46d110c20a6c6dd10dd91b7c4eac1c539e5252c142e5ba157604cba6069c"} Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.157766 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8jlf2" podStartSLOduration=3.707620441 podStartE2EDuration="17.157753516s" podCreationTimestamp="2025-10-02 11:11:05 +0000 UTC" firstStartedPulling="2025-10-02 11:11:07.108862819 +0000 UTC m=+1182.051733773" lastFinishedPulling="2025-10-02 11:11:20.558995884 +0000 UTC m=+1195.501866848" observedRunningTime="2025-10-02 11:11:22.153295514 +0000 UTC m=+1197.096166458" watchObservedRunningTime="2025-10-02 11:11:22.157753516 +0000 UTC m=+1197.100624460" Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.158176 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-b28pm" event={"ID":"2dbff594-01b2-495a-af08-2c23c0d986de","Type":"ContainerStarted","Data":"929e253e9dc9ee60b30bfd353128bd20a1595bc19fdf439c4520745e88dbd171"} Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.162644 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-g4js2" 
event={"ID":"0b227a91-0adf-4131-bb61-e11c995527ca","Type":"ContainerStarted","Data":"3e93503fd645d09257a31f44c10ce4202b8e564ca0290e6c43152e83c738ad39"} Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.165034 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-mn5ks" event={"ID":"a2658564-9624-44ee-b9ce-1579493d044f","Type":"ContainerStarted","Data":"53e91138cedce7245cc796fb498a3246a980263a1dbc196d589fffb35c6f3500"} Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.165061 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-mn5ks" event={"ID":"a2658564-9624-44ee-b9ce-1579493d044f","Type":"ContainerStarted","Data":"04769a478da2a66a370efea144d7bdd93c45322ed30aee5e08e4e6d23d015a73"} Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.165932 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-mn5ks" Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.177078 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-d546t" event={"ID":"133e16f7-a3d0-4920-827f-8da5e5d81d98","Type":"ContainerStarted","Data":"bf646a0359645a45a48f6cb7a7dbac7fc3306e325a0b90db63c734b44ee173ab"} Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.177324 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-d546t" Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.188534 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7wbpc" event={"ID":"eaa7722d-af7b-44aa-992b-9304ab1a56c3","Type":"ContainerStarted","Data":"94fc6cbbd5da948b29554b4ac8ed9e461872895697abcc52870006a523f2ce67"} Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.195793 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-mn5ks" podStartSLOduration=4.112337571 podStartE2EDuration="16.19577088s" podCreationTimestamp="2025-10-02 11:11:06 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.517726823 +0000 UTC m=+1183.460597767" lastFinishedPulling="2025-10-02 11:11:20.601160142 +0000 UTC m=+1195.544031076" observedRunningTime="2025-10-02 11:11:22.188883491 +0000 UTC m=+1197.131754435" watchObservedRunningTime="2025-10-02 11:11:22.19577088 +0000 UTC m=+1197.138641824" Oct 02 11:11:22 crc kubenswrapper[4766]: I1002 11:11:22.211210 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-d546t" podStartSLOduration=4.086960541 podStartE2EDuration="16.211187663s" podCreationTimestamp="2025-10-02 11:11:06 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.440452784 +0000 UTC m=+1183.383323728" lastFinishedPulling="2025-10-02 11:11:20.564679906 +0000 UTC m=+1195.507550850" observedRunningTime="2025-10-02 11:11:22.206811593 +0000 UTC m=+1197.149682537" watchObservedRunningTime="2025-10-02 11:11:22.211187663 +0000 UTC m=+1197.154058607" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.200005 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-2sg8s" 
event={"ID":"1e727ede-3058-4edc-8631-a3c12bfa0b32","Type":"ContainerStarted","Data":"903801bb70dba949eca082626af9df3993c738e1f41bfdc70cbf1af0a8fe6ea7"} Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.200946 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-2sg8s" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.210296 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jc4jh" event={"ID":"9df7b61f-82c3-4c2f-af77-b152b69666d7","Type":"ContainerStarted","Data":"fbf6ae3f396f6274adcfefd17dbcdd510df3ceefb1e91093df6e89e4d023c51f"} Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.211106 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jc4jh" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.212836 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n84bq" event={"ID":"7ef42077-e956-405d-8e5e-ee28586502dd","Type":"ContainerStarted","Data":"8a2c2990ffcdbf6e305dab325ac2195009cbec62449dedc6f966b9a3d982fc47"} Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.213269 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n84bq" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.215256 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7wbpc" event={"ID":"eaa7722d-af7b-44aa-992b-9304ab1a56c3","Type":"ContainerStarted","Data":"19bfb26123277871f74638b9040e3123a8005c69af38dcce283d89e17342d169"} Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.215693 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7wbpc" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.218993 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-kchvm" event={"ID":"cc498066-7f28-4345-8cd9-b3168f10fe32","Type":"ContainerStarted","Data":"382ebe9923db2ce1984a9cd70b3baabb73d2f74f59c587a0c78e7bda73961d68"} Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.219910 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-kchvm" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.227153 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-2sg8s" podStartSLOduration=5.782654293 podStartE2EDuration="18.22713227s" podCreationTimestamp="2025-10-02 11:11:05 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.114361302 +0000 UTC m=+1183.057232246" lastFinishedPulling="2025-10-02 11:11:20.558839269 +0000 UTC m=+1195.501710223" observedRunningTime="2025-10-02 11:11:23.218918297 +0000 UTC m=+1198.161789241" watchObservedRunningTime="2025-10-02 11:11:23.22713227 +0000 UTC m=+1198.170003214" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.227343 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-4jnlb" 
event={"ID":"712078f7-0205-4259-843b-10ca0a292fcb","Type":"ContainerStarted","Data":"911be2c2a5c7178769fb7f33947e0af9c8450fee5d1e1858f6806891c406c2ae"} Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.227607 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-4jnlb" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.235102 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-khvnh" event={"ID":"7c68c9c4-8848-48df-a28a-830a547f469a","Type":"ContainerStarted","Data":"f5bd302c897de21e0f2db3926f695b8051afc990455f167acdfa128c902bec0d"} Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.235687 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-khvnh" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.240865 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n84bq" podStartSLOduration=4.930309095 podStartE2EDuration="18.240847358s" podCreationTimestamp="2025-10-02 11:11:05 +0000 UTC" firstStartedPulling="2025-10-02 11:11:07.265100133 +0000 UTC m=+1182.207971077" lastFinishedPulling="2025-10-02 11:11:20.575638396 +0000 UTC m=+1195.518509340" observedRunningTime="2025-10-02 11:11:23.238004247 +0000 UTC m=+1198.180875201" watchObservedRunningTime="2025-10-02 11:11:23.240847358 +0000 UTC m=+1198.183718302" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.243488 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-b28pm" event={"ID":"2dbff594-01b2-495a-af08-2c23c0d986de","Type":"ContainerStarted","Data":"e1a04e164ddf389c4e83ff8022809a40e3b90aa5b35a2b645197bcf161af695c"} Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.243622 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-b28pm" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.251020 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-85bcm" event={"ID":"25d1f804-fe78-4cc5-85a4-584ba18bf566","Type":"ContainerStarted","Data":"e531860f70e19a1dff708cb9a23b20926979cf843e7eb96b13260d8138b8e25d"} Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.251482 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-85bcm" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.253404 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-g4js2" event={"ID":"0b227a91-0adf-4131-bb61-e11c995527ca","Type":"ContainerStarted","Data":"8379038d093158110aa48c6d62c81a6e92b65604d7f352495cae9a7d99e77d48"} Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.253575 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-g4js2" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.261100 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-xjhnl" 
event={"ID":"53ef1cad-2b60-4a0d-896c-958c59652c91","Type":"ContainerStarted","Data":"043f1f576d188892e0588fa4dff42350ed5005d0a127ea93781aafc934c26aea"} Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.261140 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-xjhnl" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.262602 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jc4jh" podStartSLOduration=5.787112025 podStartE2EDuration="18.262582212s" podCreationTimestamp="2025-10-02 11:11:05 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.089186798 +0000 UTC m=+1183.032057742" lastFinishedPulling="2025-10-02 11:11:20.564656985 +0000 UTC m=+1195.507527929" observedRunningTime="2025-10-02 11:11:23.259827924 +0000 UTC m=+1198.202698878" watchObservedRunningTime="2025-10-02 11:11:23.262582212 +0000 UTC m=+1198.205453156" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.279165 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7wbpc" podStartSLOduration=5.807375573 podStartE2EDuration="18.279143451s" podCreationTimestamp="2025-10-02 11:11:05 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.132346158 +0000 UTC m=+1183.075217112" lastFinishedPulling="2025-10-02 11:11:20.604114026 +0000 UTC m=+1195.546984990" observedRunningTime="2025-10-02 11:11:23.274290947 +0000 UTC m=+1198.217161911" watchObservedRunningTime="2025-10-02 11:11:23.279143451 +0000 UTC m=+1198.222014405" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.297766 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-khvnh" podStartSLOduration=5.318983563 podStartE2EDuration="17.297747677s" podCreationTimestamp="2025-10-02 11:11:06 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.624402602 +0000 UTC m=+1183.567273536" lastFinishedPulling="2025-10-02 11:11:20.603166706 +0000 UTC m=+1195.546037650" observedRunningTime="2025-10-02 11:11:23.289853825 +0000 UTC m=+1198.232724769" watchObservedRunningTime="2025-10-02 11:11:23.297747677 +0000 UTC m=+1198.240618621" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.318363 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-4jnlb" podStartSLOduration=5.507859001 podStartE2EDuration="18.318343434s" podCreationTimestamp="2025-10-02 11:11:05 +0000 UTC" firstStartedPulling="2025-10-02 11:11:07.793644004 +0000 UTC m=+1182.736514968" lastFinishedPulling="2025-10-02 11:11:20.604128457 +0000 UTC m=+1195.546999401" observedRunningTime="2025-10-02 11:11:23.30881186 +0000 UTC m=+1198.251682814" watchObservedRunningTime="2025-10-02 11:11:23.318343434 +0000 UTC m=+1198.261214378" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.332284 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-g4js2" podStartSLOduration=4.831224404 podStartE2EDuration="17.332266849s" podCreationTimestamp="2025-10-02 11:11:06 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.132360368 +0000 UTC m=+1183.075231312" lastFinishedPulling="2025-10-02 11:11:20.633402813 +0000 UTC m=+1195.576273757" observedRunningTime="2025-10-02 11:11:23.329205302 +0000 
UTC m=+1198.272076256" watchObservedRunningTime="2025-10-02 11:11:23.332266849 +0000 UTC m=+1198.275137793" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.348989 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-kchvm" podStartSLOduration=5.325598185 podStartE2EDuration="17.348970264s" podCreationTimestamp="2025-10-02 11:11:06 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.535492661 +0000 UTC m=+1183.478363605" lastFinishedPulling="2025-10-02 11:11:20.55886474 +0000 UTC m=+1195.501735684" observedRunningTime="2025-10-02 11:11:23.346001649 +0000 UTC m=+1198.288872593" watchObservedRunningTime="2025-10-02 11:11:23.348970264 +0000 UTC m=+1198.291841208" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.380970 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-85bcm" podStartSLOduration=5.28263779 podStartE2EDuration="17.380953505s" podCreationTimestamp="2025-10-02 11:11:06 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.545765459 +0000 UTC m=+1183.488636403" lastFinishedPulling="2025-10-02 11:11:20.644081174 +0000 UTC m=+1195.586952118" observedRunningTime="2025-10-02 11:11:23.377934709 +0000 UTC m=+1198.320805653" watchObservedRunningTime="2025-10-02 11:11:23.380953505 +0000 UTC m=+1198.323824449" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.382009 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-xjhnl" podStartSLOduration=5.634731706 podStartE2EDuration="18.382002559s" podCreationTimestamp="2025-10-02 11:11:05 +0000 UTC" firstStartedPulling="2025-10-02 11:11:07.855464579 +0000 UTC m=+1182.798335523" lastFinishedPulling="2025-10-02 11:11:20.602735432 +0000 UTC m=+1195.545606376" observedRunningTime="2025-10-02 11:11:23.363963512 +0000 UTC m=+1198.306834466" watchObservedRunningTime="2025-10-02 11:11:23.382002559 +0000 UTC m=+1198.324873503" Oct 02 11:11:23 crc kubenswrapper[4766]: I1002 11:11:23.398381 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-b28pm" podStartSLOduration=5.235919318 podStartE2EDuration="17.398356812s" podCreationTimestamp="2025-10-02 11:11:06 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.441488676 +0000 UTC m=+1183.384359620" lastFinishedPulling="2025-10-02 11:11:20.60392617 +0000 UTC m=+1195.546797114" observedRunningTime="2025-10-02 11:11:23.392089651 +0000 UTC m=+1198.334960595" watchObservedRunningTime="2025-10-02 11:11:23.398356812 +0000 UTC m=+1198.341227756" Oct 02 11:11:25 crc kubenswrapper[4766]: I1002 11:11:25.277787 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" event={"ID":"c2986de4-8b91-42e2-b0a4-4032b1ce7ae5","Type":"ContainerStarted","Data":"31a3ba44b5dd00d8dd4e25c3fb186fd00937982e4794d6e8f351edf2112dd9b6"} Oct 02 11:11:25 crc kubenswrapper[4766]: I1002 11:11:25.278303 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" Oct 02 11:11:25 crc kubenswrapper[4766]: I1002 11:11:25.279713 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" 
event={"ID":"15ef5082-eda7-4994-8631-8f896fd8a456","Type":"ContainerStarted","Data":"0c1aec216bf96b98e3428557f797f29460caa111e3190cffcd755d0752727119"} Oct 02 11:11:25 crc kubenswrapper[4766]: I1002 11:11:25.279979 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" Oct 02 11:11:25 crc kubenswrapper[4766]: I1002 11:11:25.282255 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r" event={"ID":"2615c22d-ad24-47ec-bfb1-f0227eb91300","Type":"ContainerStarted","Data":"73fbe5468f10d3f1bd3406c17a4791979830b7587b1ac151fe21fe384f3f4100"} Oct 02 11:11:25 crc kubenswrapper[4766]: I1002 11:11:25.302916 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" podStartSLOduration=3.337612535 podStartE2EDuration="19.302898665s" podCreationTimestamp="2025-10-02 11:11:06 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.744870112 +0000 UTC m=+1183.687741066" lastFinishedPulling="2025-10-02 11:11:24.710156252 +0000 UTC m=+1199.653027196" observedRunningTime="2025-10-02 11:11:25.300921301 +0000 UTC m=+1200.243792245" watchObservedRunningTime="2025-10-02 11:11:25.302898665 +0000 UTC m=+1200.245769609" Oct 02 11:11:25 crc kubenswrapper[4766]: I1002 11:11:25.328466 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" podStartSLOduration=4.155613188 podStartE2EDuration="20.328443481s" podCreationTimestamp="2025-10-02 11:11:05 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.549176918 +0000 UTC m=+1183.492047862" lastFinishedPulling="2025-10-02 11:11:24.722007211 +0000 UTC m=+1199.664878155" observedRunningTime="2025-10-02 11:11:25.322943525 +0000 UTC m=+1200.265814459" watchObservedRunningTime="2025-10-02 11:11:25.328443481 +0000 UTC m=+1200.271314425" Oct 02 11:11:25 crc kubenswrapper[4766]: I1002 11:11:25.336300 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r" podStartSLOduration=3.187602681 podStartE2EDuration="19.336282932s" podCreationTimestamp="2025-10-02 11:11:06 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.553899389 +0000 UTC m=+1183.496770333" lastFinishedPulling="2025-10-02 11:11:24.70257964 +0000 UTC m=+1199.645450584" observedRunningTime="2025-10-02 11:11:25.335259569 +0000 UTC m=+1200.278130513" watchObservedRunningTime="2025-10-02 11:11:25.336282932 +0000 UTC m=+1200.279153876" Oct 02 11:11:26 crc kubenswrapper[4766]: I1002 11:11:26.143471 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-n84bq" Oct 02 11:11:26 crc kubenswrapper[4766]: I1002 11:11:26.187179 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8jlf2" Oct 02 11:11:26 crc kubenswrapper[4766]: I1002 11:11:26.246838 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-4jnlb" Oct 02 11:11:26 crc kubenswrapper[4766]: I1002 11:11:26.290886 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk" 
event={"ID":"2ba5311d-1e3c-4bf2-890e-836a7dda4335","Type":"ContainerStarted","Data":"12380ca36dbe8fe4e4f2118a3ea8b00a9ad653f6b0fbdb7a407d3aa7e45a1a67"} Oct 02 11:11:26 crc kubenswrapper[4766]: I1002 11:11:26.292112 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk" Oct 02 11:11:26 crc kubenswrapper[4766]: I1002 11:11:26.297286 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-jc4jh" Oct 02 11:11:26 crc kubenswrapper[4766]: I1002 11:11:26.309835 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk" podStartSLOduration=3.005118379 podStartE2EDuration="20.309816153s" podCreationTimestamp="2025-10-02 11:11:06 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.551031987 +0000 UTC m=+1183.493902931" lastFinishedPulling="2025-10-02 11:11:25.855729761 +0000 UTC m=+1200.798600705" observedRunningTime="2025-10-02 11:11:26.308681026 +0000 UTC m=+1201.251552000" watchObservedRunningTime="2025-10-02 11:11:26.309816153 +0000 UTC m=+1201.252687097" Oct 02 11:11:26 crc kubenswrapper[4766]: I1002 11:11:26.347514 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-xjhnl" Oct 02 11:11:26 crc kubenswrapper[4766]: I1002 11:11:26.454186 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-7wbpc" Oct 02 11:11:26 crc kubenswrapper[4766]: I1002 11:11:26.591936 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-2sg8s" Oct 02 11:11:26 crc kubenswrapper[4766]: I1002 11:11:26.722426 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-g4js2" Oct 02 11:11:26 crc kubenswrapper[4766]: I1002 11:11:26.832571 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-b28pm" Oct 02 11:11:26 crc kubenswrapper[4766]: I1002 11:11:26.909391 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-d546t" Oct 02 11:11:26 crc kubenswrapper[4766]: I1002 11:11:26.914912 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-85bcm" Oct 02 11:11:27 crc kubenswrapper[4766]: I1002 11:11:27.029102 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-khvnh" Oct 02 11:11:27 crc kubenswrapper[4766]: I1002 11:11:27.120651 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r" Oct 02 11:11:27 crc kubenswrapper[4766]: I1002 11:11:27.272095 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-62s67" Oct 02 11:11:27 crc kubenswrapper[4766]: I1002 11:11:27.373084 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/placement-operator-controller-manager-589c58c6c-k5ms6" Oct 02 11:11:27 crc kubenswrapper[4766]: I1002 11:11:27.436532 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-kchvm" Oct 02 11:11:27 crc kubenswrapper[4766]: I1002 11:11:27.457586 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-mn5ks" Oct 02 11:11:29 crc kubenswrapper[4766]: I1002 11:11:29.321539 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-t4r98" event={"ID":"02c6432b-aae3-4392-9d39-edbbf8b5e48a","Type":"ContainerStarted","Data":"23be834e99bf37f29ed18cb648caa25ce67c098e738b2b7694ae3cc82975517e"} Oct 02 11:11:29 crc kubenswrapper[4766]: I1002 11:11:29.323230 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8" event={"ID":"9b6bf2a3-2784-4940-8a10-a42a0f876577","Type":"ContainerStarted","Data":"b2eb94a9333d8636842e69c7d573e079a96ed7eb27399b2bbe3c5d169fc62f27"} Oct 02 11:11:29 crc kubenswrapper[4766]: I1002 11:11:29.323484 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-85777745bb-t4r98" Oct 02 11:11:29 crc kubenswrapper[4766]: I1002 11:11:29.352972 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-85777745bb-t4r98" podStartSLOduration=3.889483581 podStartE2EDuration="23.352949692s" podCreationTimestamp="2025-10-02 11:11:06 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.72664345 +0000 UTC m=+1183.669514394" lastFinishedPulling="2025-10-02 11:11:28.190109561 +0000 UTC m=+1203.132980505" observedRunningTime="2025-10-02 11:11:29.344361937 +0000 UTC m=+1204.287232891" watchObservedRunningTime="2025-10-02 11:11:29.352949692 +0000 UTC m=+1204.295820646" Oct 02 11:11:29 crc kubenswrapper[4766]: I1002 11:11:29.368922 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8" podStartSLOduration=3.922848235 podStartE2EDuration="23.368904981s" podCreationTimestamp="2025-10-02 11:11:06 +0000 UTC" firstStartedPulling="2025-10-02 11:11:08.747113193 +0000 UTC m=+1183.689984137" lastFinishedPulling="2025-10-02 11:11:28.193169939 +0000 UTC m=+1203.136040883" observedRunningTime="2025-10-02 11:11:29.362012712 +0000 UTC m=+1204.304883706" watchObservedRunningTime="2025-10-02 11:11:29.368904981 +0000 UTC m=+1204.311775945" Oct 02 11:11:36 crc kubenswrapper[4766]: I1002 11:11:36.767037 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-wtdmk" Oct 02 11:11:37 crc kubenswrapper[4766]: I1002 11:11:37.046808 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-x4rnr" Oct 02 11:11:37 crc kubenswrapper[4766]: I1002 11:11:37.124936 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-km85r" Oct 02 11:11:37 crc kubenswrapper[4766]: I1002 11:11:37.464431 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/test-operator-controller-manager-85777745bb-t4r98" Oct 02 11:11:37 crc kubenswrapper[4766]: I1002 11:11:37.795728 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5869cb545-gbnjj" Oct 02 11:11:52 crc kubenswrapper[4766]: I1002 11:11:52.978419 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-92vzl"] Oct 02 11:11:52 crc kubenswrapper[4766]: I1002 11:11:52.982959 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-92vzl" Oct 02 11:11:52 crc kubenswrapper[4766]: I1002 11:11:52.986207 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jxkm5" Oct 02 11:11:52 crc kubenswrapper[4766]: I1002 11:11:52.986206 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 02 11:11:52 crc kubenswrapper[4766]: I1002 11:11:52.986483 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 02 11:11:52 crc kubenswrapper[4766]: I1002 11:11:52.989847 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 02 11:11:52 crc kubenswrapper[4766]: I1002 11:11:52.996738 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-92vzl"] Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.077492 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4s5bb"] Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.078987 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4s5bb" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.080296 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d1918a-81a8-4524-ad8d-94d76610b714-config\") pod \"dnsmasq-dns-675f4bcbfc-92vzl\" (UID: \"c9d1918a-81a8-4524-ad8d-94d76610b714\") " pod="openstack/dnsmasq-dns-675f4bcbfc-92vzl" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.080413 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mphb7\" (UniqueName: \"kubernetes.io/projected/c9d1918a-81a8-4524-ad8d-94d76610b714-kube-api-access-mphb7\") pod \"dnsmasq-dns-675f4bcbfc-92vzl\" (UID: \"c9d1918a-81a8-4524-ad8d-94d76610b714\") " pod="openstack/dnsmasq-dns-675f4bcbfc-92vzl" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.081479 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.090933 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4s5bb"] Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.182254 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9627d9ea-6f56-4671-93cd-e138686ee14c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4s5bb\" (UID: \"9627d9ea-6f56-4671-93cd-e138686ee14c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4s5bb" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.182356 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mphb7\" (UniqueName: 
\"kubernetes.io/projected/c9d1918a-81a8-4524-ad8d-94d76610b714-kube-api-access-mphb7\") pod \"dnsmasq-dns-675f4bcbfc-92vzl\" (UID: \"c9d1918a-81a8-4524-ad8d-94d76610b714\") " pod="openstack/dnsmasq-dns-675f4bcbfc-92vzl" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.182419 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d1918a-81a8-4524-ad8d-94d76610b714-config\") pod \"dnsmasq-dns-675f4bcbfc-92vzl\" (UID: \"c9d1918a-81a8-4524-ad8d-94d76610b714\") " pod="openstack/dnsmasq-dns-675f4bcbfc-92vzl" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.182438 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m665\" (UniqueName: \"kubernetes.io/projected/9627d9ea-6f56-4671-93cd-e138686ee14c-kube-api-access-5m665\") pod \"dnsmasq-dns-78dd6ddcc-4s5bb\" (UID: \"9627d9ea-6f56-4671-93cd-e138686ee14c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4s5bb" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.182466 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9627d9ea-6f56-4671-93cd-e138686ee14c-config\") pod \"dnsmasq-dns-78dd6ddcc-4s5bb\" (UID: \"9627d9ea-6f56-4671-93cd-e138686ee14c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4s5bb" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.183813 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d1918a-81a8-4524-ad8d-94d76610b714-config\") pod \"dnsmasq-dns-675f4bcbfc-92vzl\" (UID: \"c9d1918a-81a8-4524-ad8d-94d76610b714\") " pod="openstack/dnsmasq-dns-675f4bcbfc-92vzl" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.203753 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mphb7\" (UniqueName: \"kubernetes.io/projected/c9d1918a-81a8-4524-ad8d-94d76610b714-kube-api-access-mphb7\") pod \"dnsmasq-dns-675f4bcbfc-92vzl\" (UID: \"c9d1918a-81a8-4524-ad8d-94d76610b714\") " pod="openstack/dnsmasq-dns-675f4bcbfc-92vzl" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.283838 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m665\" (UniqueName: \"kubernetes.io/projected/9627d9ea-6f56-4671-93cd-e138686ee14c-kube-api-access-5m665\") pod \"dnsmasq-dns-78dd6ddcc-4s5bb\" (UID: \"9627d9ea-6f56-4671-93cd-e138686ee14c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4s5bb" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.283914 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9627d9ea-6f56-4671-93cd-e138686ee14c-config\") pod \"dnsmasq-dns-78dd6ddcc-4s5bb\" (UID: \"9627d9ea-6f56-4671-93cd-e138686ee14c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4s5bb" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.283988 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9627d9ea-6f56-4671-93cd-e138686ee14c-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-4s5bb\" (UID: \"9627d9ea-6f56-4671-93cd-e138686ee14c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4s5bb" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.285092 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9627d9ea-6f56-4671-93cd-e138686ee14c-dns-svc\") pod 
\"dnsmasq-dns-78dd6ddcc-4s5bb\" (UID: \"9627d9ea-6f56-4671-93cd-e138686ee14c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4s5bb" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.285109 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9627d9ea-6f56-4671-93cd-e138686ee14c-config\") pod \"dnsmasq-dns-78dd6ddcc-4s5bb\" (UID: \"9627d9ea-6f56-4671-93cd-e138686ee14c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4s5bb" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.300433 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m665\" (UniqueName: \"kubernetes.io/projected/9627d9ea-6f56-4671-93cd-e138686ee14c-kube-api-access-5m665\") pod \"dnsmasq-dns-78dd6ddcc-4s5bb\" (UID: \"9627d9ea-6f56-4671-93cd-e138686ee14c\") " pod="openstack/dnsmasq-dns-78dd6ddcc-4s5bb" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.335575 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-92vzl" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.398554 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4s5bb" Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.767230 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-92vzl"] Oct 02 11:11:53 crc kubenswrapper[4766]: W1002 11:11:53.863049 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9627d9ea_6f56_4671_93cd_e138686ee14c.slice/crio-e6767b3a429a80e29c0f79ef927c4b1f1a4afd8ad8c1700c9eb1edad9443c1f7 WatchSource:0}: Error finding container e6767b3a429a80e29c0f79ef927c4b1f1a4afd8ad8c1700c9eb1edad9443c1f7: Status 404 returned error can't find the container with id e6767b3a429a80e29c0f79ef927c4b1f1a4afd8ad8c1700c9eb1edad9443c1f7 Oct 02 11:11:53 crc kubenswrapper[4766]: I1002 11:11:53.863219 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4s5bb"] Oct 02 11:11:54 crc kubenswrapper[4766]: I1002 11:11:54.519463 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-92vzl" event={"ID":"c9d1918a-81a8-4524-ad8d-94d76610b714","Type":"ContainerStarted","Data":"c2ea62bf9e71525a3e0065aae452e93f2df73d28e038d3d5bb7f2d7b6d0c7998"} Oct 02 11:11:54 crc kubenswrapper[4766]: I1002 11:11:54.522840 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4s5bb" event={"ID":"9627d9ea-6f56-4671-93cd-e138686ee14c","Type":"ContainerStarted","Data":"e6767b3a429a80e29c0f79ef927c4b1f1a4afd8ad8c1700c9eb1edad9443c1f7"} Oct 02 11:11:55 crc kubenswrapper[4766]: I1002 11:11:55.707315 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-92vzl"] Oct 02 11:11:55 crc kubenswrapper[4766]: I1002 11:11:55.738773 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-46kpp"] Oct 02 11:11:55 crc kubenswrapper[4766]: I1002 11:11:55.740318 4766 util.go:30] "No sandbox for pod can be found. 
Oct 02 11:11:55 crc kubenswrapper[4766]: I1002 11:11:55.740318 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-46kpp"
Oct 02 11:11:55 crc kubenswrapper[4766]: I1002 11:11:55.755412 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-46kpp"]
Oct 02 11:11:55 crc kubenswrapper[4766]: I1002 11:11:55.922941 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f3e57be-4f40-4dfe-a515-0ae16a727047-config\") pod \"dnsmasq-dns-666b6646f7-46kpp\" (UID: \"9f3e57be-4f40-4dfe-a515-0ae16a727047\") " pod="openstack/dnsmasq-dns-666b6646f7-46kpp"
Oct 02 11:11:55 crc kubenswrapper[4766]: I1002 11:11:55.923068 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhqlq\" (UniqueName: \"kubernetes.io/projected/9f3e57be-4f40-4dfe-a515-0ae16a727047-kube-api-access-qhqlq\") pod \"dnsmasq-dns-666b6646f7-46kpp\" (UID: \"9f3e57be-4f40-4dfe-a515-0ae16a727047\") " pod="openstack/dnsmasq-dns-666b6646f7-46kpp"
Oct 02 11:11:55 crc kubenswrapper[4766]: I1002 11:11:55.923134 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f3e57be-4f40-4dfe-a515-0ae16a727047-dns-svc\") pod \"dnsmasq-dns-666b6646f7-46kpp\" (UID: \"9f3e57be-4f40-4dfe-a515-0ae16a727047\") " pod="openstack/dnsmasq-dns-666b6646f7-46kpp"
Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.024297 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f3e57be-4f40-4dfe-a515-0ae16a727047-dns-svc\") pod \"dnsmasq-dns-666b6646f7-46kpp\" (UID: \"9f3e57be-4f40-4dfe-a515-0ae16a727047\") " pod="openstack/dnsmasq-dns-666b6646f7-46kpp"
Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.024409 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f3e57be-4f40-4dfe-a515-0ae16a727047-config\") pod \"dnsmasq-dns-666b6646f7-46kpp\" (UID: \"9f3e57be-4f40-4dfe-a515-0ae16a727047\") " pod="openstack/dnsmasq-dns-666b6646f7-46kpp"
Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.024480 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhqlq\" (UniqueName: \"kubernetes.io/projected/9f3e57be-4f40-4dfe-a515-0ae16a727047-kube-api-access-qhqlq\") pod \"dnsmasq-dns-666b6646f7-46kpp\" (UID: \"9f3e57be-4f40-4dfe-a515-0ae16a727047\") " pod="openstack/dnsmasq-dns-666b6646f7-46kpp"
Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.026288 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f3e57be-4f40-4dfe-a515-0ae16a727047-config\") pod \"dnsmasq-dns-666b6646f7-46kpp\" (UID: \"9f3e57be-4f40-4dfe-a515-0ae16a727047\") " pod="openstack/dnsmasq-dns-666b6646f7-46kpp"
Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.026306 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f3e57be-4f40-4dfe-a515-0ae16a727047-dns-svc\") pod \"dnsmasq-dns-666b6646f7-46kpp\" (UID: \"9f3e57be-4f40-4dfe-a515-0ae16a727047\") " pod="openstack/dnsmasq-dns-666b6646f7-46kpp"
Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.054623 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhqlq\" (UniqueName:
\"kubernetes.io/projected/9f3e57be-4f40-4dfe-a515-0ae16a727047-kube-api-access-qhqlq\") pod \"dnsmasq-dns-666b6646f7-46kpp\" (UID: \"9f3e57be-4f40-4dfe-a515-0ae16a727047\") " pod="openstack/dnsmasq-dns-666b6646f7-46kpp" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.071822 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-46kpp" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.138347 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4s5bb"] Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.182099 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vf6vh"] Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.183210 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.204001 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vf6vh"] Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.335580 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcj9f\" (UniqueName: \"kubernetes.io/projected/6ad9e6dc-bd0a-4356-94a1-d5970590092c-kube-api-access-kcj9f\") pod \"dnsmasq-dns-57d769cc4f-vf6vh\" (UID: \"6ad9e6dc-bd0a-4356-94a1-d5970590092c\") " pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.336195 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad9e6dc-bd0a-4356-94a1-d5970590092c-config\") pod \"dnsmasq-dns-57d769cc4f-vf6vh\" (UID: \"6ad9e6dc-bd0a-4356-94a1-d5970590092c\") " pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.336238 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ad9e6dc-bd0a-4356-94a1-d5970590092c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vf6vh\" (UID: \"6ad9e6dc-bd0a-4356-94a1-d5970590092c\") " pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.440147 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad9e6dc-bd0a-4356-94a1-d5970590092c-config\") pod \"dnsmasq-dns-57d769cc4f-vf6vh\" (UID: \"6ad9e6dc-bd0a-4356-94a1-d5970590092c\") " pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.440236 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ad9e6dc-bd0a-4356-94a1-d5970590092c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vf6vh\" (UID: \"6ad9e6dc-bd0a-4356-94a1-d5970590092c\") " pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.441821 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad9e6dc-bd0a-4356-94a1-d5970590092c-config\") pod \"dnsmasq-dns-57d769cc4f-vf6vh\" (UID: \"6ad9e6dc-bd0a-4356-94a1-d5970590092c\") " pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.441854 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/6ad9e6dc-bd0a-4356-94a1-d5970590092c-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-vf6vh\" (UID: \"6ad9e6dc-bd0a-4356-94a1-d5970590092c\") " pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.441897 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcj9f\" (UniqueName: \"kubernetes.io/projected/6ad9e6dc-bd0a-4356-94a1-d5970590092c-kube-api-access-kcj9f\") pod \"dnsmasq-dns-57d769cc4f-vf6vh\" (UID: \"6ad9e6dc-bd0a-4356-94a1-d5970590092c\") " pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.471319 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcj9f\" (UniqueName: \"kubernetes.io/projected/6ad9e6dc-bd0a-4356-94a1-d5970590092c-kube-api-access-kcj9f\") pod \"dnsmasq-dns-57d769cc4f-vf6vh\" (UID: \"6ad9e6dc-bd0a-4356-94a1-d5970590092c\") " pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.596174 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.678815 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-46kpp"] Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.884944 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.886396 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.888732 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.888732 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.888732 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.893740 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.893931 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-n22qc" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.897909 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.902184 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 02 11:11:56 crc kubenswrapper[4766]: I1002 11:11:56.903827 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.050299 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.050346 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.050379 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xpn4\" (UniqueName: \"kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-kube-api-access-6xpn4\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.050561 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.050602 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.050668 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.050687 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.050821 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1282b506-728d-4c6f-aa9c-3d3c1f826b71-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.050847 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.050908 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1282b506-728d-4c6f-aa9c-3d3c1f826b71-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.050961 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.152739 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.152785 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.152836 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.152851 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1282b506-728d-4c6f-aa9c-3d3c1f826b71-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.152883 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1282b506-728d-4c6f-aa9c-3d3c1f826b71-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.152913 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.152952 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.152972 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.152996 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xpn4\" (UniqueName: \"kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-kube-api-access-6xpn4\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.153018 
4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.153039 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.153388 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.153757 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.156254 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.156542 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.157087 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.159102 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.160230 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.160854 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " 
pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.163941 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1282b506-728d-4c6f-aa9c-3d3c1f826b71-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.169015 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1282b506-728d-4c6f-aa9c-3d3c1f826b71-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.173430 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xpn4\" (UniqueName: \"kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-kube-api-access-6xpn4\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.190030 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.209771 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.285003 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.287403 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.291004 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.291113 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-l4zz6" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.291113 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.291317 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.291440 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.291556 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.291773 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.295233 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.457740 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/874d062e-d2f8-462c-95b3-8f630b7120af-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.457856 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.457991 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.458194 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnt6k\" (UniqueName: \"kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-kube-api-access-bnt6k\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.458258 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.458298 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.458427 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.458475 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.458532 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.458579 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.458672 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/874d062e-d2f8-462c-95b3-8f630b7120af-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.560247 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnt6k\" (UniqueName: \"kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-kube-api-access-bnt6k\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.560295 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.560321 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.560364 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.560387 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.560415 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.560464 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.560494 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/874d062e-d2f8-462c-95b3-8f630b7120af-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.560540 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/874d062e-d2f8-462c-95b3-8f630b7120af-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.560543 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.560987 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.561277 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.561563 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.561763 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.561809 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.562419 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.565135 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/874d062e-d2f8-462c-95b3-8f630b7120af-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.565425 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.565624 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.566000 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/874d062e-d2f8-462c-95b3-8f630b7120af-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.566180 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.586583 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnt6k\" (UniqueName: \"kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-kube-api-access-bnt6k\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.589826 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:57 crc kubenswrapper[4766]: I1002 11:11:57.620096 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:11:58 crc kubenswrapper[4766]: I1002 11:11:58.893415 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 02 11:11:58 crc kubenswrapper[4766]: I1002 11:11:58.894919 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 11:11:58 crc kubenswrapper[4766]: I1002 11:11:58.898099 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 02 11:11:58 crc kubenswrapper[4766]: I1002 11:11:58.898356 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 02 11:11:58 crc kubenswrapper[4766]: I1002 11:11:58.909651 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 02 11:11:58 crc kubenswrapper[4766]: I1002 11:11:58.909885 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 02 11:11:58 crc kubenswrapper[4766]: I1002 11:11:58.911210 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-2tsrs" Oct 02 11:11:58 crc kubenswrapper[4766]: I1002 11:11:58.927554 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 11:11:58 crc kubenswrapper[4766]: I1002 11:11:58.935686 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.086619 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-secrets\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.086689 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-config-data-default\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.086709 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-kolla-config\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.086727 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.086906 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e63ea453-c8bd-4128-a47e-7b0d740a6066-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.087033 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.087078 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.087102 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.087178 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6w7d\" (UniqueName: \"kubernetes.io/projected/e63ea453-c8bd-4128-a47e-7b0d740a6066-kube-api-access-d6w7d\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.188301 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.188343 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.188362 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.188384 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6w7d\" (UniqueName: \"kubernetes.io/projected/e63ea453-c8bd-4128-a47e-7b0d740a6066-kube-api-access-d6w7d\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.188417 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: 
\"kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-secrets\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.188453 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-config-data-default\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.188469 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-kolla-config\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.188483 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.188537 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e63ea453-c8bd-4128-a47e-7b0d740a6066-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.188640 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.188992 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e63ea453-c8bd-4128-a47e-7b0d740a6066-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.189741 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-kolla-config\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.190091 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-config-data-default\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.190171 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc 
kubenswrapper[4766]: I1002 11:11:59.192363 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.197409 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-secrets\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.197999 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.209170 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.209801 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6w7d\" (UniqueName: \"kubernetes.io/projected/e63ea453-c8bd-4128-a47e-7b0d740a6066-kube-api-access-d6w7d\") pod \"openstack-galera-0\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.234556 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.566688 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-46kpp" event={"ID":"9f3e57be-4f40-4dfe-a515-0ae16a727047","Type":"ContainerStarted","Data":"ef40464f99db160c9d070dfe32b5cc8ce5bb640baeb792baf1397e64daa0bc15"} Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.993254 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 11:11:59 crc kubenswrapper[4766]: I1002 11:11:59.996322 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:11:59.998901 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:11:59.998999 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:11:59.998905 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-q2pfv" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.000080 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.054378 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.102740 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4cbv\" (UniqueName: \"kubernetes.io/projected/4b9bc510-a878-4e06-8db9-fd6209039c75-kube-api-access-f4cbv\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.102792 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.102819 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.102867 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.102888 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.102915 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.102947 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/4b9bc510-a878-4e06-8db9-fd6209039c75-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.103025 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.103058 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.203897 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.203949 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.204006 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.204223 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.205251 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.205920 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4b9bc510-a878-4e06-8db9-fd6209039c75-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.206037 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.206421 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.206468 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4cbv\" (UniqueName: \"kubernetes.io/projected/4b9bc510-a878-4e06-8db9-fd6209039c75-kube-api-access-f4cbv\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.206520 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.206549 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.206822 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4b9bc510-a878-4e06-8db9-fd6209039c75-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.207381 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.208150 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.209857 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.210044 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.210325 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.229933 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4cbv\" (UniqueName: \"kubernetes.io/projected/4b9bc510-a878-4e06-8db9-fd6209039c75-kube-api-access-f4cbv\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.241492 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.320389 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.322063 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.325064 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-trdgn" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.325288 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.331188 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.359311 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.367666 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.513744 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b5a88cf-8095-4025-a68a-349c579dddd3-config-data\") pod \"memcached-0\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " pod="openstack/memcached-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.514096 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5a88cf-8095-4025-a68a-349c579dddd3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " pod="openstack/memcached-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.514162 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dprhj\" (UniqueName: \"kubernetes.io/projected/1b5a88cf-8095-4025-a68a-349c579dddd3-kube-api-access-dprhj\") pod \"memcached-0\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " pod="openstack/memcached-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.514193 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5a88cf-8095-4025-a68a-349c579dddd3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " pod="openstack/memcached-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.514242 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1b5a88cf-8095-4025-a68a-349c579dddd3-kolla-config\") pod \"memcached-0\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " pod="openstack/memcached-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.615854 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b5a88cf-8095-4025-a68a-349c579dddd3-config-data\") pod \"memcached-0\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " pod="openstack/memcached-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.615910 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5a88cf-8095-4025-a68a-349c579dddd3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " pod="openstack/memcached-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.615983 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dprhj\" (UniqueName: \"kubernetes.io/projected/1b5a88cf-8095-4025-a68a-349c579dddd3-kube-api-access-dprhj\") pod \"memcached-0\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " pod="openstack/memcached-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.616014 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5a88cf-8095-4025-a68a-349c579dddd3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " pod="openstack/memcached-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.616076 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/1b5a88cf-8095-4025-a68a-349c579dddd3-kolla-config\") pod \"memcached-0\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " pod="openstack/memcached-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.616854 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b5a88cf-8095-4025-a68a-349c579dddd3-config-data\") pod \"memcached-0\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " pod="openstack/memcached-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.617019 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1b5a88cf-8095-4025-a68a-349c579dddd3-kolla-config\") pod \"memcached-0\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " pod="openstack/memcached-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.619678 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5a88cf-8095-4025-a68a-349c579dddd3-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " pod="openstack/memcached-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.623574 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5a88cf-8095-4025-a68a-349c579dddd3-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " pod="openstack/memcached-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.636034 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dprhj\" (UniqueName: \"kubernetes.io/projected/1b5a88cf-8095-4025-a68a-349c579dddd3-kube-api-access-dprhj\") pod \"memcached-0\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " pod="openstack/memcached-0" Oct 02 11:12:00 crc kubenswrapper[4766]: I1002 11:12:00.653956 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 02 11:12:02 crc kubenswrapper[4766]: I1002 11:12:02.532228 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:12:02 crc kubenswrapper[4766]: I1002 11:12:02.533697 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:12:02 crc kubenswrapper[4766]: I1002 11:12:02.536897 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-hlnqz" Oct 02 11:12:02 crc kubenswrapper[4766]: I1002 11:12:02.540689 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:12:02 crc kubenswrapper[4766]: I1002 11:12:02.645761 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dqbd\" (UniqueName: \"kubernetes.io/projected/e914485f-05fc-4f85-b902-2e43bcfc0bb5-kube-api-access-2dqbd\") pod \"kube-state-metrics-0\" (UID: \"e914485f-05fc-4f85-b902-2e43bcfc0bb5\") " pod="openstack/kube-state-metrics-0" Oct 02 11:12:02 crc kubenswrapper[4766]: I1002 11:12:02.748232 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dqbd\" (UniqueName: \"kubernetes.io/projected/e914485f-05fc-4f85-b902-2e43bcfc0bb5-kube-api-access-2dqbd\") pod \"kube-state-metrics-0\" (UID: \"e914485f-05fc-4f85-b902-2e43bcfc0bb5\") " pod="openstack/kube-state-metrics-0" Oct 02 11:12:02 crc kubenswrapper[4766]: I1002 11:12:02.771974 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dqbd\" (UniqueName: \"kubernetes.io/projected/e914485f-05fc-4f85-b902-2e43bcfc0bb5-kube-api-access-2dqbd\") pod \"kube-state-metrics-0\" (UID: \"e914485f-05fc-4f85-b902-2e43bcfc0bb5\") " pod="openstack/kube-state-metrics-0" Oct 02 11:12:02 crc kubenswrapper[4766]: I1002 11:12:02.910918 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.300482 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dp6x5"] Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.301835 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.303804 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.303977 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-hvvqz" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.305593 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.315555 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dp6x5"] Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.329243 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-8wzw9"] Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.330936 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.368033 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8wzw9"] Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.390338 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9860354f-7494-4b02-bca3-adc731683f7f-ovn-controller-tls-certs\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.390416 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-run-ovn\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.390439 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw5zq\" (UniqueName: \"kubernetes.io/projected/9860354f-7494-4b02-bca3-adc731683f7f-kube-api-access-tw5zq\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.390466 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-run\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.390554 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9860354f-7494-4b02-bca3-adc731683f7f-combined-ca-bundle\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.390722 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-log-ovn\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.390815 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9860354f-7494-4b02-bca3-adc731683f7f-scripts\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.492165 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-etc-ovs\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.492242 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-lib\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.492263 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-run\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.492316 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9860354f-7494-4b02-bca3-adc731683f7f-ovn-controller-tls-certs\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.492540 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-run-ovn\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.492601 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw5zq\" (UniqueName: \"kubernetes.io/projected/9860354f-7494-4b02-bca3-adc731683f7f-kube-api-access-tw5zq\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.492673 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt4qd\" (UniqueName: \"kubernetes.io/projected/d90db976-cd03-4eb7-8e1d-361ef7c5045b-kube-api-access-wt4qd\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.492712 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-run\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.492756 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9860354f-7494-4b02-bca3-adc731683f7f-combined-ca-bundle\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.492815 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d90db976-cd03-4eb7-8e1d-361ef7c5045b-scripts\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.492846 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-log-ovn\") pod \"ovn-controller-dp6x5\" (UID: 
\"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.492896 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-log\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.492954 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9860354f-7494-4b02-bca3-adc731683f7f-scripts\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.493774 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-run\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.493822 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-log-ovn\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.493858 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-run-ovn\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.495875 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9860354f-7494-4b02-bca3-adc731683f7f-scripts\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.498235 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9860354f-7494-4b02-bca3-adc731683f7f-combined-ca-bundle\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.498306 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9860354f-7494-4b02-bca3-adc731683f7f-ovn-controller-tls-certs\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.512062 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw5zq\" (UniqueName: \"kubernetes.io/projected/9860354f-7494-4b02-bca3-adc731683f7f-kube-api-access-tw5zq\") pod \"ovn-controller-dp6x5\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.594745 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-log\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.594845 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-etc-ovs\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.594878 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-lib\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.594902 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-run\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.594973 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt4qd\" (UniqueName: \"kubernetes.io/projected/d90db976-cd03-4eb7-8e1d-361ef7c5045b-kube-api-access-wt4qd\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.595015 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d90db976-cd03-4eb7-8e1d-361ef7c5045b-scripts\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.595599 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-run\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.595681 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-lib\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.595725 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-log\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.595869 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-etc-ovs\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 
11:12:05.597785 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d90db976-cd03-4eb7-8e1d-361ef7c5045b-scripts\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.611780 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt4qd\" (UniqueName: \"kubernetes.io/projected/d90db976-cd03-4eb7-8e1d-361ef7c5045b-kube-api-access-wt4qd\") pod \"ovn-controller-ovs-8wzw9\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.629599 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:05 crc kubenswrapper[4766]: I1002 11:12:05.651702 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:08 crc kubenswrapper[4766]: I1002 11:12:08.133809 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:12:08 crc kubenswrapper[4766]: E1002 11:12:08.699130 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 02 11:12:08 crc kubenswrapper[4766]: E1002 11:12:08.699298 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5m665,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-4s5bb_openstack(9627d9ea-6f56-4671-93cd-e138686ee14c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:12:08 crc kubenswrapper[4766]: E1002 11:12:08.700532 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-4s5bb" podUID="9627d9ea-6f56-4671-93cd-e138686ee14c" Oct 02 11:12:08 crc kubenswrapper[4766]: E1002 11:12:08.735387 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 02 11:12:08 crc kubenswrapper[4766]: E1002 11:12:08.735573 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mphb7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-92vzl_openstack(c9d1918a-81a8-4524-ad8d-94d76610b714): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:12:08 crc kubenswrapper[4766]: E1002 11:12:08.737072 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-92vzl" podUID="c9d1918a-81a8-4524-ad8d-94d76610b714" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:08.997280 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.001659 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.004802 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.004922 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-lqtbx" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.005577 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.005814 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.005889 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.010809 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.150476 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b90dab1-a183-4adc-b415-b67bd0d782f7-config\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.150808 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.150838 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.150867 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdkkt\" (UniqueName: \"kubernetes.io/projected/6b90dab1-a183-4adc-b415-b67bd0d782f7-kube-api-access-zdkkt\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.151017 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6b90dab1-a183-4adc-b415-b67bd0d782f7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.151117 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.151176 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.151294 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b90dab1-a183-4adc-b415-b67bd0d782f7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.202449 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.203807 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.207863 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.208315 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-47hz8" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.208375 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.208595 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.219185 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.252261 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.252312 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdkkt\" (UniqueName: \"kubernetes.io/projected/6b90dab1-a183-4adc-b415-b67bd0d782f7-kube-api-access-zdkkt\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.252356 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6b90dab1-a183-4adc-b415-b67bd0d782f7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.252388 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.252415 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.252454 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b90dab1-a183-4adc-b415-b67bd0d782f7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.253091 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.253577 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6b90dab1-a183-4adc-b415-b67bd0d782f7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.253612 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b90dab1-a183-4adc-b415-b67bd0d782f7-config\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.253644 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.254292 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b90dab1-a183-4adc-b415-b67bd0d782f7-config\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.254358 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b90dab1-a183-4adc-b415-b67bd0d782f7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.264635 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.265481 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.268103 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.271903 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdkkt\" (UniqueName: \"kubernetes.io/projected/6b90dab1-a183-4adc-b415-b67bd0d782f7-kube-api-access-zdkkt\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.281213 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.282155 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.296220 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:12:09 crc kubenswrapper[4766]: W1002 11:12:09.299249 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1282b506_728d_4c6f_aa9c_3d3c1f826b71.slice/crio-2a9873b5a34829c9c836ac4d9b6cee686f7ad2f7d0121bc47a4bf389659b0dac WatchSource:0}: Error finding container 2a9873b5a34829c9c836ac4d9b6cee686f7ad2f7d0121bc47a4bf389659b0dac: Status 404 returned error can't find the container with id 2a9873b5a34829c9c836ac4d9b6cee686f7ad2f7d0121bc47a4bf389659b0dac Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.336099 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.354930 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2be5e935-0d64-4fed-a00a-bd0cb5891e75-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.354970 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2be5e935-0d64-4fed-a00a-bd0cb5891e75-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.355038 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.355062 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.355084 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be5e935-0d64-4fed-a00a-bd0cb5891e75-config\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.355102 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.355117 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.355162 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64cj7\" (UniqueName: \"kubernetes.io/projected/2be5e935-0d64-4fed-a00a-bd0cb5891e75-kube-api-access-64cj7\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.413619 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.426727 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vf6vh"] Oct 02 11:12:09 crc kubenswrapper[4766]: W1002 11:12:09.429798 4766 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b9bc510_a878_4e06_8db9_fd6209039c75.slice/crio-6b77f3a526b5379881fc70c788250bd6ccaa071afb0137dd9dd28685b0ef78b0 WatchSource:0}: Error finding container 6b77f3a526b5379881fc70c788250bd6ccaa071afb0137dd9dd28685b0ef78b0: Status 404 returned error can't find the container with id 6b77f3a526b5379881fc70c788250bd6ccaa071afb0137dd9dd28685b0ef78b0 Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.432334 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 02 11:12:09 crc kubenswrapper[4766]: W1002 11:12:09.436570 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ad9e6dc_bd0a_4356_94a1_d5970590092c.slice/crio-38bf7c15b6496d1102f72e71ee41f4aa73efc5183e356cc1007b84b5e5b89dde WatchSource:0}: Error finding container 38bf7c15b6496d1102f72e71ee41f4aa73efc5183e356cc1007b84b5e5b89dde: Status 404 returned error can't find the container with id 38bf7c15b6496d1102f72e71ee41f4aa73efc5183e356cc1007b84b5e5b89dde Oct 02 11:12:09 crc kubenswrapper[4766]: W1002 11:12:09.439256 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b5a88cf_8095_4025_a68a_349c579dddd3.slice/crio-d7dca218c778173488c590afc7dcdc9ecc806cb40402063798eafd9ab7fd2afa WatchSource:0}: Error finding container d7dca218c778173488c590afc7dcdc9ecc806cb40402063798eafd9ab7fd2afa: Status 404 returned error can't find the container with id d7dca218c778173488c590afc7dcdc9ecc806cb40402063798eafd9ab7fd2afa Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.456452 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.456490 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.456531 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be5e935-0d64-4fed-a00a-bd0cb5891e75-config\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.456559 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.456577 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.456621 
4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64cj7\" (UniqueName: \"kubernetes.io/projected/2be5e935-0d64-4fed-a00a-bd0cb5891e75-kube-api-access-64cj7\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.456649 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2be5e935-0d64-4fed-a00a-bd0cb5891e75-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.456669 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2be5e935-0d64-4fed-a00a-bd0cb5891e75-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.456853 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.457159 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2be5e935-0d64-4fed-a00a-bd0cb5891e75-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.459667 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be5e935-0d64-4fed-a00a-bd0cb5891e75-config\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.462353 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2be5e935-0d64-4fed-a00a-bd0cb5891e75-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.463533 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.464433 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.465927 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" 
Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.467704 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.477604 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64cj7\" (UniqueName: \"kubernetes.io/projected/2be5e935-0d64-4fed-a00a-bd0cb5891e75-kube-api-access-64cj7\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.491395 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.540663 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.667920 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dp6x5"] Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.671627 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"874d062e-d2f8-462c-95b3-8f630b7120af","Type":"ContainerStarted","Data":"58b8de417037e2e2d79b7476ad8bb2f7a43aaaf1c7bf0d8a32b4387a620f6f1b"} Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.673703 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1282b506-728d-4c6f-aa9c-3d3c1f826b71","Type":"ContainerStarted","Data":"2a9873b5a34829c9c836ac4d9b6cee686f7ad2f7d0121bc47a4bf389659b0dac"} Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.687480 4766 generic.go:334] "Generic (PLEG): container finished" podID="9f3e57be-4f40-4dfe-a515-0ae16a727047" containerID="9754998c625d8f3ee0089828c33cade24fecb5ebbae19bcd383c04ea43bdb14e" exitCode=0 Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.687568 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-46kpp" event={"ID":"9f3e57be-4f40-4dfe-a515-0ae16a727047","Type":"ContainerDied","Data":"9754998c625d8f3ee0089828c33cade24fecb5ebbae19bcd383c04ea43bdb14e"} Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.697308 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1b5a88cf-8095-4025-a68a-349c579dddd3","Type":"ContainerStarted","Data":"d7dca218c778173488c590afc7dcdc9ecc806cb40402063798eafd9ab7fd2afa"} Oct 02 11:12:09 crc kubenswrapper[4766]: W1002 11:12:09.698712 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9860354f_7494_4b02_bca3_adc731683f7f.slice/crio-c4d3c0190f6dd53c11762d08376b0bf12ee82e4e1a6d3719a4a765474ff672fe WatchSource:0}: Error finding container c4d3c0190f6dd53c11762d08376b0bf12ee82e4e1a6d3719a4a765474ff672fe: Status 404 returned error can't find the container with id c4d3c0190f6dd53c11762d08376b0bf12ee82e4e1a6d3719a4a765474ff672fe Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.699683 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e63ea453-c8bd-4128-a47e-7b0d740a6066","Type":"ContainerStarted","Data":"4292b0c03ec8562f4ede1d43538e612eb70c0fb04b1536d93666a80e1d999558"} Oct 02 11:12:09 crc kubenswrapper[4766]: 
I1002 11:12:09.724989 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e914485f-05fc-4f85-b902-2e43bcfc0bb5","Type":"ContainerStarted","Data":"b5eccfb23c6ba881ba3874dc0e3a306b2ab69db403f398a47513cd2685f392a5"} Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.727957 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" event={"ID":"6ad9e6dc-bd0a-4356-94a1-d5970590092c","Type":"ContainerStarted","Data":"38bf7c15b6496d1102f72e71ee41f4aa73efc5183e356cc1007b84b5e5b89dde"} Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.730566 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4b9bc510-a878-4e06-8db9-fd6209039c75","Type":"ContainerStarted","Data":"6b77f3a526b5379881fc70c788250bd6ccaa071afb0137dd9dd28685b0ef78b0"} Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.860277 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8wzw9"] Oct 02 11:12:09 crc kubenswrapper[4766]: W1002 11:12:09.912948 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd90db976_cd03_4eb7_8e1d_361ef7c5045b.slice/crio-42a68a963fa83ad37a0b7b70ffbf3c85ac15b367a028dd797b60800bd4367db5 WatchSource:0}: Error finding container 42a68a963fa83ad37a0b7b70ffbf3c85ac15b367a028dd797b60800bd4367db5: Status 404 returned error can't find the container with id 42a68a963fa83ad37a0b7b70ffbf3c85ac15b367a028dd797b60800bd4367db5 Oct 02 11:12:09 crc kubenswrapper[4766]: I1002 11:12:09.956386 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.192943 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4s5bb" Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.277208 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9627d9ea-6f56-4671-93cd-e138686ee14c-config\") pod \"9627d9ea-6f56-4671-93cd-e138686ee14c\" (UID: \"9627d9ea-6f56-4671-93cd-e138686ee14c\") " Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.277392 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m665\" (UniqueName: \"kubernetes.io/projected/9627d9ea-6f56-4671-93cd-e138686ee14c-kube-api-access-5m665\") pod \"9627d9ea-6f56-4671-93cd-e138686ee14c\" (UID: \"9627d9ea-6f56-4671-93cd-e138686ee14c\") " Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.277574 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9627d9ea-6f56-4671-93cd-e138686ee14c-dns-svc\") pod \"9627d9ea-6f56-4671-93cd-e138686ee14c\" (UID: \"9627d9ea-6f56-4671-93cd-e138686ee14c\") " Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.277977 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9627d9ea-6f56-4671-93cd-e138686ee14c-config" (OuterVolumeSpecName: "config") pod "9627d9ea-6f56-4671-93cd-e138686ee14c" (UID: "9627d9ea-6f56-4671-93cd-e138686ee14c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.278353 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9627d9ea-6f56-4671-93cd-e138686ee14c-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.278786 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9627d9ea-6f56-4671-93cd-e138686ee14c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9627d9ea-6f56-4671-93cd-e138686ee14c" (UID: "9627d9ea-6f56-4671-93cd-e138686ee14c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.284364 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-92vzl" Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.288991 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9627d9ea-6f56-4671-93cd-e138686ee14c-kube-api-access-5m665" (OuterVolumeSpecName: "kube-api-access-5m665") pod "9627d9ea-6f56-4671-93cd-e138686ee14c" (UID: "9627d9ea-6f56-4671-93cd-e138686ee14c"). InnerVolumeSpecName "kube-api-access-5m665". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.379763 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mphb7\" (UniqueName: \"kubernetes.io/projected/c9d1918a-81a8-4524-ad8d-94d76610b714-kube-api-access-mphb7\") pod \"c9d1918a-81a8-4524-ad8d-94d76610b714\" (UID: \"c9d1918a-81a8-4524-ad8d-94d76610b714\") " Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.380040 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d1918a-81a8-4524-ad8d-94d76610b714-config\") pod \"c9d1918a-81a8-4524-ad8d-94d76610b714\" (UID: \"c9d1918a-81a8-4524-ad8d-94d76610b714\") " Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.380389 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m665\" (UniqueName: \"kubernetes.io/projected/9627d9ea-6f56-4671-93cd-e138686ee14c-kube-api-access-5m665\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.380404 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9627d9ea-6f56-4671-93cd-e138686ee14c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.380747 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9d1918a-81a8-4524-ad8d-94d76610b714-config" (OuterVolumeSpecName: "config") pod "c9d1918a-81a8-4524-ad8d-94d76610b714" (UID: "c9d1918a-81a8-4524-ad8d-94d76610b714"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.385610 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9d1918a-81a8-4524-ad8d-94d76610b714-kube-api-access-mphb7" (OuterVolumeSpecName: "kube-api-access-mphb7") pod "c9d1918a-81a8-4524-ad8d-94d76610b714" (UID: "c9d1918a-81a8-4524-ad8d-94d76610b714"). InnerVolumeSpecName "kube-api-access-mphb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.476486 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.483708 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mphb7\" (UniqueName: \"kubernetes.io/projected/c9d1918a-81a8-4524-ad8d-94d76610b714-kube-api-access-mphb7\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.483746 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9d1918a-81a8-4524-ad8d-94d76610b714-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.759100 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6b90dab1-a183-4adc-b415-b67bd0d782f7","Type":"ContainerStarted","Data":"9616b5ba569769404f649650902f1ec701ec8ad0ab367da6642ad863f86a8592"} Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.763909 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-92vzl" Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.763904 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-92vzl" event={"ID":"c9d1918a-81a8-4524-ad8d-94d76610b714","Type":"ContainerDied","Data":"c2ea62bf9e71525a3e0065aae452e93f2df73d28e038d3d5bb7f2d7b6d0c7998"} Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.767862 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-46kpp" event={"ID":"9f3e57be-4f40-4dfe-a515-0ae16a727047","Type":"ContainerStarted","Data":"12fa18be97c810b81d24f1b340fdee0c3cb8e2415218a5c125f56166adf693da"} Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.767987 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-46kpp" Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.769325 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-4s5bb" event={"ID":"9627d9ea-6f56-4671-93cd-e138686ee14c","Type":"ContainerDied","Data":"e6767b3a429a80e29c0f79ef927c4b1f1a4afd8ad8c1700c9eb1edad9443c1f7"} Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.769368 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-4s5bb"
Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.770180 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dp6x5" event={"ID":"9860354f-7494-4b02-bca3-adc731683f7f","Type":"ContainerStarted","Data":"c4d3c0190f6dd53c11762d08376b0bf12ee82e4e1a6d3719a4a765474ff672fe"}
Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.771587 4766 generic.go:334] "Generic (PLEG): container finished" podID="6ad9e6dc-bd0a-4356-94a1-d5970590092c" containerID="c4cf1e0aca6102d4f5cc0f7f8ce024755aea678e08199bd3aa9a0120d784d2f1" exitCode=0
Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.771654 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" event={"ID":"6ad9e6dc-bd0a-4356-94a1-d5970590092c","Type":"ContainerDied","Data":"c4cf1e0aca6102d4f5cc0f7f8ce024755aea678e08199bd3aa9a0120d784d2f1"}
Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.772855 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8wzw9" event={"ID":"d90db976-cd03-4eb7-8e1d-361ef7c5045b","Type":"ContainerStarted","Data":"42a68a963fa83ad37a0b7b70ffbf3c85ac15b367a028dd797b60800bd4367db5"}
Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.790859 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-46kpp" podStartSLOduration=6.293182885 podStartE2EDuration="15.790838126s" podCreationTimestamp="2025-10-02 11:11:55 +0000 UTC" firstStartedPulling="2025-10-02 11:11:59.263780416 +0000 UTC m=+1234.206651360" lastFinishedPulling="2025-10-02 11:12:08.761435657 +0000 UTC m=+1243.704306601" observedRunningTime="2025-10-02 11:12:10.78464154 +0000 UTC m=+1245.727512494" watchObservedRunningTime="2025-10-02 11:12:10.790838126 +0000 UTC m=+1245.733709070"
Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.851017 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4s5bb"]
Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.863143 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-4s5bb"]
Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.897559 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-92vzl"]
Oct 02 11:12:10 crc kubenswrapper[4766]: I1002 11:12:10.911613 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-92vzl"]
Oct 02 11:12:11 crc kubenswrapper[4766]: I1002 11:12:11.794108 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2be5e935-0d64-4fed-a00a-bd0cb5891e75","Type":"ContainerStarted","Data":"74e9d0a9036f8f32f58b39c5d0babef8cb96a43502965fde9b8fa4d0219cd980"}
Oct 02 11:12:11 crc kubenswrapper[4766]: I1002 11:12:11.893334 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9627d9ea-6f56-4671-93cd-e138686ee14c" path="/var/lib/kubelet/pods/9627d9ea-6f56-4671-93cd-e138686ee14c/volumes"
Oct 02 11:12:11 crc kubenswrapper[4766]: I1002 11:12:11.893766 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9d1918a-81a8-4524-ad8d-94d76610b714" path="/var/lib/kubelet/pods/c9d1918a-81a8-4524-ad8d-94d76610b714/volumes"
Oct 02 11:12:16 crc kubenswrapper[4766]: I1002 11:12:16.074667 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666b6646f7-46kpp"
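[Annotation] The pod_startup_latency_tracker record above for dnsmasq-dns-666b6646f7-46kpp is internally consistent: podStartE2EDuration (15.790838126s) is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration (6.293182885) is that E2E time minus the image-pull window (lastFinishedPulling - firstStartedPulling = 9.497655241s), so the SLO figure excludes time spent pulling images. A small check against the logged values; the " m=+..." suffix is Go's monotonic-clock annotation and has to be stripped before parsing:

```go
// Recompute the two durations from the timestamps in the record above.
package main

import (
	"fmt"
	"strings"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func parse(ts string) time.Time {
	if i := strings.Index(ts, " m=+"); i >= 0 {
		ts = ts[:i] // drop the monotonic-clock annotation
	}
	t, err := time.Parse(layout, ts)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := parse("2025-10-02 11:11:55 +0000 UTC")
	firstPull := parse("2025-10-02 11:11:59.263780416 +0000 UTC m=+1234.206651360")
	lastPull := parse("2025-10-02 11:12:08.761435657 +0000 UTC m=+1243.704306601")
	running := parse("2025-10-02 11:12:10.790838126 +0000 UTC m=+1245.733709070")

	e2e := running.Sub(created)
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println(e2e, slo) // 15.790838126s 6.293182885s
}
```

Records whose pull timestamps are the zero value "0001-01-01 00:00:00 +0000 UTC" (image already cached) accordingly report podStartSLOduration equal to the E2E duration, as in the dnsmasq-dns-57d769cc4f-vf6vh record further down.

Oct 02 11:12:17 crc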
kubenswrapper[4766]: I1002 11:12:17.833472 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e63ea453-c8bd-4128-a47e-7b0d740a6066","Type":"ContainerStarted","Data":"b4a6774b602583404b313774393432a551d673663fd177292369cd290ea65f62"} Oct 02 11:12:17 crc kubenswrapper[4766]: I1002 11:12:17.836609 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1b5a88cf-8095-4025-a68a-349c579dddd3","Type":"ContainerStarted","Data":"909147f23d289681b51aaa92222ea0360b2be9611eda4060826e6759c819c03b"} Oct 02 11:12:17 crc kubenswrapper[4766]: I1002 11:12:17.836673 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 02 11:12:17 crc kubenswrapper[4766]: I1002 11:12:17.839788 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" event={"ID":"6ad9e6dc-bd0a-4356-94a1-d5970590092c","Type":"ContainerStarted","Data":"62812b90687640f4ddd00755721e83b67dc372a14a35af3f2ca9ae4a7472107f"} Oct 02 11:12:17 crc kubenswrapper[4766]: I1002 11:12:17.840292 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" Oct 02 11:12:17 crc kubenswrapper[4766]: I1002 11:12:17.841721 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8wzw9" event={"ID":"d90db976-cd03-4eb7-8e1d-361ef7c5045b","Type":"ContainerStarted","Data":"da98ca489f682c05c8621e04ef1b42fa609b160482856ffde00dc0bb522f3ea4"} Oct 02 11:12:17 crc kubenswrapper[4766]: I1002 11:12:17.843480 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4b9bc510-a878-4e06-8db9-fd6209039c75","Type":"ContainerStarted","Data":"3b287664b7e8befe1a7a58d7e649bb1c340794784b406077aa1021e801ae5c34"} Oct 02 11:12:17 crc kubenswrapper[4766]: I1002 11:12:17.894889 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" podStartSLOduration=21.894872896 podStartE2EDuration="21.894872896s" podCreationTimestamp="2025-10-02 11:11:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:17.888271176 +0000 UTC m=+1252.831142130" watchObservedRunningTime="2025-10-02 11:12:17.894872896 +0000 UTC m=+1252.837743840" Oct 02 11:12:17 crc kubenswrapper[4766]: I1002 11:12:17.910867 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=10.502581935 podStartE2EDuration="17.910849135s" podCreationTimestamp="2025-10-02 11:12:00 +0000 UTC" firstStartedPulling="2025-10-02 11:12:09.455021013 +0000 UTC m=+1244.397891957" lastFinishedPulling="2025-10-02 11:12:16.863288213 +0000 UTC m=+1251.806159157" observedRunningTime="2025-10-02 11:12:17.905485744 +0000 UTC m=+1252.848356688" watchObservedRunningTime="2025-10-02 11:12:17.910849135 +0000 UTC m=+1252.853720079" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.348387 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-tmhfd"] Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.349848 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.352629 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.366697 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-tmhfd"] Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.507191 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-265lf\" (UniqueName: \"kubernetes.io/projected/95842554-1651-4c34-b934-d4eb21c6c52d-kube-api-access-265lf\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.507274 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/95842554-1651-4c34-b934-d4eb21c6c52d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.507307 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95842554-1651-4c34-b934-d4eb21c6c52d-config\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.507366 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/95842554-1651-4c34-b934-d4eb21c6c52d-ovs-rundir\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.507391 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95842554-1651-4c34-b934-d4eb21c6c52d-combined-ca-bundle\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.507437 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/95842554-1651-4c34-b934-d4eb21c6c52d-ovn-rundir\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.509983 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vf6vh"] Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.525477 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-rm8jj"] Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.532027 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.541275 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-rm8jj"] Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.545104 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.608776 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/95842554-1651-4c34-b934-d4eb21c6c52d-ovn-rundir\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.608865 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-265lf\" (UniqueName: \"kubernetes.io/projected/95842554-1651-4c34-b934-d4eb21c6c52d-kube-api-access-265lf\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.608911 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/95842554-1651-4c34-b934-d4eb21c6c52d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.608941 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95842554-1651-4c34-b934-d4eb21c6c52d-config\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.608988 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/95842554-1651-4c34-b934-d4eb21c6c52d-ovs-rundir\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.609016 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95842554-1651-4c34-b934-d4eb21c6c52d-combined-ca-bundle\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.609136 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/95842554-1651-4c34-b934-d4eb21c6c52d-ovn-rundir\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.609860 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/95842554-1651-4c34-b934-d4eb21c6c52d-ovs-rundir\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.610443 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95842554-1651-4c34-b934-d4eb21c6c52d-config\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.617103 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95842554-1651-4c34-b934-d4eb21c6c52d-combined-ca-bundle\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.618034 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/95842554-1651-4c34-b934-d4eb21c6c52d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.635895 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-265lf\" (UniqueName: \"kubernetes.io/projected/95842554-1651-4c34-b934-d4eb21c6c52d-kube-api-access-265lf\") pod \"ovn-controller-metrics-tmhfd\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.672588 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.716493 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-rm8jj\" (UID: \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\") " pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.716576 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-config\") pod \"dnsmasq-dns-7fd796d7df-rm8jj\" (UID: \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\") " pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.716632 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8j9h\" (UniqueName: \"kubernetes.io/projected/b8c21eb5-acab-494c-b440-611ba0e3a0f0-kube-api-access-t8j9h\") pod \"dnsmasq-dns-7fd796d7df-rm8jj\" (UID: \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\") " pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.716825 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-rm8jj\" (UID: \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\") " pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.717906 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-rm8jj"] Oct 02 11:12:18 crc kubenswrapper[4766]: E1002 11:12:18.718486 4766 pod_workers.go:1301] "Error syncing pod, 
skipping" err="unmounted volumes=[config dns-svc kube-api-access-t8j9h ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" podUID="b8c21eb5-acab-494c-b440-611ba0e3a0f0" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.741270 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-khww2"] Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.744079 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.747317 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.761103 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-khww2"] Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.818001 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-rm8jj\" (UID: \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\") " pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.818400 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-rm8jj\" (UID: \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\") " pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.818427 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-config\") pod \"dnsmasq-dns-7fd796d7df-rm8jj\" (UID: \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\") " pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.818476 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8j9h\" (UniqueName: \"kubernetes.io/projected/b8c21eb5-acab-494c-b440-611ba0e3a0f0-kube-api-access-t8j9h\") pod \"dnsmasq-dns-7fd796d7df-rm8jj\" (UID: \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\") " pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.819492 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-rm8jj\" (UID: \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\") " pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.820020 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-rm8jj\" (UID: \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\") " pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.821784 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-config\") pod \"dnsmasq-dns-7fd796d7df-rm8jj\" (UID: \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\") " 
pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.843051 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8j9h\" (UniqueName: \"kubernetes.io/projected/b8c21eb5-acab-494c-b440-611ba0e3a0f0-kube-api-access-t8j9h\") pod \"dnsmasq-dns-7fd796d7df-rm8jj\" (UID: \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\") " pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.878235 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2be5e935-0d64-4fed-a00a-bd0cb5891e75","Type":"ContainerStarted","Data":"2b993a56ebc1b37c4f1f1ec65f43687a9297e994b331c1c27c7eef4dd4bf4c33"} Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.881331 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"874d062e-d2f8-462c-95b3-8f630b7120af","Type":"ContainerStarted","Data":"8c379e630eaafe0740e76b9174157026bf0829370b8090bf639367d0c55aed1a"} Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.894654 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6b90dab1-a183-4adc-b415-b67bd0d782f7","Type":"ContainerStarted","Data":"a138202f6e7dfaa68f96f9c8f7a8329c46aee3fb87a8289a0c8e738c5d41167a"} Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.897855 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1282b506-728d-4c6f-aa9c-3d3c1f826b71","Type":"ContainerStarted","Data":"1f522cd7f555c48aa4a94cc691909b1a9f65c1ca666bbe2a28d742da033b7470"} Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.901800 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e914485f-05fc-4f85-b902-2e43bcfc0bb5","Type":"ContainerStarted","Data":"91eabd7220e84be6dc59a656dd4a2e76c92950e906521ca1ddc8c6e2db98afa7"} Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.901925 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.914317 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.918442 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dp6x5" event={"ID":"9860354f-7494-4b02-bca3-adc731683f7f","Type":"ContainerStarted","Data":"ff40e39793793465f75fb22c204001262aaca34bc6d55d604cabba72ae0c9eb7"} Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.918481 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-dp6x5" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.920409 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-khww2\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") " pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.920448 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-config\") pod \"dnsmasq-dns-86db49b7ff-khww2\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") " pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.920471 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-khww2\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") " pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.920542 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-khww2\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") " pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.920563 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9z4d\" (UniqueName: \"kubernetes.io/projected/8a667f22-88d5-4841-9f0d-1c272291f561-kube-api-access-p9z4d\") pod \"dnsmasq-dns-86db49b7ff-khww2\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") " pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.977472 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.984854 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.027977452 podStartE2EDuration="16.984831149s" podCreationTimestamp="2025-10-02 11:12:02 +0000 UTC" firstStartedPulling="2025-10-02 11:12:09.474338648 +0000 UTC m=+1244.417209592" lastFinishedPulling="2025-10-02 11:12:17.431192345 +0000 UTC m=+1252.374063289" observedRunningTime="2025-10-02 11:12:18.97513844 +0000 UTC m=+1253.918009384" watchObservedRunningTime="2025-10-02 11:12:18.984831149 +0000 UTC m=+1253.927702093" Oct 02 11:12:18 crc kubenswrapper[4766]: I1002 11:12:18.995467 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dp6x5" podStartSLOduration=6.759767744 podStartE2EDuration="13.995445817s" podCreationTimestamp="2025-10-02 11:12:05 +0000 UTC" firstStartedPulling="2025-10-02 11:12:09.720120647 +0000 UTC m=+1244.662991591" lastFinishedPulling="2025-10-02 11:12:16.95579872 +0000 UTC m=+1251.898669664" observedRunningTime="2025-10-02 11:12:18.989681643 +0000 UTC m=+1253.932552607" watchObservedRunningTime="2025-10-02 11:12:18.995445817 +0000 UTC m=+1253.938316761" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.027263 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-khww2\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") " pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.027345 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-config\") pod \"dnsmasq-dns-86db49b7ff-khww2\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") " pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.027389 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-khww2\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") " pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.027874 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-khww2\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") " pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.027925 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9z4d\" (UniqueName: \"kubernetes.io/projected/8a667f22-88d5-4841-9f0d-1c272291f561-kube-api-access-p9z4d\") pod \"dnsmasq-dns-86db49b7ff-khww2\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") " pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.032316 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-khww2\" (UID: 
\"8a667f22-88d5-4841-9f0d-1c272291f561\") " pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.032911 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-khww2\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") " pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.033805 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-khww2\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") " pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.035418 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-config\") pod \"dnsmasq-dns-86db49b7ff-khww2\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") " pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.056827 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9z4d\" (UniqueName: \"kubernetes.io/projected/8a667f22-88d5-4841-9f0d-1c272291f561-kube-api-access-p9z4d\") pod \"dnsmasq-dns-86db49b7ff-khww2\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") " pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.114842 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.128778 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-ovsdbserver-nb\") pod \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\" (UID: \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\") " Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.128956 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-config\") pod \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\" (UID: \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\") " Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.129002 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-dns-svc\") pod \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\" (UID: \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\") " Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.129044 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8j9h\" (UniqueName: \"kubernetes.io/projected/b8c21eb5-acab-494c-b440-611ba0e3a0f0-kube-api-access-t8j9h\") pod \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\" (UID: \"b8c21eb5-acab-494c-b440-611ba0e3a0f0\") " Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.129288 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b8c21eb5-acab-494c-b440-611ba0e3a0f0" (UID: "b8c21eb5-acab-494c-b440-611ba0e3a0f0"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.129535 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-config" (OuterVolumeSpecName: "config") pod "b8c21eb5-acab-494c-b440-611ba0e3a0f0" (UID: "b8c21eb5-acab-494c-b440-611ba0e3a0f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.129641 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.130203 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b8c21eb5-acab-494c-b440-611ba0e3a0f0" (UID: "b8c21eb5-acab-494c-b440-611ba0e3a0f0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.132449 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8c21eb5-acab-494c-b440-611ba0e3a0f0-kube-api-access-t8j9h" (OuterVolumeSpecName: "kube-api-access-t8j9h") pod "b8c21eb5-acab-494c-b440-611ba0e3a0f0" (UID: "b8c21eb5-acab-494c-b440-611ba0e3a0f0"). InnerVolumeSpecName "kube-api-access-t8j9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:12:19 crc kubenswrapper[4766]: W1002 11:12:19.227991 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95842554_1651_4c34_b934_d4eb21c6c52d.slice/crio-a99bc15d6e4c358750c4c8a43063b576afd891ed37e5422e08006106c20d54e3 WatchSource:0}: Error finding container a99bc15d6e4c358750c4c8a43063b576afd891ed37e5422e08006106c20d54e3: Status 404 returned error can't find the container with id a99bc15d6e4c358750c4c8a43063b576afd891ed37e5422e08006106c20d54e3 Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.231024 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.231061 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8c21eb5-acab-494c-b440-611ba0e3a0f0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.231074 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8j9h\" (UniqueName: \"kubernetes.io/projected/b8c21eb5-acab-494c-b440-611ba0e3a0f0-kube-api-access-t8j9h\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.231633 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-tmhfd"] Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.640766 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-khww2"] Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.923297 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-tmhfd" 
event={"ID":"95842554-1651-4c34-b934-d4eb21c6c52d","Type":"ContainerStarted","Data":"a99bc15d6e4c358750c4c8a43063b576afd891ed37e5422e08006106c20d54e3"} Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.924947 4766 generic.go:334] "Generic (PLEG): container finished" podID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerID="da98ca489f682c05c8621e04ef1b42fa609b160482856ffde00dc0bb522f3ea4" exitCode=0 Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.925066 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8wzw9" event={"ID":"d90db976-cd03-4eb7-8e1d-361ef7c5045b","Type":"ContainerDied","Data":"da98ca489f682c05c8621e04ef1b42fa609b160482856ffde00dc0bb522f3ea4"} Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.926564 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-khww2" event={"ID":"8a667f22-88d5-4841-9f0d-1c272291f561","Type":"ContainerStarted","Data":"6e23deb0d5f010edb6de5b74b74e2cbaf1e8202ab813ca3897bb43ff02855e9d"} Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.926750 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-rm8jj" Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.928105 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" podUID="6ad9e6dc-bd0a-4356-94a1-d5970590092c" containerName="dnsmasq-dns" containerID="cri-o://62812b90687640f4ddd00755721e83b67dc372a14a35af3f2ca9ae4a7472107f" gracePeriod=10 Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.985211 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-rm8jj"] Oct 02 11:12:19 crc kubenswrapper[4766]: I1002 11:12:19.994526 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-rm8jj"] Oct 02 11:12:20 crc kubenswrapper[4766]: I1002 11:12:20.936260 4766 generic.go:334] "Generic (PLEG): container finished" podID="6ad9e6dc-bd0a-4356-94a1-d5970590092c" containerID="62812b90687640f4ddd00755721e83b67dc372a14a35af3f2ca9ae4a7472107f" exitCode=0 Oct 02 11:12:20 crc kubenswrapper[4766]: I1002 11:12:20.936299 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" event={"ID":"6ad9e6dc-bd0a-4356-94a1-d5970590092c","Type":"ContainerDied","Data":"62812b90687640f4ddd00755721e83b67dc372a14a35af3f2ca9ae4a7472107f"} Oct 02 11:12:20 crc kubenswrapper[4766]: I1002 11:12:20.939243 4766 generic.go:334] "Generic (PLEG): container finished" podID="8a667f22-88d5-4841-9f0d-1c272291f561" containerID="bf6025053b21cd7f8ba7a1e2074432dfda875d41206fe08b26d3183f068fdeaf" exitCode=0 Oct 02 11:12:20 crc kubenswrapper[4766]: I1002 11:12:20.939280 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-khww2" event={"ID":"8a667f22-88d5-4841-9f0d-1c272291f561","Type":"ContainerDied","Data":"bf6025053b21cd7f8ba7a1e2074432dfda875d41206fe08b26d3183f068fdeaf"} Oct 02 11:12:21 crc kubenswrapper[4766]: I1002 11:12:21.892964 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8c21eb5-acab-494c-b440-611ba0e3a0f0" path="/var/lib/kubelet/pods/b8c21eb5-acab-494c-b440-611ba0e3a0f0/volumes" Oct 02 11:12:22 crc kubenswrapper[4766]: I1002 11:12:22.916161 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 02 11:12:22 crc kubenswrapper[4766]: I1002 11:12:22.962657 4766 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8wzw9" event={"ID":"d90db976-cd03-4eb7-8e1d-361ef7c5045b","Type":"ContainerStarted","Data":"3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa"} Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.266818 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.421760 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ad9e6dc-bd0a-4356-94a1-d5970590092c-dns-svc\") pod \"6ad9e6dc-bd0a-4356-94a1-d5970590092c\" (UID: \"6ad9e6dc-bd0a-4356-94a1-d5970590092c\") " Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.421913 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad9e6dc-bd0a-4356-94a1-d5970590092c-config\") pod \"6ad9e6dc-bd0a-4356-94a1-d5970590092c\" (UID: \"6ad9e6dc-bd0a-4356-94a1-d5970590092c\") " Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.422028 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcj9f\" (UniqueName: \"kubernetes.io/projected/6ad9e6dc-bd0a-4356-94a1-d5970590092c-kube-api-access-kcj9f\") pod \"6ad9e6dc-bd0a-4356-94a1-d5970590092c\" (UID: \"6ad9e6dc-bd0a-4356-94a1-d5970590092c\") " Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.429397 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ad9e6dc-bd0a-4356-94a1-d5970590092c-kube-api-access-kcj9f" (OuterVolumeSpecName: "kube-api-access-kcj9f") pod "6ad9e6dc-bd0a-4356-94a1-d5970590092c" (UID: "6ad9e6dc-bd0a-4356-94a1-d5970590092c"). InnerVolumeSpecName "kube-api-access-kcj9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.462832 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ad9e6dc-bd0a-4356-94a1-d5970590092c-config" (OuterVolumeSpecName: "config") pod "6ad9e6dc-bd0a-4356-94a1-d5970590092c" (UID: "6ad9e6dc-bd0a-4356-94a1-d5970590092c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.466827 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ad9e6dc-bd0a-4356-94a1-d5970590092c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ad9e6dc-bd0a-4356-94a1-d5970590092c" (UID: "6ad9e6dc-bd0a-4356-94a1-d5970590092c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.523335 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcj9f\" (UniqueName: \"kubernetes.io/projected/6ad9e6dc-bd0a-4356-94a1-d5970590092c-kube-api-access-kcj9f\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.523362 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ad9e6dc-bd0a-4356-94a1-d5970590092c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.523372 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ad9e6dc-bd0a-4356-94a1-d5970590092c-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.970467 4766 generic.go:334] "Generic (PLEG): container finished" podID="4b9bc510-a878-4e06-8db9-fd6209039c75" containerID="3b287664b7e8befe1a7a58d7e649bb1c340794784b406077aa1021e801ae5c34" exitCode=0 Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.970562 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4b9bc510-a878-4e06-8db9-fd6209039c75","Type":"ContainerDied","Data":"3b287664b7e8befe1a7a58d7e649bb1c340794784b406077aa1021e801ae5c34"} Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.972935 4766 generic.go:334] "Generic (PLEG): container finished" podID="e63ea453-c8bd-4128-a47e-7b0d740a6066" containerID="b4a6774b602583404b313774393432a551d673663fd177292369cd290ea65f62" exitCode=0 Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.973036 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e63ea453-c8bd-4128-a47e-7b0d740a6066","Type":"ContainerDied","Data":"b4a6774b602583404b313774393432a551d673663fd177292369cd290ea65f62"} Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.977887 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" event={"ID":"6ad9e6dc-bd0a-4356-94a1-d5970590092c","Type":"ContainerDied","Data":"38bf7c15b6496d1102f72e71ee41f4aa73efc5183e356cc1007b84b5e5b89dde"} Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.977944 4766 scope.go:117] "RemoveContainer" containerID="62812b90687640f4ddd00755721e83b67dc372a14a35af3f2ca9ae4a7472107f" Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.978070 4766 util.go:48] "No ready sandbox for pod can be found. 
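[Annotation] Teardown of the deleted dnsmasq-dns-57d769cc4f-vf6vh pod drains in the expected order: UnmountVolume.TearDown per volume, a "Volume detached ... DevicePath \"\"" record once the reconciler forgets the volume, RemoveContainer as the dead containers are garbage-collected, and eventually "Cleaned up orphaned pod volumes dir". Each TearDown record carries two volume names; a hypothetical illustration of the distinction (the struct and program are ours, not kubelet's):

```go
// OuterVolumeSpecName is the name the pod spec uses (.spec.volumes[].name);
// InnerVolumeSpecName is the plugin-level identity of the volume being torn
// down. For configmap and projected volumes the two coincide, as in every
// record in this log; they typically diverge for persistent volumes, where
// the inner name is the PV's name rather than the pod-side name.
package main

import "fmt"

type volumeSpecNames struct {
	Outer string // e.g. "dns-svc", from the pod spec
	Inner string // e.g. "dns-svc", what the plugin operates on
}

func main() {
	v := volumeSpecNames{Outer: "dns-svc", Inner: "dns-svc"}
	fmt.Printf("TearDown %q -> plugin volume %q\n", v.Outer, v.Inner)
}
```

Oct 02 11:12:23 crc kubenswrapper[4766]: I1002 11:12:23.978070 4766 util.go:48] "No ready sandbox for pod can be found.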
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-vf6vh" Oct 02 11:12:24 crc kubenswrapper[4766]: I1002 11:12:24.004335 4766 scope.go:117] "RemoveContainer" containerID="c4cf1e0aca6102d4f5cc0f7f8ce024755aea678e08199bd3aa9a0120d784d2f1" Oct 02 11:12:24 crc kubenswrapper[4766]: I1002 11:12:24.027779 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vf6vh"] Oct 02 11:12:24 crc kubenswrapper[4766]: I1002 11:12:24.032378 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-vf6vh"] Oct 02 11:12:24 crc kubenswrapper[4766]: I1002 11:12:24.432220 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:12:24 crc kubenswrapper[4766]: I1002 11:12:24.432574 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:12:24 crc kubenswrapper[4766]: I1002 11:12:24.990300 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e63ea453-c8bd-4128-a47e-7b0d740a6066","Type":"ContainerStarted","Data":"059c180c1ce2eff2818e38ff90b0da551c9f6423a5517609968f6e8b7d8825e4"} Oct 02 11:12:24 crc kubenswrapper[4766]: I1002 11:12:24.993641 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8wzw9" event={"ID":"d90db976-cd03-4eb7-8e1d-361ef7c5045b","Type":"ContainerStarted","Data":"e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a"} Oct 02 11:12:24 crc kubenswrapper[4766]: I1002 11:12:24.994488 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:24 crc kubenswrapper[4766]: I1002 11:12:24.994543 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:12:24 crc kubenswrapper[4766]: I1002 11:12:24.997268 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-khww2" event={"ID":"8a667f22-88d5-4841-9f0d-1c272291f561","Type":"ContainerStarted","Data":"c89f5c84d0409c16045c6c291ae1e61451bea1df55e8d8660824f0b613db853e"} Oct 02 11:12:24 crc kubenswrapper[4766]: I1002 11:12:24.997701 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:25 crc kubenswrapper[4766]: I1002 11:12:25.001477 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4b9bc510-a878-4e06-8db9-fd6209039c75","Type":"ContainerStarted","Data":"b13934ad964a9b61c115770887bcefe16bbb075f526931fb0fa8c919bf1f20b6"} Oct 02 11:12:25 crc kubenswrapper[4766]: I1002 11:12:25.016255 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.97397084 podStartE2EDuration="28.016237958s" podCreationTimestamp="2025-10-02 11:11:57 +0000 UTC" firstStartedPulling="2025-10-02 11:12:09.276907768 +0000 UTC m=+1244.219778712" lastFinishedPulling="2025-10-02 11:12:17.319174886 +0000 UTC m=+1252.262045830" 
observedRunningTime="2025-10-02 11:12:25.016149285 +0000 UTC m=+1259.959020259" watchObservedRunningTime="2025-10-02 11:12:25.016237958 +0000 UTC m=+1259.959108892" Oct 02 11:12:25 crc kubenswrapper[4766]: I1002 11:12:25.062182 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-8wzw9" podStartSLOduration=13.111711904 podStartE2EDuration="20.062163471s" podCreationTimestamp="2025-10-02 11:12:05 +0000 UTC" firstStartedPulling="2025-10-02 11:12:09.916918987 +0000 UTC m=+1244.859789931" lastFinishedPulling="2025-10-02 11:12:16.867370554 +0000 UTC m=+1251.810241498" observedRunningTime="2025-10-02 11:12:25.043083293 +0000 UTC m=+1259.985954237" watchObservedRunningTime="2025-10-02 11:12:25.062163471 +0000 UTC m=+1260.005034415" Oct 02 11:12:25 crc kubenswrapper[4766]: I1002 11:12:25.077622 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-khww2" podStartSLOduration=7.077598442 podStartE2EDuration="7.077598442s" podCreationTimestamp="2025-10-02 11:12:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:25.070029421 +0000 UTC m=+1260.012900395" watchObservedRunningTime="2025-10-02 11:12:25.077598442 +0000 UTC m=+1260.020469386" Oct 02 11:12:25 crc kubenswrapper[4766]: I1002 11:12:25.105935 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=19.669058512 podStartE2EDuration="27.105912124s" podCreationTimestamp="2025-10-02 11:11:58 +0000 UTC" firstStartedPulling="2025-10-02 11:12:09.432470604 +0000 UTC m=+1244.375341548" lastFinishedPulling="2025-10-02 11:12:16.869324216 +0000 UTC m=+1251.812195160" observedRunningTime="2025-10-02 11:12:25.089742769 +0000 UTC m=+1260.032613723" watchObservedRunningTime="2025-10-02 11:12:25.105912124 +0000 UTC m=+1260.048783068" Oct 02 11:12:25 crc kubenswrapper[4766]: I1002 11:12:25.655764 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 02 11:12:25 crc kubenswrapper[4766]: I1002 11:12:25.895763 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ad9e6dc-bd0a-4356-94a1-d5970590092c" path="/var/lib/kubelet/pods/6ad9e6dc-bd0a-4356-94a1-d5970590092c/volumes" Oct 02 11:12:28 crc kubenswrapper[4766]: I1002 11:12:28.037395 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6b90dab1-a183-4adc-b415-b67bd0d782f7","Type":"ContainerStarted","Data":"f7d477d76ec53de2f17dbd54641073519838399c06bbcb97d465a22ac565351a"} Oct 02 11:12:29 crc kubenswrapper[4766]: I1002 11:12:29.068424 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.770106429 podStartE2EDuration="22.068405605s" podCreationTimestamp="2025-10-02 11:12:07 +0000 UTC" firstStartedPulling="2025-10-02 11:12:09.962940533 +0000 UTC m=+1244.905811477" lastFinishedPulling="2025-10-02 11:12:26.261239699 +0000 UTC m=+1261.204110653" observedRunningTime="2025-10-02 11:12:29.064053246 +0000 UTC m=+1264.006924190" watchObservedRunningTime="2025-10-02 11:12:29.068405605 +0000 UTC m=+1264.011276549" Oct 02 11:12:29 crc kubenswrapper[4766]: I1002 11:12:29.121018 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-khww2" Oct 02 11:12:29 crc kubenswrapper[4766]: I1002 
11:12:29.197093 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-46kpp"] Oct 02 11:12:29 crc kubenswrapper[4766]: I1002 11:12:29.197333 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-46kpp" podUID="9f3e57be-4f40-4dfe-a515-0ae16a727047" containerName="dnsmasq-dns" containerID="cri-o://12fa18be97c810b81d24f1b340fdee0c3cb8e2415218a5c125f56166adf693da" gracePeriod=10 Oct 02 11:12:29 crc kubenswrapper[4766]: I1002 11:12:29.236424 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 02 11:12:29 crc kubenswrapper[4766]: I1002 11:12:29.236933 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 02 11:12:29 crc kubenswrapper[4766]: I1002 11:12:29.336475 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:30 crc kubenswrapper[4766]: I1002 11:12:30.336767 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:30 crc kubenswrapper[4766]: I1002 11:12:30.369393 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:30 crc kubenswrapper[4766]: I1002 11:12:30.369795 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 02 11:12:30 crc kubenswrapper[4766]: I1002 11:12:30.378313 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:31 crc kubenswrapper[4766]: I1002 11:12:31.057947 4766 generic.go:334] "Generic (PLEG): container finished" podID="9f3e57be-4f40-4dfe-a515-0ae16a727047" containerID="12fa18be97c810b81d24f1b340fdee0c3cb8e2415218a5c125f56166adf693da" exitCode=0 Oct 02 11:12:31 crc kubenswrapper[4766]: I1002 11:12:31.058562 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-46kpp" event={"ID":"9f3e57be-4f40-4dfe-a515-0ae16a727047","Type":"ContainerDied","Data":"12fa18be97c810b81d24f1b340fdee0c3cb8e2415218a5c125f56166adf693da"} Oct 02 11:12:31 crc kubenswrapper[4766]: I1002 11:12:31.073682 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-46kpp" podUID="9f3e57be-4f40-4dfe-a515-0ae16a727047" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.96:5353: connect: connection refused" Oct 02 11:12:31 crc kubenswrapper[4766]: I1002 11:12:31.096545 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.607421 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-46kpp" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.687879 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f3e57be-4f40-4dfe-a515-0ae16a727047-config\") pod \"9f3e57be-4f40-4dfe-a515-0ae16a727047\" (UID: \"9f3e57be-4f40-4dfe-a515-0ae16a727047\") " Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.687967 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhqlq\" (UniqueName: \"kubernetes.io/projected/9f3e57be-4f40-4dfe-a515-0ae16a727047-kube-api-access-qhqlq\") pod \"9f3e57be-4f40-4dfe-a515-0ae16a727047\" (UID: \"9f3e57be-4f40-4dfe-a515-0ae16a727047\") " Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.688012 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f3e57be-4f40-4dfe-a515-0ae16a727047-dns-svc\") pod \"9f3e57be-4f40-4dfe-a515-0ae16a727047\" (UID: \"9f3e57be-4f40-4dfe-a515-0ae16a727047\") " Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.698842 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f3e57be-4f40-4dfe-a515-0ae16a727047-kube-api-access-qhqlq" (OuterVolumeSpecName: "kube-api-access-qhqlq") pod "9f3e57be-4f40-4dfe-a515-0ae16a727047" (UID: "9f3e57be-4f40-4dfe-a515-0ae16a727047"). InnerVolumeSpecName "kube-api-access-qhqlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.732270 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3e57be-4f40-4dfe-a515-0ae16a727047-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f3e57be-4f40-4dfe-a515-0ae16a727047" (UID: "9f3e57be-4f40-4dfe-a515-0ae16a727047"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.743932 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3e57be-4f40-4dfe-a515-0ae16a727047-config" (OuterVolumeSpecName: "config") pod "9f3e57be-4f40-4dfe-a515-0ae16a727047" (UID: "9f3e57be-4f40-4dfe-a515-0ae16a727047"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.790322 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f3e57be-4f40-4dfe-a515-0ae16a727047-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.790353 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f3e57be-4f40-4dfe-a515-0ae16a727047-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.790366 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhqlq\" (UniqueName: \"kubernetes.io/projected/9f3e57be-4f40-4dfe-a515-0ae16a727047-kube-api-access-qhqlq\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.876383 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-t7x8d"] Oct 02 11:12:32 crc kubenswrapper[4766]: E1002 11:12:32.877843 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ad9e6dc-bd0a-4356-94a1-d5970590092c" containerName="init" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.877864 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ad9e6dc-bd0a-4356-94a1-d5970590092c" containerName="init" Oct 02 11:12:32 crc kubenswrapper[4766]: E1002 11:12:32.877877 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f3e57be-4f40-4dfe-a515-0ae16a727047" containerName="dnsmasq-dns" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.877883 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3e57be-4f40-4dfe-a515-0ae16a727047" containerName="dnsmasq-dns" Oct 02 11:12:32 crc kubenswrapper[4766]: E1002 11:12:32.877896 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ad9e6dc-bd0a-4356-94a1-d5970590092c" containerName="dnsmasq-dns" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.877901 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ad9e6dc-bd0a-4356-94a1-d5970590092c" containerName="dnsmasq-dns" Oct 02 11:12:32 crc kubenswrapper[4766]: E1002 11:12:32.877908 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f3e57be-4f40-4dfe-a515-0ae16a727047" containerName="init" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.877914 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3e57be-4f40-4dfe-a515-0ae16a727047" containerName="init" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.878072 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ad9e6dc-bd0a-4356-94a1-d5970590092c" containerName="dnsmasq-dns" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.878092 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f3e57be-4f40-4dfe-a515-0ae16a727047" containerName="dnsmasq-dns" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.881114 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.896431 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-t7x8d"] Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.899435 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-config\") pod \"dnsmasq-dns-698758b865-t7x8d\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.899488 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-dns-svc\") pod \"dnsmasq-dns-698758b865-t7x8d\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.899533 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-t7x8d\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.899551 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-t7x8d\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:32 crc kubenswrapper[4766]: I1002 11:12:32.899571 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gflgq\" (UniqueName: \"kubernetes.io/projected/f2569c54-b0a8-456b-b311-264d6605d4ed-kube-api-access-gflgq\") pod \"dnsmasq-dns-698758b865-t7x8d\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.001270 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-config\") pod \"dnsmasq-dns-698758b865-t7x8d\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.001345 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-dns-svc\") pod \"dnsmasq-dns-698758b865-t7x8d\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.001394 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-t7x8d\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.001412 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-t7x8d\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.001432 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gflgq\" (UniqueName: \"kubernetes.io/projected/f2569c54-b0a8-456b-b311-264d6605d4ed-kube-api-access-gflgq\") pod \"dnsmasq-dns-698758b865-t7x8d\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.003832 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-config\") pod \"dnsmasq-dns-698758b865-t7x8d\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.004468 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-dns-svc\") pod \"dnsmasq-dns-698758b865-t7x8d\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.005064 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-t7x8d\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.005915 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-t7x8d\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.024266 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gflgq\" (UniqueName: \"kubernetes.io/projected/f2569c54-b0a8-456b-b311-264d6605d4ed-kube-api-access-gflgq\") pod \"dnsmasq-dns-698758b865-t7x8d\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.075294 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-46kpp" Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.078635 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-46kpp" event={"ID":"9f3e57be-4f40-4dfe-a515-0ae16a727047","Type":"ContainerDied","Data":"ef40464f99db160c9d070dfe32b5cc8ce5bb640baeb792baf1397e64daa0bc15"} Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.078680 4766 scope.go:117] "RemoveContainer" containerID="12fa18be97c810b81d24f1b340fdee0c3cb8e2415218a5c125f56166adf693da" Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.105468 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-46kpp"] Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.109739 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-46kpp"] Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.202030 4766 scope.go:117] "RemoveContainer" containerID="9754998c625d8f3ee0089828c33cade24fecb5ebbae19bcd383c04ea43bdb14e" Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.209698 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.651020 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-t7x8d"] Oct 02 11:12:33 crc kubenswrapper[4766]: W1002 11:12:33.656690 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2569c54_b0a8_456b_b311_264d6605d4ed.slice/crio-a8a2e591d3a8f764b2111381f79db9125f9da1be9e9446593f2c7c6eed6d8690 WatchSource:0}: Error finding container a8a2e591d3a8f764b2111381f79db9125f9da1be9e9446593f2c7c6eed6d8690: Status 404 returned error can't find the container with id a8a2e591d3a8f764b2111381f79db9125f9da1be9e9446593f2c7c6eed6d8690 Oct 02 11:12:33 crc kubenswrapper[4766]: I1002 11:12:33.891437 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f3e57be-4f40-4dfe-a515-0ae16a727047" path="/var/lib/kubelet/pods/9f3e57be-4f40-4dfe-a515-0ae16a727047/volumes" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.050595 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.057013 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.059324 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.059356 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.059675 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-jjdwm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.060015 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.072099 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.086595 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-t7x8d" event={"ID":"f2569c54-b0a8-456b-b311-264d6605d4ed","Type":"ContainerStarted","Data":"a8a2e591d3a8f764b2111381f79db9125f9da1be9e9446593f2c7c6eed6d8690"} Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.116604 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.116662 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.116844 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1ba556fb-6ff5-4418-a2b9-f26a51003d79-lock\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.116926 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd964\" (UniqueName: \"kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-kube-api-access-wd964\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.116947 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1ba556fb-6ff5-4418-a2b9-f26a51003d79-cache\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.218330 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.218383 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.218453 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1ba556fb-6ff5-4418-a2b9-f26a51003d79-lock\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.218489 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd964\" (UniqueName: \"kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-kube-api-access-wd964\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.218525 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1ba556fb-6ff5-4418-a2b9-f26a51003d79-cache\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: E1002 11:12:34.218591 4766 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 11:12:34 crc kubenswrapper[4766]: E1002 11:12:34.218612 4766 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 11:12:34 crc kubenswrapper[4766]: E1002 11:12:34.218674 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift podName:1ba556fb-6ff5-4418-a2b9-f26a51003d79 nodeName:}" failed. No retries permitted until 2025-10-02 11:12:34.718656024 +0000 UTC m=+1269.661526968 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift") pod "swift-storage-0" (UID: "1ba556fb-6ff5-4418-a2b9-f26a51003d79") : configmap "swift-ring-files" not found Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.218813 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.219194 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1ba556fb-6ff5-4418-a2b9-f26a51003d79-lock\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.219201 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1ba556fb-6ff5-4418-a2b9-f26a51003d79-cache\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.237923 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd964\" (UniqueName: \"kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-kube-api-access-wd964\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.239046 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.579584 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-nnwzp"] Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.582097 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: W1002 11:12:34.597571 4766 reflector.go:561] object-"openstack"/"swift-proxy-config-data": failed to list *v1.Secret: secrets "swift-proxy-config-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object Oct 02 11:12:34 crc kubenswrapper[4766]: E1002 11:12:34.597615 4766 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"swift-proxy-config-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"swift-proxy-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.598626 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.599395 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.604095 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nnwzp"] Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.627357 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-nnwzp"] Oct 02 11:12:34 crc kubenswrapper[4766]: E1002 11:12:34.628760 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-w96ch ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-w96ch ring-data-devices swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-nnwzp" podUID="fcaadfd4-16f1-469d-a21d-4fd33c91db36" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.676514 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-rhczm"] Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.677625 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.692227 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rhczm"] Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.733759 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-scripts\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.733816 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-dispersionconf\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.733835 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcaadfd4-16f1-469d-a21d-4fd33c91db36-scripts\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.733852 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-swiftconf\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.733868 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fcaadfd4-16f1-469d-a21d-4fd33c91db36-ring-data-devices\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.733890 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w96ch\" (UniqueName: \"kubernetes.io/projected/fcaadfd4-16f1-469d-a21d-4fd33c91db36-kube-api-access-w96ch\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.733909 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fcaadfd4-16f1-469d-a21d-4fd33c91db36-etc-swift\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.733935 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-combined-ca-bundle\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.733972 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.733997 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-ring-data-devices\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.734012 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-combined-ca-bundle\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.734031 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-swiftconf\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.734055 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nltk\" (UniqueName: \"kubernetes.io/projected/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-kube-api-access-6nltk\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.734080 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-etc-swift\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.734102 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-dispersionconf\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: E1002 11:12:34.734256 4766 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 11:12:34 crc kubenswrapper[4766]: E1002 11:12:34.734270 4766 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 11:12:34 crc kubenswrapper[4766]: E1002 11:12:34.734306 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift podName:1ba556fb-6ff5-4418-a2b9-f26a51003d79 nodeName:}" failed. No retries permitted until 2025-10-02 11:12:35.7342921 +0000 UTC m=+1270.677163044 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift") pod "swift-storage-0" (UID: "1ba556fb-6ff5-4418-a2b9-f26a51003d79") : configmap "swift-ring-files" not found Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.834859 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-dispersionconf\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.834915 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-scripts\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.834939 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-dispersionconf\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.834959 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcaadfd4-16f1-469d-a21d-4fd33c91db36-scripts\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.834976 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-swiftconf\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.834996 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fcaadfd4-16f1-469d-a21d-4fd33c91db36-ring-data-devices\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.835018 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w96ch\" (UniqueName: \"kubernetes.io/projected/fcaadfd4-16f1-469d-a21d-4fd33c91db36-kube-api-access-w96ch\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.835041 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fcaadfd4-16f1-469d-a21d-4fd33c91db36-etc-swift\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.835075 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-combined-ca-bundle\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.835189 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-ring-data-devices\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.835230 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-combined-ca-bundle\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.835260 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-swiftconf\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.835733 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fcaadfd4-16f1-469d-a21d-4fd33c91db36-etc-swift\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.835770 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-scripts\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.835862 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-ring-data-devices\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.835987 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fcaadfd4-16f1-469d-a21d-4fd33c91db36-ring-data-devices\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.835998 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcaadfd4-16f1-469d-a21d-4fd33c91db36-scripts\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.836119 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nltk\" (UniqueName: \"kubernetes.io/projected/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-kube-api-access-6nltk\") pod \"swift-ring-rebalance-rhczm\" (UID: 
\"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.836197 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-etc-swift\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.836576 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-etc-swift\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.843926 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-swiftconf\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.844314 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-combined-ca-bundle\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.844797 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-swiftconf\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.844992 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-combined-ca-bundle\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.851860 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nltk\" (UniqueName: \"kubernetes.io/projected/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-kube-api-access-6nltk\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:34 crc kubenswrapper[4766]: I1002 11:12:34.852340 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w96ch\" (UniqueName: \"kubernetes.io/projected/fcaadfd4-16f1-469d-a21d-4fd33c91db36-kube-api-access-w96ch\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.095142 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.095736 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2be5e935-0d64-4fed-a00a-bd0cb5891e75","Type":"ContainerStarted","Data":"763fc40370b8c637b94e717b3d4022b399a9064357843aa69085ce7652a30954"} Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.126382 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.798261646 podStartE2EDuration="27.12636408s" podCreationTimestamp="2025-10-02 11:12:08 +0000 UTC" firstStartedPulling="2025-10-02 11:12:10.879283374 +0000 UTC m=+1245.822154318" lastFinishedPulling="2025-10-02 11:12:33.207385808 +0000 UTC m=+1268.150256752" observedRunningTime="2025-10-02 11:12:35.124530761 +0000 UTC m=+1270.067401705" watchObservedRunningTime="2025-10-02 11:12:35.12636408 +0000 UTC m=+1270.069235024" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.146750 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.345724 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-swiftconf\") pod \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.345794 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fcaadfd4-16f1-469d-a21d-4fd33c91db36-ring-data-devices\") pod \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.345859 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcaadfd4-16f1-469d-a21d-4fd33c91db36-scripts\") pod \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.345894 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w96ch\" (UniqueName: \"kubernetes.io/projected/fcaadfd4-16f1-469d-a21d-4fd33c91db36-kube-api-access-w96ch\") pod \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.345927 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-combined-ca-bundle\") pod \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.345984 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fcaadfd4-16f1-469d-a21d-4fd33c91db36-etc-swift\") pod \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.346202 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcaadfd4-16f1-469d-a21d-4fd33c91db36-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod 
"fcaadfd4-16f1-469d-a21d-4fd33c91db36" (UID: "fcaadfd4-16f1-469d-a21d-4fd33c91db36"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.346295 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcaadfd4-16f1-469d-a21d-4fd33c91db36-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fcaadfd4-16f1-469d-a21d-4fd33c91db36" (UID: "fcaadfd4-16f1-469d-a21d-4fd33c91db36"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.346331 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fcaadfd4-16f1-469d-a21d-4fd33c91db36-scripts" (OuterVolumeSpecName: "scripts") pod "fcaadfd4-16f1-469d-a21d-4fd33c91db36" (UID: "fcaadfd4-16f1-469d-a21d-4fd33c91db36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.346760 4766 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fcaadfd4-16f1-469d-a21d-4fd33c91db36-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.346782 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcaadfd4-16f1-469d-a21d-4fd33c91db36-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.346792 4766 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fcaadfd4-16f1-469d-a21d-4fd33c91db36-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.350270 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "fcaadfd4-16f1-469d-a21d-4fd33c91db36" (UID: "fcaadfd4-16f1-469d-a21d-4fd33c91db36"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.353490 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcaadfd4-16f1-469d-a21d-4fd33c91db36-kube-api-access-w96ch" (OuterVolumeSpecName: "kube-api-access-w96ch") pod "fcaadfd4-16f1-469d-a21d-4fd33c91db36" (UID: "fcaadfd4-16f1-469d-a21d-4fd33c91db36"). InnerVolumeSpecName "kube-api-access-w96ch". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.353546 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcaadfd4-16f1-469d-a21d-4fd33c91db36" (UID: "fcaadfd4-16f1-469d-a21d-4fd33c91db36"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.448280 4766 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.448317 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w96ch\" (UniqueName: \"kubernetes.io/projected/fcaadfd4-16f1-469d-a21d-4fd33c91db36-kube-api-access-w96ch\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.448328 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.606694 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.619113 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-dispersionconf\") pod \"swift-ring-rebalance-nnwzp\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " pod="openstack/swift-ring-rebalance-nnwzp" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.620235 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-dispersionconf\") pod \"swift-ring-rebalance-rhczm\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.752756 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-dispersionconf\") pod \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\" (UID: \"fcaadfd4-16f1-469d-a21d-4fd33c91db36\") " Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.753984 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:12:35 crc kubenswrapper[4766]: E1002 11:12:35.754180 4766 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 02 11:12:35 crc kubenswrapper[4766]: E1002 11:12:35.754214 4766 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 02 11:12:35 crc kubenswrapper[4766]: E1002 11:12:35.754269 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift podName:1ba556fb-6ff5-4418-a2b9-f26a51003d79 nodeName:}" failed. No retries permitted until 2025-10-02 11:12:37.754250413 +0000 UTC m=+1272.697121367 (durationBeforeRetry 2s). 
Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.755569 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "fcaadfd4-16f1-469d-a21d-4fd33c91db36" (UID: "fcaadfd4-16f1-469d-a21d-4fd33c91db36"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.855253 4766 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fcaadfd4-16f1-469d-a21d-4fd33c91db36-dispersionconf\") on node \"crc\" DevicePath \"\""
Oct 02 11:12:35 crc kubenswrapper[4766]: I1002 11:12:35.898615 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rhczm"
Oct 02 11:12:36 crc kubenswrapper[4766]: I1002 11:12:36.101272 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nnwzp"
Oct 02 11:12:36 crc kubenswrapper[4766]: I1002 11:12:36.136475 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-nnwzp"]
Oct 02 11:12:36 crc kubenswrapper[4766]: I1002 11:12:36.144434 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-nnwzp"]
Oct 02 11:12:36 crc kubenswrapper[4766]: I1002 11:12:36.541613 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Oct 02 11:12:36 crc kubenswrapper[4766]: I1002 11:12:36.587326 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.107059 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.147226 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.354175 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.368777 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.374255 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.377380 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.377769 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-dpx9d"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.377941 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.385003 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.492842 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.494750 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.494804 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7498a37c-33a3-4a3a-9c72-64a0c533282c-config\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.494933 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d77mk\" (UniqueName: \"kubernetes.io/projected/7498a37c-33a3-4a3a-9c72-64a0c533282c-kube-api-access-d77mk\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.494984 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7498a37c-33a3-4a3a-9c72-64a0c533282c-scripts\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.495003 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.495022 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7498a37c-33a3-4a3a-9c72-64a0c533282c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.596185 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.596231 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7498a37c-33a3-4a3a-9c72-64a0c533282c-config\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.596310 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d77mk\" (UniqueName: \"kubernetes.io/projected/7498a37c-33a3-4a3a-9c72-64a0c533282c-kube-api-access-d77mk\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.596343 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7498a37c-33a3-4a3a-9c72-64a0c533282c-scripts\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.596363 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.596386 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7498a37c-33a3-4a3a-9c72-64a0c533282c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.596406 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.597599 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7498a37c-33a3-4a3a-9c72-64a0c533282c-config\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.597908 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7498a37c-33a3-4a3a-9c72-64a0c533282c-scripts\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.598083 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7498a37c-33a3-4a3a-9c72-64a0c533282c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.602032 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.604124 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.612095 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.614722 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d77mk\" (UniqueName: \"kubernetes.io/projected/7498a37c-33a3-4a3a-9c72-64a0c533282c-kube-api-access-d77mk\") pod \"ovn-northd-0\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.793130 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.800279 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0"
Oct 02 11:12:37 crc kubenswrapper[4766]: E1002 11:12:37.800415 4766 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 02 11:12:37 crc kubenswrapper[4766]: E1002 11:12:37.800428 4766 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 02 11:12:37 crc kubenswrapper[4766]: E1002 11:12:37.800469 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift podName:1ba556fb-6ff5-4418-a2b9-f26a51003d79 nodeName:}" failed. No retries permitted until 2025-10-02 11:12:41.800456587 +0000 UTC m=+1276.743327531 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift") pod "swift-storage-0" (UID: "1ba556fb-6ff5-4418-a2b9-f26a51003d79") : configmap "swift-ring-files" not found
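Every retry dies in projected.go because the ConfigMap backing the projected etc-swift volume does not exist yet; judging from this log, the swift-ring-rebalance job starting alongside is presumably what publishes swift-ring-files. A hedged client-go sketch of the same existence check the volume plugin keeps making; the calls are standard k8s.io/client-go, but running in-cluster with read access to the openstack namespace is our assumption.

package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	// Assumes in-cluster execution with RBAC to read ConfigMaps in "openstack".
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// The same lookup projected.go performs when building the etc-swift volume.
	_, err = client.CoreV1().ConfigMaps("openstack").Get(context.TODO(), "swift-ring-files", metav1.GetOptions{})
	switch {
	case apierrors.IsNotFound(err):
		fmt.Println(`configmap "swift-ring-files" not found; swift-storage-0 will keep failing to mount etc-swift`)
	case err != nil:
		panic(err)
	default:
		fmt.Println("swift-ring-files exists; the next mount retry should succeed")
	}
}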
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.857144 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-rhczm"]
Oct 02 11:12:37 crc kubenswrapper[4766]: W1002 11:12:37.861821 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b6e3b29_de5d_4b1e_a14f_b942f0653bbb.slice/crio-90bf4dad058e7ff2511b9b0167a8148d01be00cc38c8e90673670a357a5784c7 WatchSource:0}: Error finding container 90bf4dad058e7ff2511b9b0167a8148d01be00cc38c8e90673670a357a5784c7: Status 404 returned error can't find the container with id 90bf4dad058e7ff2511b9b0167a8148d01be00cc38c8e90673670a357a5784c7
Oct 02 11:12:37 crc kubenswrapper[4766]: I1002 11:12:37.891419 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcaadfd4-16f1-469d-a21d-4fd33c91db36" path="/var/lib/kubelet/pods/fcaadfd4-16f1-469d-a21d-4fd33c91db36/volumes"
Oct 02 11:12:38 crc kubenswrapper[4766]: I1002 11:12:38.114364 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-t7x8d" event={"ID":"f2569c54-b0a8-456b-b311-264d6605d4ed","Type":"ContainerStarted","Data":"ebe4e7a9575c91241d0e641d7a4188a1625c7bac14d55f996a3e6627b50b28b9"}
Oct 02 11:12:38 crc kubenswrapper[4766]: I1002 11:12:38.115869 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-tmhfd" event={"ID":"95842554-1651-4c34-b934-d4eb21c6c52d","Type":"ContainerStarted","Data":"85d92d1d1bf439fc0a8bfe76963371dba146b935670f36dd525197b9b9036410"}
Oct 02 11:12:38 crc kubenswrapper[4766]: I1002 11:12:38.116996 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rhczm" event={"ID":"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb","Type":"ContainerStarted","Data":"90bf4dad058e7ff2511b9b0167a8148d01be00cc38c8e90673670a357a5784c7"}
Oct 02 11:12:38 crc kubenswrapper[4766]: I1002 11:12:38.136989 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-tmhfd" podStartSLOduration=1.9599891029999998 podStartE2EDuration="20.136965847s" podCreationTimestamp="2025-10-02 11:12:18 +0000 UTC" firstStartedPulling="2025-10-02 11:12:19.231073153 +0000 UTC m=+1254.173944097" lastFinishedPulling="2025-10-02 11:12:37.408049897 +0000 UTC m=+1272.350920841" observedRunningTime="2025-10-02 11:12:38.129685355 +0000 UTC m=+1273.072556309" watchObservedRunningTime="2025-10-02 11:12:38.136965847 +0000 UTC m=+1273.079836791"
Oct 02 11:12:38 crc kubenswrapper[4766]: I1002 11:12:38.220086 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Oct 02 11:12:38 crc kubenswrapper[4766]: W1002 11:12:38.222560 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7498a37c_33a3_4a3a_9c72_64a0c533282c.slice/crio-a67be1c3772782750932201c89ba7971944be78138b8c2ae7a803c36b9559b86 WatchSource:0}: Error finding container a67be1c3772782750932201c89ba7971944be78138b8c2ae7a803c36b9559b86: Status 404 returned error can't find the container with id a67be1c3772782750932201c89ba7971944be78138b8c2ae7a803c36b9559b86
Oct 02 11:12:39 crc kubenswrapper[4766]: I1002 11:12:39.128495 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7498a37c-33a3-4a3a-9c72-64a0c533282c","Type":"ContainerStarted","Data":"a67be1c3772782750932201c89ba7971944be78138b8c2ae7a803c36b9559b86"}
Oct 02 11:12:39 crc kubenswrapper[4766]: I1002 11:12:39.130244 4766 generic.go:334] "Generic (PLEG): container finished" podID="f2569c54-b0a8-456b-b311-264d6605d4ed" containerID="ebe4e7a9575c91241d0e641d7a4188a1625c7bac14d55f996a3e6627b50b28b9" exitCode=0
Oct 02 11:12:39 crc kubenswrapper[4766]: I1002 11:12:39.130316 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-t7x8d" event={"ID":"f2569c54-b0a8-456b-b311-264d6605d4ed","Type":"ContainerDied","Data":"ebe4e7a9575c91241d0e641d7a4188a1625c7bac14d55f996a3e6627b50b28b9"}
Oct 02 11:12:41 crc kubenswrapper[4766]: I1002 11:12:41.868289 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0"
Oct 02 11:12:41 crc kubenswrapper[4766]: E1002 11:12:41.868658 4766 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 02 11:12:41 crc kubenswrapper[4766]: E1002 11:12:41.868764 4766 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 02 11:12:41 crc kubenswrapper[4766]: E1002 11:12:41.868823 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift podName:1ba556fb-6ff5-4418-a2b9-f26a51003d79 nodeName:}" failed. No retries permitted until 2025-10-02 11:12:49.868808821 +0000 UTC m=+1284.811679765 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift") pod "swift-storage-0" (UID: "1ba556fb-6ff5-4418-a2b9-f26a51003d79") : configmap "swift-ring-files" not found
Oct 02 11:12:45 crc kubenswrapper[4766]: I1002 11:12:45.181253 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-t7x8d" event={"ID":"f2569c54-b0a8-456b-b311-264d6605d4ed","Type":"ContainerStarted","Data":"7d8c49753a9da23134fe46b471ef13f2409e29f8b10dba2b308f93ed7b837080"}
Oct 02 11:12:47 crc kubenswrapper[4766]: I1002 11:12:47.072876 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Oct 02 11:12:47 crc kubenswrapper[4766]: I1002 11:12:47.134512 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="e63ea453-c8bd-4128-a47e-7b0d740a6066" containerName="galera" probeResult="failure" output=<
Oct 02 11:12:47 crc kubenswrapper[4766]: wsrep_local_state_comment (Joined) differs from Synced
Oct 02 11:12:47 crc kubenswrapper[4766]: >
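The failed readiness probe above is the Galera sync gate: the node reports wsrep_local_state_comment of Joined, meaning it is still catching up with the cluster, and two seconds of log time later the same pod goes ready. A sketch of that check against MariaDB via database/sql; the DSN, credentials, and driver choice are assumptions, not what the probe script actually does.

package main

import (
	"database/sql"
	"fmt"

	_ "github.com/go-sql-driver/mysql" // MySQL/MariaDB driver, registered as "mysql"
)

func main() {
	// Hypothetical DSN; the real probe runs inside the galera container.
	db, err := sql.Open("mysql", "probe:secret@tcp(127.0.0.1:3306)/")
	if err != nil {
		panic(err)
	}
	defer db.Close()

	// SHOW STATUS LIKE returns (Variable_name, Value) rows.
	var name, value string
	if err := db.QueryRow("SHOW STATUS LIKE 'wsrep_local_state_comment'").Scan(&name, &value); err != nil {
		panic(err)
	}
	if value != "Synced" {
		// Mirrors the probe output captured in the log above.
		fmt.Printf("wsrep_local_state_comment (%s) differs from Synced\n", value)
		return
	}
	fmt.Println("galera node is Synced; readiness probe passes")
}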
Oct 02 11:12:47 crc kubenswrapper[4766]: I1002 11:12:47.197389 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-t7x8d"
Oct 02 11:12:47 crc kubenswrapper[4766]: I1002 11:12:47.225879 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-t7x8d" podStartSLOduration=15.22585257 podStartE2EDuration="15.22585257s" podCreationTimestamp="2025-10-02 11:12:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:12:47.213443604 +0000 UTC m=+1282.156314548" watchObservedRunningTime="2025-10-02 11:12:47.22585257 +0000 UTC m=+1282.168723514"
Oct 02 11:12:47 crc kubenswrapper[4766]: I1002 11:12:47.533627 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Oct 02 11:12:47 crc kubenswrapper[4766]: I1002 11:12:47.583729 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Oct 02 11:12:49 crc kubenswrapper[4766]: I1002 11:12:49.273301 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Oct 02 11:12:49 crc kubenswrapper[4766]: I1002 11:12:49.905858 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0"
Oct 02 11:12:49 crc kubenswrapper[4766]: E1002 11:12:49.906027 4766 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Oct 02 11:12:49 crc kubenswrapper[4766]: E1002 11:12:49.906535 4766 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Oct 02 11:12:49 crc kubenswrapper[4766]: E1002 11:12:49.906590 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift podName:1ba556fb-6ff5-4418-a2b9-f26a51003d79 nodeName:}" failed. No retries permitted until 2025-10-02 11:13:05.906573597 +0000 UTC m=+1300.849444541 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift") pod "swift-storage-0" (UID: "1ba556fb-6ff5-4418-a2b9-f26a51003d79") : configmap "swift-ring-files" not found
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.220644 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rhczm" event={"ID":"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb","Type":"ContainerStarted","Data":"29e42fd248d0b6b516b05f79a0ab4bd6931fba6de08f16588f76950913e24142"}
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.224623 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7498a37c-33a3-4a3a-9c72-64a0c533282c","Type":"ContainerStarted","Data":"73791752d4b908acb0c2c84f95d0aaba2c3fbd09a79daaa50e9a88de0dd2d032"}
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.224664 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7498a37c-33a3-4a3a-9c72-64a0c533282c","Type":"ContainerStarted","Data":"84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca"}
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.225217 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.278079 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-rhczm" podStartSLOduration=4.710133366 podStartE2EDuration="16.278060382s" podCreationTimestamp="2025-10-02 11:12:34 +0000 UTC" firstStartedPulling="2025-10-02 11:12:37.864030272 +0000 UTC m=+1272.806901216" lastFinishedPulling="2025-10-02 11:12:49.431957288 +0000 UTC m=+1284.374828232" observedRunningTime="2025-10-02 11:12:50.254876883 +0000 UTC m=+1285.197747837" watchObservedRunningTime="2025-10-02 11:12:50.278060382 +0000 UTC m=+1285.220931336"
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.281014 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.137096537 podStartE2EDuration="13.280661065s" podCreationTimestamp="2025-10-02 11:12:37 +0000 UTC" firstStartedPulling="2025-10-02 11:12:38.224765734 +0000 UTC m=+1273.167636678" lastFinishedPulling="2025-10-02 11:12:49.368330262 +0000 UTC m=+1284.311201206" observedRunningTime="2025-10-02 11:12:50.269887202 +0000 UTC m=+1285.212758156" watchObservedRunningTime="2025-10-02 11:12:50.280661065 +0000 UTC m=+1285.223532019"
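The two startup-latency entries above are internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling) from it. For ovn-northd-0 that is 13.280661065s minus 11.143564528s of pulling, giving the logged 2.137096537s. A small Go check of that arithmetic, with the timestamps copied from the entry (the layout string is ours):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching the timestamps printed by the tracker entries above.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	// Values copied from the ovn-northd-0 entry above.
	created := parse("2025-10-02 11:12:37 +0000 UTC")
	observed := parse("2025-10-02 11:12:50.280661065 +0000 UTC")
	pullStart := parse("2025-10-02 11:12:38.224765734 +0000 UTC")
	pullEnd := parse("2025-10-02 11:12:49.368330262 +0000 UTC")

	e2e := observed.Sub(created)        // podStartE2EDuration
	slo := e2e - pullEnd.Sub(pullStart) // SLO duration excludes the image-pull window
	fmt.Println(e2e, slo)               // prints: 13.280661065s 2.137096537s
}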
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.365781 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-6hj8b"]
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.366982 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6hj8b"
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.372072 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-6hj8b"]
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.516471 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqn5s\" (UniqueName: \"kubernetes.io/projected/3a65d329-5efc-4762-aafa-e2a1a3e7b378-kube-api-access-jqn5s\") pod \"keystone-db-create-6hj8b\" (UID: \"3a65d329-5efc-4762-aafa-e2a1a3e7b378\") " pod="openstack/keystone-db-create-6hj8b"
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.542989 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-4lfj7"]
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.544406 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4lfj7"
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.553864 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4lfj7"]
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.619462 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqn5s\" (UniqueName: \"kubernetes.io/projected/3a65d329-5efc-4762-aafa-e2a1a3e7b378-kube-api-access-jqn5s\") pod \"keystone-db-create-6hj8b\" (UID: \"3a65d329-5efc-4762-aafa-e2a1a3e7b378\") " pod="openstack/keystone-db-create-6hj8b"
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.644825 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqn5s\" (UniqueName: \"kubernetes.io/projected/3a65d329-5efc-4762-aafa-e2a1a3e7b378-kube-api-access-jqn5s\") pod \"keystone-db-create-6hj8b\" (UID: \"3a65d329-5efc-4762-aafa-e2a1a3e7b378\") " pod="openstack/keystone-db-create-6hj8b"
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.669186 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-dp6x5" podUID="9860354f-7494-4b02-bca3-adc731683f7f" containerName="ovn-controller" probeResult="failure" output=<
Oct 02 11:12:50 crc kubenswrapper[4766]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Oct 02 11:12:50 crc kubenswrapper[4766]: >
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.720964 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2z7t\" (UniqueName: \"kubernetes.io/projected/7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe-kube-api-access-d2z7t\") pod \"placement-db-create-4lfj7\" (UID: \"7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe\") " pod="openstack/placement-db-create-4lfj7"
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.726347 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6hj8b"
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.822644 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2z7t\" (UniqueName: \"kubernetes.io/projected/7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe-kube-api-access-d2z7t\") pod \"placement-db-create-4lfj7\" (UID: \"7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe\") " pod="openstack/placement-db-create-4lfj7"
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.848435 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2z7t\" (UniqueName: \"kubernetes.io/projected/7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe-kube-api-access-d2z7t\") pod \"placement-db-create-4lfj7\" (UID: \"7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe\") " pod="openstack/placement-db-create-4lfj7"
Oct 02 11:12:50 crc kubenswrapper[4766]: I1002 11:12:50.866089 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4lfj7"
Oct 02 11:12:51 crc kubenswrapper[4766]: I1002 11:12:51.176278 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-6hj8b"]
Oct 02 11:12:51 crc kubenswrapper[4766]: W1002 11:12:51.181225 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a65d329_5efc_4762_aafa_e2a1a3e7b378.slice/crio-8a0af37e92186e1e76d7002de26ff0368f50dc5d5c932133322d1e9db5bcea11 WatchSource:0}: Error finding container 8a0af37e92186e1e76d7002de26ff0368f50dc5d5c932133322d1e9db5bcea11: Status 404 returned error can't find the container with id 8a0af37e92186e1e76d7002de26ff0368f50dc5d5c932133322d1e9db5bcea11
Oct 02 11:12:51 crc kubenswrapper[4766]: I1002 11:12:51.233890 4766 generic.go:334] "Generic (PLEG): container finished" podID="874d062e-d2f8-462c-95b3-8f630b7120af" containerID="8c379e630eaafe0740e76b9174157026bf0829370b8090bf639367d0c55aed1a" exitCode=0
Oct 02 11:12:51 crc kubenswrapper[4766]: I1002 11:12:51.233948 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"874d062e-d2f8-462c-95b3-8f630b7120af","Type":"ContainerDied","Data":"8c379e630eaafe0740e76b9174157026bf0829370b8090bf639367d0c55aed1a"}
Oct 02 11:12:51 crc kubenswrapper[4766]: I1002 11:12:51.235438 4766 generic.go:334] "Generic (PLEG): container finished" podID="1282b506-728d-4c6f-aa9c-3d3c1f826b71" containerID="1f522cd7f555c48aa4a94cc691909b1a9f65c1ca666bbe2a28d742da033b7470" exitCode=0
Oct 02 11:12:51 crc kubenswrapper[4766]: I1002 11:12:51.235569 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1282b506-728d-4c6f-aa9c-3d3c1f826b71","Type":"ContainerDied","Data":"1f522cd7f555c48aa4a94cc691909b1a9f65c1ca666bbe2a28d742da033b7470"}
Oct 02 11:12:51 crc kubenswrapper[4766]: I1002 11:12:51.237950 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6hj8b" event={"ID":"3a65d329-5efc-4762-aafa-e2a1a3e7b378","Type":"ContainerStarted","Data":"8a0af37e92186e1e76d7002de26ff0368f50dc5d5c932133322d1e9db5bcea11"}
Oct 02 11:12:51 crc kubenswrapper[4766]: I1002 11:12:51.318620 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4lfj7"]
Oct 02 11:12:52 crc kubenswrapper[4766]: I1002 11:12:52.249234 4766 generic.go:334] "Generic (PLEG): container finished" podID="3a65d329-5efc-4762-aafa-e2a1a3e7b378" containerID="dc2b681fecd8d81f1e30f9755b958fbc42b452ed488323cea6e899db4471b0f6" exitCode=0
Oct 02 11:12:52 crc kubenswrapper[4766]: I1002 11:12:52.249296 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6hj8b" event={"ID":"3a65d329-5efc-4762-aafa-e2a1a3e7b378","Type":"ContainerDied","Data":"dc2b681fecd8d81f1e30f9755b958fbc42b452ed488323cea6e899db4471b0f6"}
Oct 02 11:12:52 crc kubenswrapper[4766]: I1002 11:12:52.252084 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"874d062e-d2f8-462c-95b3-8f630b7120af","Type":"ContainerStarted","Data":"197d65982becb7d1b3560136e8ef3d13532ea4c97a2f09eb585ab09256067519"}
Oct 02 11:12:52 crc kubenswrapper[4766]: I1002 11:12:52.252432 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Oct 02 11:12:52 crc kubenswrapper[4766]: I1002 11:12:52.253686 4766 generic.go:334] "Generic (PLEG): container finished" podID="7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe" containerID="f303d98e6e10d3d9cba51492add063a9d0321b2f0bf299e9009af06e8aa409eb" exitCode=0
Oct 02 11:12:52 crc kubenswrapper[4766]: I1002 11:12:52.253736 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4lfj7" event={"ID":"7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe","Type":"ContainerDied","Data":"f303d98e6e10d3d9cba51492add063a9d0321b2f0bf299e9009af06e8aa409eb"}
Oct 02 11:12:52 crc kubenswrapper[4766]: I1002 11:12:52.253755 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4lfj7" event={"ID":"7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe","Type":"ContainerStarted","Data":"f34ba9475fac0da74073807f4fab379002e8c11161d4b33e80b65d76da4799ba"}
Oct 02 11:12:52 crc kubenswrapper[4766]: I1002 11:12:52.256821 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1282b506-728d-4c6f-aa9c-3d3c1f826b71","Type":"ContainerStarted","Data":"366bc0e5d2fdf94dff0819d1232ca88a875b2fd3ae879cab99a8d75f0ceae62c"}
Oct 02 11:12:52 crc kubenswrapper[4766]: I1002 11:12:52.257097 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Oct 02 11:12:52 crc kubenswrapper[4766]: I1002 11:12:52.323458 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=49.747783146 podStartE2EDuration="57.32344287s" podCreationTimestamp="2025-10-02 11:11:55 +0000 UTC" firstStartedPulling="2025-10-02 11:12:09.301486451 +0000 UTC m=+1244.244357395" lastFinishedPulling="2025-10-02 11:12:16.877146175 +0000 UTC m=+1251.820017119" observedRunningTime="2025-10-02 11:12:52.320897219 +0000 UTC m=+1287.263768193" watchObservedRunningTime="2025-10-02 11:12:52.32344287 +0000 UTC m=+1287.266313814"
Oct 02 11:12:52 crc kubenswrapper[4766]: I1002 11:12:52.352617 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=48.52439807 podStartE2EDuration="56.35260261s" podCreationTimestamp="2025-10-02 11:11:56 +0000 UTC" firstStartedPulling="2025-10-02 11:12:08.722842847 +0000 UTC m=+1243.665713791" lastFinishedPulling="2025-10-02 11:12:16.551047387 +0000 UTC m=+1251.493918331" observedRunningTime="2025-10-02 11:12:52.351110782 +0000 UTC m=+1287.293981736" watchObservedRunningTime="2025-10-02 11:12:52.35260261 +0000 UTC m=+1287.295473554"
Oct 02 11:12:53 crc kubenswrapper[4766]: I1002 11:12:53.211575 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-t7x8d"
Oct 02 11:12:53 crc kubenswrapper[4766]: I1002 11:12:53.261924 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-khww2"]
Oct 02 11:12:53 crc kubenswrapper[4766]: I1002 11:12:53.262200 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-khww2" podUID="8a667f22-88d5-4841-9f0d-1c272291f561" containerName="dnsmasq-dns" containerID="cri-o://c89f5c84d0409c16045c6c291ae1e61451bea1df55e8d8660824f0b613db853e" gracePeriod=10
Oct 02 11:12:53 crc kubenswrapper[4766]: I1002 11:12:53.784287 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4lfj7"
Oct 02 11:12:53 crc kubenswrapper[4766]: I1002 11:12:53.789110 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6hj8b"
Oct 02 11:12:53 crc kubenswrapper[4766]: I1002 11:12:53.878945 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2z7t\" (UniqueName: \"kubernetes.io/projected/7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe-kube-api-access-d2z7t\") pod \"7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe\" (UID: \"7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe\") "
Oct 02 11:12:53 crc kubenswrapper[4766]: I1002 11:12:53.879120 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqn5s\" (UniqueName: \"kubernetes.io/projected/3a65d329-5efc-4762-aafa-e2a1a3e7b378-kube-api-access-jqn5s\") pod \"3a65d329-5efc-4762-aafa-e2a1a3e7b378\" (UID: \"3a65d329-5efc-4762-aafa-e2a1a3e7b378\") "
Oct 02 11:12:53 crc kubenswrapper[4766]: I1002 11:12:53.885561 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe-kube-api-access-d2z7t" (OuterVolumeSpecName: "kube-api-access-d2z7t") pod "7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe" (UID: "7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe"). InnerVolumeSpecName "kube-api-access-d2z7t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:12:53 crc kubenswrapper[4766]: I1002 11:12:53.892957 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a65d329-5efc-4762-aafa-e2a1a3e7b378-kube-api-access-jqn5s" (OuterVolumeSpecName: "kube-api-access-jqn5s") pod "3a65d329-5efc-4762-aafa-e2a1a3e7b378" (UID: "3a65d329-5efc-4762-aafa-e2a1a3e7b378"). InnerVolumeSpecName "kube-api-access-jqn5s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:12:53 crc kubenswrapper[4766]: I1002 11:12:53.982409 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2z7t\" (UniqueName: \"kubernetes.io/projected/7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe-kube-api-access-d2z7t\") on node \"crc\" DevicePath \"\""
Oct 02 11:12:53 crc kubenswrapper[4766]: I1002 11:12:53.982463 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqn5s\" (UniqueName: \"kubernetes.io/projected/3a65d329-5efc-4762-aafa-e2a1a3e7b378-kube-api-access-jqn5s\") on node \"crc\" DevicePath \"\""
Oct 02 11:12:54 crc kubenswrapper[4766]: I1002 11:12:54.115346 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-khww2" podUID="8a667f22-88d5-4841-9f0d-1c272291f561" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.110:5353: connect: connection refused"
Oct 02 11:12:54 crc kubenswrapper[4766]: I1002 11:12:54.270838 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-6hj8b" event={"ID":"3a65d329-5efc-4762-aafa-e2a1a3e7b378","Type":"ContainerDied","Data":"8a0af37e92186e1e76d7002de26ff0368f50dc5d5c932133322d1e9db5bcea11"}
Oct 02 11:12:54 crc kubenswrapper[4766]: I1002 11:12:54.270888 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a0af37e92186e1e76d7002de26ff0368f50dc5d5c932133322d1e9db5bcea11"
Oct 02 11:12:54 crc kubenswrapper[4766]: I1002 11:12:54.270854 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-6hj8b"
Oct 02 11:12:54 crc kubenswrapper[4766]: I1002 11:12:54.272889 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4lfj7" event={"ID":"7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe","Type":"ContainerDied","Data":"f34ba9475fac0da74073807f4fab379002e8c11161d4b33e80b65d76da4799ba"}
Oct 02 11:12:54 crc kubenswrapper[4766]: I1002 11:12:54.272929 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f34ba9475fac0da74073807f4fab379002e8c11161d4b33e80b65d76da4799ba"
Oct 02 11:12:54 crc kubenswrapper[4766]: I1002 11:12:54.273024 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4lfj7"
Oct 02 11:12:54 crc kubenswrapper[4766]: I1002 11:12:54.431956 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:12:54 crc kubenswrapper[4766]: I1002 11:12:54.432325 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.671513 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-dp6x5" podUID="9860354f-7494-4b02-bca3-adc731683f7f" containerName="ovn-controller" probeResult="failure" output=<
Oct 02 11:12:55 crc kubenswrapper[4766]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Oct 02 11:12:55 crc kubenswrapper[4766]: >
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.692178 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8wzw9"
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.694592 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8wzw9"
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.788744 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-wsj29"]
Oct 02 11:12:55 crc kubenswrapper[4766]: E1002 11:12:55.789093 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe" containerName="mariadb-database-create"
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.789111 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe" containerName="mariadb-database-create"
Oct 02 11:12:55 crc kubenswrapper[4766]: E1002 11:12:55.789129 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a65d329-5efc-4762-aafa-e2a1a3e7b378" containerName="mariadb-database-create"
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.789136 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a65d329-5efc-4762-aafa-e2a1a3e7b378" containerName="mariadb-database-create"
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.789370 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe" containerName="mariadb-database-create"
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.789397 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a65d329-5efc-4762-aafa-e2a1a3e7b378" containerName="mariadb-database-create"
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.789913 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wsj29"
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.795645 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wsj29"]
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.809871 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lngd4\" (UniqueName: \"kubernetes.io/projected/5d1ba928-5e2f-45dc-a660-f1c9fc375829-kube-api-access-lngd4\") pod \"glance-db-create-wsj29\" (UID: \"5d1ba928-5e2f-45dc-a660-f1c9fc375829\") " pod="openstack/glance-db-create-wsj29"
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.911492 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lngd4\" (UniqueName: \"kubernetes.io/projected/5d1ba928-5e2f-45dc-a660-f1c9fc375829-kube-api-access-lngd4\") pod \"glance-db-create-wsj29\" (UID: \"5d1ba928-5e2f-45dc-a660-f1c9fc375829\") " pod="openstack/glance-db-create-wsj29"
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.919897 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dp6x5-config-h4qs7"]
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.920939 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.924185 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.935737 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lngd4\" (UniqueName: \"kubernetes.io/projected/5d1ba928-5e2f-45dc-a660-f1c9fc375829-kube-api-access-lngd4\") pod \"glance-db-create-wsj29\" (UID: \"5d1ba928-5e2f-45dc-a660-f1c9fc375829\") " pod="openstack/glance-db-create-wsj29"
Oct 02 11:12:55 crc kubenswrapper[4766]: I1002 11:12:55.948733 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dp6x5-config-h4qs7"]
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.012656 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7pwm\" (UniqueName: \"kubernetes.io/projected/acf99248-c836-40e9-afcf-1c67207085c0-kube-api-access-l7pwm\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.012698 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-log-ovn\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.012719 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acf99248-c836-40e9-afcf-1c67207085c0-scripts\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.012837 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-run\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.012857 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-run-ovn\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.012881 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/acf99248-c836-40e9-afcf-1c67207085c0-additional-scripts\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.107596 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wsj29"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.120577 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7pwm\" (UniqueName: \"kubernetes.io/projected/acf99248-c836-40e9-afcf-1c67207085c0-kube-api-access-l7pwm\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.120635 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-log-ovn\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.120663 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acf99248-c836-40e9-afcf-1c67207085c0-scripts\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.120750 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-run\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.120770 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-run-ovn\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.120797 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/acf99248-c836-40e9-afcf-1c67207085c0-additional-scripts\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.121085 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-run\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.121106 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-log-ovn\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.121205 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-run-ovn\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.121552 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/acf99248-c836-40e9-afcf-1c67207085c0-additional-scripts\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.123452 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acf99248-c836-40e9-afcf-1c67207085c0-scripts\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.138012 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7pwm\" (UniqueName: \"kubernetes.io/projected/acf99248-c836-40e9-afcf-1c67207085c0-kube-api-access-l7pwm\") pod \"ovn-controller-dp6x5-config-h4qs7\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.287338 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.293457 4766 generic.go:334] "Generic (PLEG): container finished" podID="8a667f22-88d5-4841-9f0d-1c272291f561" containerID="c89f5c84d0409c16045c6c291ae1e61451bea1df55e8d8660824f0b613db853e" exitCode=0
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.293535 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-khww2" event={"ID":"8a667f22-88d5-4841-9f0d-1c272291f561","Type":"ContainerDied","Data":"c89f5c84d0409c16045c6c291ae1e61451bea1df55e8d8660824f0b613db853e"}
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.547666 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-wsj29"]
Oct 02 11:12:56 crc kubenswrapper[4766]: I1002 11:12:56.717310 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dp6x5-config-h4qs7"]
Oct 02 11:12:56 crc kubenswrapper[4766]: W1002 11:12:56.720051 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacf99248_c836_40e9_afcf_1c67207085c0.slice/crio-52fefa5eff903cc1df4778c004fa41f98ba116cbee58b61191651214dc322960 WatchSource:0}: Error finding container 52fefa5eff903cc1df4778c004fa41f98ba116cbee58b61191651214dc322960: Status 404 returned error can't find the container with id 52fefa5eff903cc1df4778c004fa41f98ba116cbee58b61191651214dc322960
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.312175 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dp6x5-config-h4qs7" event={"ID":"acf99248-c836-40e9-afcf-1c67207085c0","Type":"ContainerStarted","Data":"52fefa5eff903cc1df4778c004fa41f98ba116cbee58b61191651214dc322960"}
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.313540 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wsj29" event={"ID":"5d1ba928-5e2f-45dc-a660-f1c9fc375829","Type":"ContainerStarted","Data":"82a78301519df94a54730f7315b392fb65a0b1d4477a74589b7813ff8e15631e"}
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.313590 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wsj29" event={"ID":"5d1ba928-5e2f-45dc-a660-f1c9fc375829","Type":"ContainerStarted","Data":"59f0decdd7f7af2d81f39a0c002a9a13e55ef619fb228f1f20140108f336fc08"}
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.315886 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-khww2" event={"ID":"8a667f22-88d5-4841-9f0d-1c272291f561","Type":"ContainerDied","Data":"6e23deb0d5f010edb6de5b74b74e2cbaf1e8202ab813ca3897bb43ff02855e9d"}
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.315918 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e23deb0d5f010edb6de5b74b74e2cbaf1e8202ab813ca3897bb43ff02855e9d"
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.371692 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-khww2"
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.442490 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-ovsdbserver-nb\") pod \"8a667f22-88d5-4841-9f0d-1c272291f561\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") "
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.442576 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-config\") pod \"8a667f22-88d5-4841-9f0d-1c272291f561\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") "
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.442728 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9z4d\" (UniqueName: \"kubernetes.io/projected/8a667f22-88d5-4841-9f0d-1c272291f561-kube-api-access-p9z4d\") pod \"8a667f22-88d5-4841-9f0d-1c272291f561\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") "
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.442782 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-ovsdbserver-sb\") pod \"8a667f22-88d5-4841-9f0d-1c272291f561\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") "
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.442826 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-dns-svc\") pod \"8a667f22-88d5-4841-9f0d-1c272291f561\" (UID: \"8a667f22-88d5-4841-9f0d-1c272291f561\") "
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.458715 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a667f22-88d5-4841-9f0d-1c272291f561-kube-api-access-p9z4d" (OuterVolumeSpecName: "kube-api-access-p9z4d") pod "8a667f22-88d5-4841-9f0d-1c272291f561" (UID: "8a667f22-88d5-4841-9f0d-1c272291f561"). InnerVolumeSpecName "kube-api-access-p9z4d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.493349 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8a667f22-88d5-4841-9f0d-1c272291f561" (UID: "8a667f22-88d5-4841-9f0d-1c272291f561"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.499861 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-config" (OuterVolumeSpecName: "config") pod "8a667f22-88d5-4841-9f0d-1c272291f561" (UID: "8a667f22-88d5-4841-9f0d-1c272291f561"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.499905 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8a667f22-88d5-4841-9f0d-1c272291f561" (UID: "8a667f22-88d5-4841-9f0d-1c272291f561"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.502115 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8a667f22-88d5-4841-9f0d-1c272291f561" (UID: "8a667f22-88d5-4841-9f0d-1c272291f561"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.544620 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.544663 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.544678 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.544690 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a667f22-88d5-4841-9f0d-1c272291f561-config\") on node \"crc\" DevicePath \"\""
Oct 02 11:12:57 crc kubenswrapper[4766]: I1002 11:12:57.544704 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9z4d\" (UniqueName: \"kubernetes.io/projected/8a667f22-88d5-4841-9f0d-1c272291f561-kube-api-access-p9z4d\") on node \"crc\" DevicePath \"\""
Oct 02 11:12:58 crc kubenswrapper[4766]: I1002 11:12:58.324295 4766 generic.go:334] "Generic (PLEG): container finished" podID="acf99248-c836-40e9-afcf-1c67207085c0" containerID="8ecd910a6ea237f16877cbe960d6203045eacb4bb226a18f16662ac0b8c5ad78" exitCode=0
Oct 02 11:12:58 crc kubenswrapper[4766]: I1002 11:12:58.324364 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dp6x5-config-h4qs7" event={"ID":"acf99248-c836-40e9-afcf-1c67207085c0","Type":"ContainerDied","Data":"8ecd910a6ea237f16877cbe960d6203045eacb4bb226a18f16662ac0b8c5ad78"}
Oct 02 11:12:58 crc kubenswrapper[4766]: I1002 11:12:58.325884 4766 generic.go:334] "Generic (PLEG): container finished" podID="5d1ba928-5e2f-45dc-a660-f1c9fc375829" containerID="82a78301519df94a54730f7315b392fb65a0b1d4477a74589b7813ff8e15631e" exitCode=0
Oct 02 11:12:58 crc kubenswrapper[4766]: I1002 11:12:58.326026 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-khww2"
Oct 02 11:12:58 crc kubenswrapper[4766]: I1002 11:12:58.326042 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wsj29" event={"ID":"5d1ba928-5e2f-45dc-a660-f1c9fc375829","Type":"ContainerDied","Data":"82a78301519df94a54730f7315b392fb65a0b1d4477a74589b7813ff8e15631e"}
Oct 02 11:12:58 crc kubenswrapper[4766]: I1002 11:12:58.384471 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-khww2"]
Oct 02 11:12:58 crc kubenswrapper[4766]: I1002 11:12:58.394870 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-khww2"]
Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.668393 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wsj29"
Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.738619 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dp6x5-config-h4qs7"
Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.778791 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-run\") pod \"acf99248-c836-40e9-afcf-1c67207085c0\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") "
Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.778846 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-run-ovn\") pod \"acf99248-c836-40e9-afcf-1c67207085c0\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") "
Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.778924 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-log-ovn\") pod \"acf99248-c836-40e9-afcf-1c67207085c0\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") "
Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.778927 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-run" (OuterVolumeSpecName: "var-run") pod "acf99248-c836-40e9-afcf-1c67207085c0" (UID: "acf99248-c836-40e9-afcf-1c67207085c0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.778967 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "acf99248-c836-40e9-afcf-1c67207085c0" (UID: "acf99248-c836-40e9-afcf-1c67207085c0"). InnerVolumeSpecName "var-run-ovn".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.779003 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/acf99248-c836-40e9-afcf-1c67207085c0-additional-scripts\") pod \"acf99248-c836-40e9-afcf-1c67207085c0\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.779035 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lngd4\" (UniqueName: \"kubernetes.io/projected/5d1ba928-5e2f-45dc-a660-f1c9fc375829-kube-api-access-lngd4\") pod \"5d1ba928-5e2f-45dc-a660-f1c9fc375829\" (UID: \"5d1ba928-5e2f-45dc-a660-f1c9fc375829\") " Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.779063 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7pwm\" (UniqueName: \"kubernetes.io/projected/acf99248-c836-40e9-afcf-1c67207085c0-kube-api-access-l7pwm\") pod \"acf99248-c836-40e9-afcf-1c67207085c0\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.779083 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acf99248-c836-40e9-afcf-1c67207085c0-scripts\") pod \"acf99248-c836-40e9-afcf-1c67207085c0\" (UID: \"acf99248-c836-40e9-afcf-1c67207085c0\") " Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.779816 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "acf99248-c836-40e9-afcf-1c67207085c0" (UID: "acf99248-c836-40e9-afcf-1c67207085c0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.779983 4766 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.780000 4766 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.780373 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acf99248-c836-40e9-afcf-1c67207085c0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "acf99248-c836-40e9-afcf-1c67207085c0" (UID: "acf99248-c836-40e9-afcf-1c67207085c0"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.780560 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acf99248-c836-40e9-afcf-1c67207085c0-scripts" (OuterVolumeSpecName: "scripts") pod "acf99248-c836-40e9-afcf-1c67207085c0" (UID: "acf99248-c836-40e9-afcf-1c67207085c0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.795155 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d1ba928-5e2f-45dc-a660-f1c9fc375829-kube-api-access-lngd4" (OuterVolumeSpecName: "kube-api-access-lngd4") pod "5d1ba928-5e2f-45dc-a660-f1c9fc375829" (UID: "5d1ba928-5e2f-45dc-a660-f1c9fc375829"). InnerVolumeSpecName "kube-api-access-lngd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.798031 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acf99248-c836-40e9-afcf-1c67207085c0-kube-api-access-l7pwm" (OuterVolumeSpecName: "kube-api-access-l7pwm") pod "acf99248-c836-40e9-afcf-1c67207085c0" (UID: "acf99248-c836-40e9-afcf-1c67207085c0"). InnerVolumeSpecName "kube-api-access-l7pwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.886831 4766 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/acf99248-c836-40e9-afcf-1c67207085c0-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.887106 4766 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/acf99248-c836-40e9-afcf-1c67207085c0-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.887178 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lngd4\" (UniqueName: \"kubernetes.io/projected/5d1ba928-5e2f-45dc-a660-f1c9fc375829-kube-api-access-lngd4\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.887244 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7pwm\" (UniqueName: \"kubernetes.io/projected/acf99248-c836-40e9-afcf-1c67207085c0-kube-api-access-l7pwm\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.887307 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acf99248-c836-40e9-afcf-1c67207085c0-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:12:59 crc kubenswrapper[4766]: I1002 11:12:59.891711 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a667f22-88d5-4841-9f0d-1c272291f561" path="/var/lib/kubelet/pods/8a667f22-88d5-4841-9f0d-1c272291f561/volumes" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.340642 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dp6x5-config-h4qs7" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.340637 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dp6x5-config-h4qs7" event={"ID":"acf99248-c836-40e9-afcf-1c67207085c0","Type":"ContainerDied","Data":"52fefa5eff903cc1df4778c004fa41f98ba116cbee58b61191651214dc322960"} Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.340769 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52fefa5eff903cc1df4778c004fa41f98ba116cbee58b61191651214dc322960" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.342300 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-wsj29" event={"ID":"5d1ba928-5e2f-45dc-a660-f1c9fc375829","Type":"ContainerDied","Data":"59f0decdd7f7af2d81f39a0c002a9a13e55ef619fb228f1f20140108f336fc08"} Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.342319 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59f0decdd7f7af2d81f39a0c002a9a13e55ef619fb228f1f20140108f336fc08" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.342569 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-wsj29" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.374359 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c257-account-create-k72jx"] Oct 02 11:13:00 crc kubenswrapper[4766]: E1002 11:13:00.374735 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a667f22-88d5-4841-9f0d-1c272291f561" containerName="init" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.374753 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a667f22-88d5-4841-9f0d-1c272291f561" containerName="init" Oct 02 11:13:00 crc kubenswrapper[4766]: E1002 11:13:00.374767 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d1ba928-5e2f-45dc-a660-f1c9fc375829" containerName="mariadb-database-create" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.374773 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d1ba928-5e2f-45dc-a660-f1c9fc375829" containerName="mariadb-database-create" Oct 02 11:13:00 crc kubenswrapper[4766]: E1002 11:13:00.374796 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a667f22-88d5-4841-9f0d-1c272291f561" containerName="dnsmasq-dns" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.374803 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a667f22-88d5-4841-9f0d-1c272291f561" containerName="dnsmasq-dns" Oct 02 11:13:00 crc kubenswrapper[4766]: E1002 11:13:00.374812 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf99248-c836-40e9-afcf-1c67207085c0" containerName="ovn-config" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.374818 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf99248-c836-40e9-afcf-1c67207085c0" containerName="ovn-config" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.374976 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="acf99248-c836-40e9-afcf-1c67207085c0" containerName="ovn-config" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.375004 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d1ba928-5e2f-45dc-a660-f1c9fc375829" containerName="mariadb-database-create" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.375026 4766 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="8a667f22-88d5-4841-9f0d-1c272291f561" containerName="dnsmasq-dns" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.375522 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c257-account-create-k72jx" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.378785 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.392467 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c257-account-create-k72jx"] Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.502171 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42vpw\" (UniqueName: \"kubernetes.io/projected/08bb38b9-010e-4371-970f-bfe8e7310011-kube-api-access-42vpw\") pod \"keystone-c257-account-create-k72jx\" (UID: \"08bb38b9-010e-4371-970f-bfe8e7310011\") " pod="openstack/keystone-c257-account-create-k72jx" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.604522 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42vpw\" (UniqueName: \"kubernetes.io/projected/08bb38b9-010e-4371-970f-bfe8e7310011-kube-api-access-42vpw\") pod \"keystone-c257-account-create-k72jx\" (UID: \"08bb38b9-010e-4371-970f-bfe8e7310011\") " pod="openstack/keystone-c257-account-create-k72jx" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.623363 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42vpw\" (UniqueName: \"kubernetes.io/projected/08bb38b9-010e-4371-970f-bfe8e7310011-kube-api-access-42vpw\") pod \"keystone-c257-account-create-k72jx\" (UID: \"08bb38b9-010e-4371-970f-bfe8e7310011\") " pod="openstack/keystone-c257-account-create-k72jx" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.680565 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-dp6x5" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.684420 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d1c9-account-create-xdktv"] Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.686402 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d1c9-account-create-xdktv" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.689270 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c257-account-create-k72jx" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.694166 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.706748 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d1c9-account-create-xdktv"] Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.809301 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxnsf\" (UniqueName: \"kubernetes.io/projected/ed4fc73c-1d8b-4209-871c-2971f74c963a-kube-api-access-dxnsf\") pod \"placement-d1c9-account-create-xdktv\" (UID: \"ed4fc73c-1d8b-4209-871c-2971f74c963a\") " pod="openstack/placement-d1c9-account-create-xdktv" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.857460 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dp6x5-config-h4qs7"] Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.866044 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dp6x5-config-h4qs7"] Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.910803 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxnsf\" (UniqueName: \"kubernetes.io/projected/ed4fc73c-1d8b-4209-871c-2971f74c963a-kube-api-access-dxnsf\") pod \"placement-d1c9-account-create-xdktv\" (UID: \"ed4fc73c-1d8b-4209-871c-2971f74c963a\") " pod="openstack/placement-d1c9-account-create-xdktv" Oct 02 11:13:00 crc kubenswrapper[4766]: I1002 11:13:00.938180 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxnsf\" (UniqueName: \"kubernetes.io/projected/ed4fc73c-1d8b-4209-871c-2971f74c963a-kube-api-access-dxnsf\") pod \"placement-d1c9-account-create-xdktv\" (UID: \"ed4fc73c-1d8b-4209-871c-2971f74c963a\") " pod="openstack/placement-d1c9-account-create-xdktv" Oct 02 11:13:01 crc kubenswrapper[4766]: I1002 11:13:01.008778 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d1c9-account-create-xdktv" Oct 02 11:13:01 crc kubenswrapper[4766]: I1002 11:13:01.220079 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c257-account-create-k72jx"] Oct 02 11:13:01 crc kubenswrapper[4766]: I1002 11:13:01.255461 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d1c9-account-create-xdktv"] Oct 02 11:13:01 crc kubenswrapper[4766]: W1002 11:13:01.263078 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded4fc73c_1d8b_4209_871c_2971f74c963a.slice/crio-ea7d7d38748d79f5f3ff1ce05cd95b30e0ff82331b18059c1e1bfffa3549de15 WatchSource:0}: Error finding container ea7d7d38748d79f5f3ff1ce05cd95b30e0ff82331b18059c1e1bfffa3549de15: Status 404 returned error can't find the container with id ea7d7d38748d79f5f3ff1ce05cd95b30e0ff82331b18059c1e1bfffa3549de15 Oct 02 11:13:01 crc kubenswrapper[4766]: I1002 11:13:01.355077 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c257-account-create-k72jx" event={"ID":"08bb38b9-010e-4371-970f-bfe8e7310011","Type":"ContainerStarted","Data":"a355b71a6350f77108d2109cd80cae6c071b561bf3387d91d343628b7e6246de"} Oct 02 11:13:01 crc kubenswrapper[4766]: I1002 11:13:01.357035 4766 generic.go:334] "Generic (PLEG): container finished" podID="8b6e3b29-de5d-4b1e-a14f-b942f0653bbb" containerID="29e42fd248d0b6b516b05f79a0ab4bd6931fba6de08f16588f76950913e24142" exitCode=0 Oct 02 11:13:01 crc kubenswrapper[4766]: I1002 11:13:01.357074 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rhczm" event={"ID":"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb","Type":"ContainerDied","Data":"29e42fd248d0b6b516b05f79a0ab4bd6931fba6de08f16588f76950913e24142"} Oct 02 11:13:01 crc kubenswrapper[4766]: I1002 11:13:01.358667 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d1c9-account-create-xdktv" event={"ID":"ed4fc73c-1d8b-4209-871c-2971f74c963a","Type":"ContainerStarted","Data":"ea7d7d38748d79f5f3ff1ce05cd95b30e0ff82331b18059c1e1bfffa3549de15"} Oct 02 11:13:01 crc kubenswrapper[4766]: E1002 11:13:01.573291 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08bb38b9_010e_4371_970f_bfe8e7310011.slice/crio-ebe8d8190ea569d86618041b0db9265a5d28b6160651bd622c8b41a97932fbfe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded4fc73c_1d8b_4209_871c_2971f74c963a.slice/crio-df063c24cf752127f4e79a9b20cff91247c60f33737c20cd1aa1191d6550913d.scope\": RecentStats: unable to find data in memory cache]" Oct 02 11:13:01 crc kubenswrapper[4766]: I1002 11:13:01.898053 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acf99248-c836-40e9-afcf-1c67207085c0" path="/var/lib/kubelet/pods/acf99248-c836-40e9-afcf-1c67207085c0/volumes" Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.371877 4766 generic.go:334] "Generic (PLEG): container finished" podID="ed4fc73c-1d8b-4209-871c-2971f74c963a" containerID="df063c24cf752127f4e79a9b20cff91247c60f33737c20cd1aa1191d6550913d" exitCode=0 Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.371964 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d1c9-account-create-xdktv" 
event={"ID":"ed4fc73c-1d8b-4209-871c-2971f74c963a","Type":"ContainerDied","Data":"df063c24cf752127f4e79a9b20cff91247c60f33737c20cd1aa1191d6550913d"} Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.374080 4766 generic.go:334] "Generic (PLEG): container finished" podID="08bb38b9-010e-4371-970f-bfe8e7310011" containerID="ebe8d8190ea569d86618041b0db9265a5d28b6160651bd622c8b41a97932fbfe" exitCode=0 Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.374176 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c257-account-create-k72jx" event={"ID":"08bb38b9-010e-4371-970f-bfe8e7310011","Type":"ContainerDied","Data":"ebe8d8190ea569d86618041b0db9265a5d28b6160651bd622c8b41a97932fbfe"} Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.712641 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.843810 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-etc-swift\") pod \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.843880 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-scripts\") pod \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.843996 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-swiftconf\") pod \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.844038 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nltk\" (UniqueName: \"kubernetes.io/projected/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-kube-api-access-6nltk\") pod \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.844101 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-ring-data-devices\") pod \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.844148 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-combined-ca-bundle\") pod \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.844216 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-dispersionconf\") pod \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\" (UID: \"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb\") " Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.844720 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8b6e3b29-de5d-4b1e-a14f-b942f0653bbb" (UID: "8b6e3b29-de5d-4b1e-a14f-b942f0653bbb"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.845741 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8b6e3b29-de5d-4b1e-a14f-b942f0653bbb" (UID: "8b6e3b29-de5d-4b1e-a14f-b942f0653bbb"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.849753 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-kube-api-access-6nltk" (OuterVolumeSpecName: "kube-api-access-6nltk") pod "8b6e3b29-de5d-4b1e-a14f-b942f0653bbb" (UID: "8b6e3b29-de5d-4b1e-a14f-b942f0653bbb"). InnerVolumeSpecName "kube-api-access-6nltk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.853645 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8b6e3b29-de5d-4b1e-a14f-b942f0653bbb" (UID: "8b6e3b29-de5d-4b1e-a14f-b942f0653bbb"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.861584 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.870298 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8b6e3b29-de5d-4b1e-a14f-b942f0653bbb" (UID: "8b6e3b29-de5d-4b1e-a14f-b942f0653bbb"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.874738 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-scripts" (OuterVolumeSpecName: "scripts") pod "8b6e3b29-de5d-4b1e-a14f-b942f0653bbb" (UID: "8b6e3b29-de5d-4b1e-a14f-b942f0653bbb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.875749 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b6e3b29-de5d-4b1e-a14f-b942f0653bbb" (UID: "8b6e3b29-de5d-4b1e-a14f-b942f0653bbb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.948369 4766 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.948699 4766 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.948712 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.948725 4766 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.948738 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nltk\" (UniqueName: \"kubernetes.io/projected/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-kube-api-access-6nltk\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.948750 4766 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:02 crc kubenswrapper[4766]: I1002 11:13:02.948761 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:03 crc kubenswrapper[4766]: I1002 11:13:03.384373 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rhczm" Oct 02 11:13:03 crc kubenswrapper[4766]: I1002 11:13:03.384468 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-rhczm" event={"ID":"8b6e3b29-de5d-4b1e-a14f-b942f0653bbb","Type":"ContainerDied","Data":"90bf4dad058e7ff2511b9b0167a8148d01be00cc38c8e90673670a357a5784c7"} Oct 02 11:13:03 crc kubenswrapper[4766]: I1002 11:13:03.384546 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90bf4dad058e7ff2511b9b0167a8148d01be00cc38c8e90673670a357a5784c7" Oct 02 11:13:03 crc kubenswrapper[4766]: I1002 11:13:03.761697 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c257-account-create-k72jx" Oct 02 11:13:03 crc kubenswrapper[4766]: I1002 11:13:03.768272 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d1c9-account-create-xdktv" Oct 02 11:13:03 crc kubenswrapper[4766]: I1002 11:13:03.862785 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42vpw\" (UniqueName: \"kubernetes.io/projected/08bb38b9-010e-4371-970f-bfe8e7310011-kube-api-access-42vpw\") pod \"08bb38b9-010e-4371-970f-bfe8e7310011\" (UID: \"08bb38b9-010e-4371-970f-bfe8e7310011\") " Oct 02 11:13:03 crc kubenswrapper[4766]: I1002 11:13:03.862948 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxnsf\" (UniqueName: \"kubernetes.io/projected/ed4fc73c-1d8b-4209-871c-2971f74c963a-kube-api-access-dxnsf\") pod \"ed4fc73c-1d8b-4209-871c-2971f74c963a\" (UID: \"ed4fc73c-1d8b-4209-871c-2971f74c963a\") " Oct 02 11:13:03 crc kubenswrapper[4766]: I1002 11:13:03.867548 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08bb38b9-010e-4371-970f-bfe8e7310011-kube-api-access-42vpw" (OuterVolumeSpecName: "kube-api-access-42vpw") pod "08bb38b9-010e-4371-970f-bfe8e7310011" (UID: "08bb38b9-010e-4371-970f-bfe8e7310011"). InnerVolumeSpecName "kube-api-access-42vpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:03 crc kubenswrapper[4766]: I1002 11:13:03.868636 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed4fc73c-1d8b-4209-871c-2971f74c963a-kube-api-access-dxnsf" (OuterVolumeSpecName: "kube-api-access-dxnsf") pod "ed4fc73c-1d8b-4209-871c-2971f74c963a" (UID: "ed4fc73c-1d8b-4209-871c-2971f74c963a"). InnerVolumeSpecName "kube-api-access-dxnsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:03 crc kubenswrapper[4766]: I1002 11:13:03.964960 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42vpw\" (UniqueName: \"kubernetes.io/projected/08bb38b9-010e-4371-970f-bfe8e7310011-kube-api-access-42vpw\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:03 crc kubenswrapper[4766]: I1002 11:13:03.964988 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxnsf\" (UniqueName: \"kubernetes.io/projected/ed4fc73c-1d8b-4209-871c-2971f74c963a-kube-api-access-dxnsf\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:04 crc kubenswrapper[4766]: I1002 11:13:04.394662 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c257-account-create-k72jx" event={"ID":"08bb38b9-010e-4371-970f-bfe8e7310011","Type":"ContainerDied","Data":"a355b71a6350f77108d2109cd80cae6c071b561bf3387d91d343628b7e6246de"} Oct 02 11:13:04 crc kubenswrapper[4766]: I1002 11:13:04.394706 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a355b71a6350f77108d2109cd80cae6c071b561bf3387d91d343628b7e6246de" Oct 02 11:13:04 crc kubenswrapper[4766]: I1002 11:13:04.394917 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c257-account-create-k72jx" Oct 02 11:13:04 crc kubenswrapper[4766]: I1002 11:13:04.399048 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d1c9-account-create-xdktv" event={"ID":"ed4fc73c-1d8b-4209-871c-2971f74c963a","Type":"ContainerDied","Data":"ea7d7d38748d79f5f3ff1ce05cd95b30e0ff82331b18059c1e1bfffa3549de15"} Oct 02 11:13:04 crc kubenswrapper[4766]: I1002 11:13:04.399309 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea7d7d38748d79f5f3ff1ce05cd95b30e0ff82331b18059c1e1bfffa3549de15" Oct 02 11:13:04 crc kubenswrapper[4766]: I1002 11:13:04.399104 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d1c9-account-create-xdktv" Oct 02 11:13:05 crc kubenswrapper[4766]: I1002 11:13:05.934753 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-0d4d-account-create-thx8j"] Oct 02 11:13:05 crc kubenswrapper[4766]: E1002 11:13:05.935188 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b6e3b29-de5d-4b1e-a14f-b942f0653bbb" containerName="swift-ring-rebalance" Oct 02 11:13:05 crc kubenswrapper[4766]: I1002 11:13:05.935208 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b6e3b29-de5d-4b1e-a14f-b942f0653bbb" containerName="swift-ring-rebalance" Oct 02 11:13:05 crc kubenswrapper[4766]: E1002 11:13:05.935232 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08bb38b9-010e-4371-970f-bfe8e7310011" containerName="mariadb-account-create" Oct 02 11:13:05 crc kubenswrapper[4766]: I1002 11:13:05.935241 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="08bb38b9-010e-4371-970f-bfe8e7310011" containerName="mariadb-account-create" Oct 02 11:13:05 crc kubenswrapper[4766]: E1002 11:13:05.935262 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed4fc73c-1d8b-4209-871c-2971f74c963a" containerName="mariadb-account-create" Oct 02 11:13:05 crc kubenswrapper[4766]: I1002 11:13:05.935270 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed4fc73c-1d8b-4209-871c-2971f74c963a" containerName="mariadb-account-create" Oct 02 11:13:05 crc kubenswrapper[4766]: I1002 11:13:05.935474 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed4fc73c-1d8b-4209-871c-2971f74c963a" containerName="mariadb-account-create" Oct 02 11:13:05 crc kubenswrapper[4766]: I1002 11:13:05.935495 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b6e3b29-de5d-4b1e-a14f-b942f0653bbb" containerName="swift-ring-rebalance" Oct 02 11:13:05 crc kubenswrapper[4766]: I1002 11:13:05.935533 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="08bb38b9-010e-4371-970f-bfe8e7310011" containerName="mariadb-account-create" Oct 02 11:13:05 crc kubenswrapper[4766]: I1002 11:13:05.936246 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0d4d-account-create-thx8j" Oct 02 11:13:05 crc kubenswrapper[4766]: I1002 11:13:05.938900 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 02 11:13:05 crc kubenswrapper[4766]: I1002 11:13:05.945632 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0d4d-account-create-thx8j"] Oct 02 11:13:06 crc kubenswrapper[4766]: I1002 11:13:06.004241 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpt9k\" (UniqueName: \"kubernetes.io/projected/dbf4b594-4776-4a98-acc2-75de39597d5e-kube-api-access-mpt9k\") pod \"glance-0d4d-account-create-thx8j\" (UID: \"dbf4b594-4776-4a98-acc2-75de39597d5e\") " pod="openstack/glance-0d4d-account-create-thx8j" Oct 02 11:13:06 crc kubenswrapper[4766]: I1002 11:13:06.004678 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:13:06 crc kubenswrapper[4766]: I1002 11:13:06.012157 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift\") pod \"swift-storage-0\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " pod="openstack/swift-storage-0" Oct 02 11:13:06 crc kubenswrapper[4766]: I1002 11:13:06.106117 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpt9k\" (UniqueName: \"kubernetes.io/projected/dbf4b594-4776-4a98-acc2-75de39597d5e-kube-api-access-mpt9k\") pod \"glance-0d4d-account-create-thx8j\" (UID: \"dbf4b594-4776-4a98-acc2-75de39597d5e\") " pod="openstack/glance-0d4d-account-create-thx8j" Oct 02 11:13:06 crc kubenswrapper[4766]: I1002 11:13:06.125114 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpt9k\" (UniqueName: \"kubernetes.io/projected/dbf4b594-4776-4a98-acc2-75de39597d5e-kube-api-access-mpt9k\") pod \"glance-0d4d-account-create-thx8j\" (UID: \"dbf4b594-4776-4a98-acc2-75de39597d5e\") " pod="openstack/glance-0d4d-account-create-thx8j" Oct 02 11:13:06 crc kubenswrapper[4766]: I1002 11:13:06.182771 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 02 11:13:06 crc kubenswrapper[4766]: I1002 11:13:06.256902 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-0d4d-account-create-thx8j" Oct 02 11:13:06 crc kubenswrapper[4766]: W1002 11:13:06.815326 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbf4b594_4776_4a98_acc2_75de39597d5e.slice/crio-3ad980ca3c241a28e2f49a0ffba50f2648270b0450f943223d5d3a0640d54466 WatchSource:0}: Error finding container 3ad980ca3c241a28e2f49a0ffba50f2648270b0450f943223d5d3a0640d54466: Status 404 returned error can't find the container with id 3ad980ca3c241a28e2f49a0ffba50f2648270b0450f943223d5d3a0640d54466 Oct 02 11:13:06 crc kubenswrapper[4766]: I1002 11:13:06.819879 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-0d4d-account-create-thx8j"] Oct 02 11:13:06 crc kubenswrapper[4766]: I1002 11:13:06.847256 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 02 11:13:06 crc kubenswrapper[4766]: W1002 11:13:06.859331 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba556fb_6ff5_4418_a2b9_f26a51003d79.slice/crio-021195eb124e8620d60c6158d85f33dcf5ec39aad1705b71cd2a00ec5747d331 WatchSource:0}: Error finding container 021195eb124e8620d60c6158d85f33dcf5ec39aad1705b71cd2a00ec5747d331: Status 404 returned error can't find the container with id 021195eb124e8620d60c6158d85f33dcf5ec39aad1705b71cd2a00ec5747d331 Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.211897 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.439450 4766 generic.go:334] "Generic (PLEG): container finished" podID="dbf4b594-4776-4a98-acc2-75de39597d5e" containerID="512ce4ebb3045b47a1e615569ed542e6ca78e2a66f4974e41be6a1f77ef1f5f0" exitCode=0 Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.439528 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0d4d-account-create-thx8j" event={"ID":"dbf4b594-4776-4a98-acc2-75de39597d5e","Type":"ContainerDied","Data":"512ce4ebb3045b47a1e615569ed542e6ca78e2a66f4974e41be6a1f77ef1f5f0"} Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.439592 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0d4d-account-create-thx8j" event={"ID":"dbf4b594-4776-4a98-acc2-75de39597d5e","Type":"ContainerStarted","Data":"3ad980ca3c241a28e2f49a0ffba50f2648270b0450f943223d5d3a0640d54466"} Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.440851 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerStarted","Data":"021195eb124e8620d60c6158d85f33dcf5ec39aad1705b71cd2a00ec5747d331"} Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.520490 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-nbrf7"] Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.521717 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-nbrf7" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.528039 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nbrf7"] Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.624406 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-r8tj5"] Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.626383 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.626520 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-r8tj5" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.632064 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsqwx\" (UniqueName: \"kubernetes.io/projected/bdeac64a-3b04-4d56-af88-35d4cdddeaac-kube-api-access-jsqwx\") pod \"cinder-db-create-nbrf7\" (UID: \"bdeac64a-3b04-4d56-af88-35d4cdddeaac\") " pod="openstack/cinder-db-create-nbrf7" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.633925 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-r8tj5"] Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.734109 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsqwx\" (UniqueName: \"kubernetes.io/projected/bdeac64a-3b04-4d56-af88-35d4cdddeaac-kube-api-access-jsqwx\") pod \"cinder-db-create-nbrf7\" (UID: \"bdeac64a-3b04-4d56-af88-35d4cdddeaac\") " pod="openstack/cinder-db-create-nbrf7" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.734243 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz7zz\" (UniqueName: \"kubernetes.io/projected/98d83c0b-9e98-49c8-9764-1c4a5792d1b5-kube-api-access-wz7zz\") pod \"barbican-db-create-r8tj5\" (UID: \"98d83c0b-9e98-49c8-9764-1c4a5792d1b5\") " pod="openstack/barbican-db-create-r8tj5" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.770289 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsqwx\" (UniqueName: \"kubernetes.io/projected/bdeac64a-3b04-4d56-af88-35d4cdddeaac-kube-api-access-jsqwx\") pod \"cinder-db-create-nbrf7\" (UID: \"bdeac64a-3b04-4d56-af88-35d4cdddeaac\") " pod="openstack/cinder-db-create-nbrf7" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.827990 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-5rwpr"] Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.831983 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5rwpr" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.835397 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz7zz\" (UniqueName: \"kubernetes.io/projected/98d83c0b-9e98-49c8-9764-1c4a5792d1b5-kube-api-access-wz7zz\") pod \"barbican-db-create-r8tj5\" (UID: \"98d83c0b-9e98-49c8-9764-1c4a5792d1b5\") " pod="openstack/barbican-db-create-r8tj5" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.841700 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5rwpr"] Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.855473 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-nbrf7" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.869690 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz7zz\" (UniqueName: \"kubernetes.io/projected/98d83c0b-9e98-49c8-9764-1c4a5792d1b5-kube-api-access-wz7zz\") pod \"barbican-db-create-r8tj5\" (UID: \"98d83c0b-9e98-49c8-9764-1c4a5792d1b5\") " pod="openstack/barbican-db-create-r8tj5" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.932781 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-6xbv2"] Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.940076 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzpqz\" (UniqueName: \"kubernetes.io/projected/60439b73-f45d-40e6-a492-8135da66a9a4-kube-api-access-kzpqz\") pod \"neutron-db-create-5rwpr\" (UID: \"60439b73-f45d-40e6-a492-8135da66a9a4\") " pod="openstack/neutron-db-create-5rwpr" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.940428 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6xbv2" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.946524 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.946883 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kf9gx" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.947083 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.947280 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.950125 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-6xbv2"] Oct 02 11:13:07 crc kubenswrapper[4766]: I1002 11:13:07.950662 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-r8tj5" Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.049164 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cppl7\" (UniqueName: \"kubernetes.io/projected/bbf52703-5083-4a00-a732-864efe21269f-kube-api-access-cppl7\") pod \"keystone-db-sync-6xbv2\" (UID: \"bbf52703-5083-4a00-a732-864efe21269f\") " pod="openstack/keystone-db-sync-6xbv2" Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.049359 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf52703-5083-4a00-a732-864efe21269f-combined-ca-bundle\") pod \"keystone-db-sync-6xbv2\" (UID: \"bbf52703-5083-4a00-a732-864efe21269f\") " pod="openstack/keystone-db-sync-6xbv2" Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.049404 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzpqz\" (UniqueName: \"kubernetes.io/projected/60439b73-f45d-40e6-a492-8135da66a9a4-kube-api-access-kzpqz\") pod \"neutron-db-create-5rwpr\" (UID: \"60439b73-f45d-40e6-a492-8135da66a9a4\") " pod="openstack/neutron-db-create-5rwpr" Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.049512 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf52703-5083-4a00-a732-864efe21269f-config-data\") pod \"keystone-db-sync-6xbv2\" (UID: \"bbf52703-5083-4a00-a732-864efe21269f\") " pod="openstack/keystone-db-sync-6xbv2" Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.079336 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzpqz\" (UniqueName: \"kubernetes.io/projected/60439b73-f45d-40e6-a492-8135da66a9a4-kube-api-access-kzpqz\") pod \"neutron-db-create-5rwpr\" (UID: \"60439b73-f45d-40e6-a492-8135da66a9a4\") " pod="openstack/neutron-db-create-5rwpr" Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.151658 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf52703-5083-4a00-a732-864efe21269f-combined-ca-bundle\") pod \"keystone-db-sync-6xbv2\" (UID: \"bbf52703-5083-4a00-a732-864efe21269f\") " pod="openstack/keystone-db-sync-6xbv2" Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.152083 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf52703-5083-4a00-a732-864efe21269f-config-data\") pod \"keystone-db-sync-6xbv2\" (UID: \"bbf52703-5083-4a00-a732-864efe21269f\") " pod="openstack/keystone-db-sync-6xbv2" Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.152182 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cppl7\" (UniqueName: \"kubernetes.io/projected/bbf52703-5083-4a00-a732-864efe21269f-kube-api-access-cppl7\") pod \"keystone-db-sync-6xbv2\" (UID: \"bbf52703-5083-4a00-a732-864efe21269f\") " pod="openstack/keystone-db-sync-6xbv2" Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.160440 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf52703-5083-4a00-a732-864efe21269f-combined-ca-bundle\") pod \"keystone-db-sync-6xbv2\" (UID: \"bbf52703-5083-4a00-a732-864efe21269f\") " 
pod="openstack/keystone-db-sync-6xbv2" Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.160987 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf52703-5083-4a00-a732-864efe21269f-config-data\") pod \"keystone-db-sync-6xbv2\" (UID: \"bbf52703-5083-4a00-a732-864efe21269f\") " pod="openstack/keystone-db-sync-6xbv2" Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.173194 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cppl7\" (UniqueName: \"kubernetes.io/projected/bbf52703-5083-4a00-a732-864efe21269f-kube-api-access-cppl7\") pod \"keystone-db-sync-6xbv2\" (UID: \"bbf52703-5083-4a00-a732-864efe21269f\") " pod="openstack/keystone-db-sync-6xbv2" Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.293959 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-5rwpr" Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.306914 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6xbv2" Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.515673 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-r8tj5"] Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.522758 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nbrf7"] Oct 02 11:13:08 crc kubenswrapper[4766]: W1002 11:13:08.546988 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdeac64a_3b04_4d56_af88_35d4cdddeaac.slice/crio-31f078e3307ba3b0b8407c4001c44344115bc940495de32999428ccb6ba4c4d9 WatchSource:0}: Error finding container 31f078e3307ba3b0b8407c4001c44344115bc940495de32999428ccb6ba4c4d9: Status 404 returned error can't find the container with id 31f078e3307ba3b0b8407c4001c44344115bc940495de32999428ccb6ba4c4d9 Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.765304 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0d4d-account-create-thx8j" Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.862029 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpt9k\" (UniqueName: \"kubernetes.io/projected/dbf4b594-4776-4a98-acc2-75de39597d5e-kube-api-access-mpt9k\") pod \"dbf4b594-4776-4a98-acc2-75de39597d5e\" (UID: \"dbf4b594-4776-4a98-acc2-75de39597d5e\") " Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.868693 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf4b594-4776-4a98-acc2-75de39597d5e-kube-api-access-mpt9k" (OuterVolumeSpecName: "kube-api-access-mpt9k") pod "dbf4b594-4776-4a98-acc2-75de39597d5e" (UID: "dbf4b594-4776-4a98-acc2-75de39597d5e"). InnerVolumeSpecName "kube-api-access-mpt9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:08 crc kubenswrapper[4766]: I1002 11:13:08.964585 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpt9k\" (UniqueName: \"kubernetes.io/projected/dbf4b594-4776-4a98-acc2-75de39597d5e-kube-api-access-mpt9k\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:09 crc kubenswrapper[4766]: I1002 11:13:09.076847 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-6xbv2"] Oct 02 11:13:09 crc kubenswrapper[4766]: W1002 11:13:09.079421 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbf52703_5083_4a00_a732_864efe21269f.slice/crio-8074dc812bcebe508647ac172920dc1f9320250710210232210cdcdf226c6a07 WatchSource:0}: Error finding container 8074dc812bcebe508647ac172920dc1f9320250710210232210cdcdf226c6a07: Status 404 returned error can't find the container with id 8074dc812bcebe508647ac172920dc1f9320250710210232210cdcdf226c6a07 Oct 02 11:13:09 crc kubenswrapper[4766]: I1002 11:13:09.179800 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-5rwpr"] Oct 02 11:13:09 crc kubenswrapper[4766]: I1002 11:13:09.473298 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-0d4d-account-create-thx8j" Oct 02 11:13:09 crc kubenswrapper[4766]: I1002 11:13:09.473297 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-0d4d-account-create-thx8j" event={"ID":"dbf4b594-4776-4a98-acc2-75de39597d5e","Type":"ContainerDied","Data":"3ad980ca3c241a28e2f49a0ffba50f2648270b0450f943223d5d3a0640d54466"} Oct 02 11:13:09 crc kubenswrapper[4766]: I1002 11:13:09.473594 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ad980ca3c241a28e2f49a0ffba50f2648270b0450f943223d5d3a0640d54466" Oct 02 11:13:09 crc kubenswrapper[4766]: I1002 11:13:09.475335 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6xbv2" event={"ID":"bbf52703-5083-4a00-a732-864efe21269f","Type":"ContainerStarted","Data":"8074dc812bcebe508647ac172920dc1f9320250710210232210cdcdf226c6a07"} Oct 02 11:13:09 crc kubenswrapper[4766]: I1002 11:13:09.477062 4766 generic.go:334] "Generic (PLEG): container finished" podID="bdeac64a-3b04-4d56-af88-35d4cdddeaac" containerID="f3b321357fa79f48983ef5c023b5c5cbac1d1998bc3b87360e098266b95ca8e7" exitCode=0 Oct 02 11:13:09 crc kubenswrapper[4766]: I1002 11:13:09.477174 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nbrf7" event={"ID":"bdeac64a-3b04-4d56-af88-35d4cdddeaac","Type":"ContainerDied","Data":"f3b321357fa79f48983ef5c023b5c5cbac1d1998bc3b87360e098266b95ca8e7"} Oct 02 11:13:09 crc kubenswrapper[4766]: I1002 11:13:09.477201 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nbrf7" event={"ID":"bdeac64a-3b04-4d56-af88-35d4cdddeaac","Type":"ContainerStarted","Data":"31f078e3307ba3b0b8407c4001c44344115bc940495de32999428ccb6ba4c4d9"} Oct 02 11:13:09 crc kubenswrapper[4766]: I1002 11:13:09.478854 4766 generic.go:334] "Generic (PLEG): container finished" podID="98d83c0b-9e98-49c8-9764-1c4a5792d1b5" containerID="c51327eb234af217d66c3ca5a46e6220ad81f8519d7a4eaf37c6d30894c229f7" exitCode=0 Oct 02 11:13:09 crc kubenswrapper[4766]: I1002 11:13:09.478889 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r8tj5" 
event={"ID":"98d83c0b-9e98-49c8-9764-1c4a5792d1b5","Type":"ContainerDied","Data":"c51327eb234af217d66c3ca5a46e6220ad81f8519d7a4eaf37c6d30894c229f7"} Oct 02 11:13:09 crc kubenswrapper[4766]: I1002 11:13:09.478920 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r8tj5" event={"ID":"98d83c0b-9e98-49c8-9764-1c4a5792d1b5","Type":"ContainerStarted","Data":"84e87fa8ae8bc1090a027e5864076900e7b1dee0cb1bbd7df5293f53a47f8f79"} Oct 02 11:13:09 crc kubenswrapper[4766]: I1002 11:13:09.481474 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerStarted","Data":"3df0bb832b8ac1eab32d7e7a4d74b6f6630e1a9104832fae36b218c6eafdc058"} Oct 02 11:13:09 crc kubenswrapper[4766]: I1002 11:13:09.481524 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerStarted","Data":"2e00b32e05f94d6db2ce326a131639f65e69eaa8e091e4ac19058f4bc69304fa"} Oct 02 11:13:09 crc kubenswrapper[4766]: I1002 11:13:09.482728 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5rwpr" event={"ID":"60439b73-f45d-40e6-a492-8135da66a9a4","Type":"ContainerStarted","Data":"53f6938c56cac4c9b05be4153cab666bf234105167fdd09b7131756520561226"} Oct 02 11:13:10 crc kubenswrapper[4766]: I1002 11:13:10.498739 4766 generic.go:334] "Generic (PLEG): container finished" podID="60439b73-f45d-40e6-a492-8135da66a9a4" containerID="95d4da30e876b118cdc4f269c126b3e96240f4d4fdf17c6c9ab901a8dca25a20" exitCode=0 Oct 02 11:13:10 crc kubenswrapper[4766]: I1002 11:13:10.498793 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5rwpr" event={"ID":"60439b73-f45d-40e6-a492-8135da66a9a4","Type":"ContainerDied","Data":"95d4da30e876b118cdc4f269c126b3e96240f4d4fdf17c6c9ab901a8dca25a20"} Oct 02 11:13:10 crc kubenswrapper[4766]: I1002 11:13:10.502578 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerStarted","Data":"33cb37528efacc79ce75b3a9c57737f3259f3f0c3402a200d781a3fa066c92aa"} Oct 02 11:13:10 crc kubenswrapper[4766]: I1002 11:13:10.502602 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerStarted","Data":"da7cf8c62c28a42c7de4196b70da74f94b7baed423d08b118379806f1e593aa6"} Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.062633 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nbrf7" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.073785 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-r8tj5" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.079770 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-k4x48"] Oct 02 11:13:11 crc kubenswrapper[4766]: E1002 11:13:11.080228 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf4b594-4776-4a98-acc2-75de39597d5e" containerName="mariadb-account-create" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.080246 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf4b594-4776-4a98-acc2-75de39597d5e" containerName="mariadb-account-create" Oct 02 11:13:11 crc kubenswrapper[4766]: E1002 11:13:11.080263 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdeac64a-3b04-4d56-af88-35d4cdddeaac" containerName="mariadb-database-create" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.080269 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdeac64a-3b04-4d56-af88-35d4cdddeaac" containerName="mariadb-database-create" Oct 02 11:13:11 crc kubenswrapper[4766]: E1002 11:13:11.080285 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98d83c0b-9e98-49c8-9764-1c4a5792d1b5" containerName="mariadb-database-create" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.080291 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="98d83c0b-9e98-49c8-9764-1c4a5792d1b5" containerName="mariadb-database-create" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.080443 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdeac64a-3b04-4d56-af88-35d4cdddeaac" containerName="mariadb-database-create" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.080465 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf4b594-4776-4a98-acc2-75de39597d5e" containerName="mariadb-account-create" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.080473 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="98d83c0b-9e98-49c8-9764-1c4a5792d1b5" containerName="mariadb-database-create" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.081025 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-k4x48" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.084862 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-k4x48"] Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.100398 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2ttzh" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.100851 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.107542 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-config-data\") pod \"glance-db-sync-k4x48\" (UID: \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\") " pod="openstack/glance-db-sync-k4x48" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.107689 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-combined-ca-bundle\") pod \"glance-db-sync-k4x48\" (UID: \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\") " pod="openstack/glance-db-sync-k4x48" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.107811 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-db-sync-config-data\") pod \"glance-db-sync-k4x48\" (UID: \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\") " pod="openstack/glance-db-sync-k4x48" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.107852 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nfjl\" (UniqueName: \"kubernetes.io/projected/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-kube-api-access-8nfjl\") pod \"glance-db-sync-k4x48\" (UID: \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\") " pod="openstack/glance-db-sync-k4x48" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.210090 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsqwx\" (UniqueName: \"kubernetes.io/projected/bdeac64a-3b04-4d56-af88-35d4cdddeaac-kube-api-access-jsqwx\") pod \"bdeac64a-3b04-4d56-af88-35d4cdddeaac\" (UID: \"bdeac64a-3b04-4d56-af88-35d4cdddeaac\") " Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.210218 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz7zz\" (UniqueName: \"kubernetes.io/projected/98d83c0b-9e98-49c8-9764-1c4a5792d1b5-kube-api-access-wz7zz\") pod \"98d83c0b-9e98-49c8-9764-1c4a5792d1b5\" (UID: \"98d83c0b-9e98-49c8-9764-1c4a5792d1b5\") " Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.210460 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-combined-ca-bundle\") pod \"glance-db-sync-k4x48\" (UID: \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\") " pod="openstack/glance-db-sync-k4x48" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.210540 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-db-sync-config-data\") pod \"glance-db-sync-k4x48\" (UID: 
\"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\") " pod="openstack/glance-db-sync-k4x48" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.210565 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nfjl\" (UniqueName: \"kubernetes.io/projected/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-kube-api-access-8nfjl\") pod \"glance-db-sync-k4x48\" (UID: \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\") " pod="openstack/glance-db-sync-k4x48" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.210614 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-config-data\") pod \"glance-db-sync-k4x48\" (UID: \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\") " pod="openstack/glance-db-sync-k4x48" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.218157 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-config-data\") pod \"glance-db-sync-k4x48\" (UID: \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\") " pod="openstack/glance-db-sync-k4x48" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.218158 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-db-sync-config-data\") pod \"glance-db-sync-k4x48\" (UID: \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\") " pod="openstack/glance-db-sync-k4x48" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.220314 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdeac64a-3b04-4d56-af88-35d4cdddeaac-kube-api-access-jsqwx" (OuterVolumeSpecName: "kube-api-access-jsqwx") pod "bdeac64a-3b04-4d56-af88-35d4cdddeaac" (UID: "bdeac64a-3b04-4d56-af88-35d4cdddeaac"). InnerVolumeSpecName "kube-api-access-jsqwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.221136 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-combined-ca-bundle\") pod \"glance-db-sync-k4x48\" (UID: \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\") " pod="openstack/glance-db-sync-k4x48" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.221633 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d83c0b-9e98-49c8-9764-1c4a5792d1b5-kube-api-access-wz7zz" (OuterVolumeSpecName: "kube-api-access-wz7zz") pod "98d83c0b-9e98-49c8-9764-1c4a5792d1b5" (UID: "98d83c0b-9e98-49c8-9764-1c4a5792d1b5"). InnerVolumeSpecName "kube-api-access-wz7zz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.242150 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nfjl\" (UniqueName: \"kubernetes.io/projected/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-kube-api-access-8nfjl\") pod \"glance-db-sync-k4x48\" (UID: \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\") " pod="openstack/glance-db-sync-k4x48" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.311594 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz7zz\" (UniqueName: \"kubernetes.io/projected/98d83c0b-9e98-49c8-9764-1c4a5792d1b5-kube-api-access-wz7zz\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.311634 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsqwx\" (UniqueName: \"kubernetes.io/projected/bdeac64a-3b04-4d56-af88-35d4cdddeaac-kube-api-access-jsqwx\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.424591 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-k4x48" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.523756 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerStarted","Data":"9da3dd83501b5003165c1f29947b2365058f97cf1d114c7cef65d3123aa7bf9a"} Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.527651 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nbrf7" event={"ID":"bdeac64a-3b04-4d56-af88-35d4cdddeaac","Type":"ContainerDied","Data":"31f078e3307ba3b0b8407c4001c44344115bc940495de32999428ccb6ba4c4d9"} Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.527692 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31f078e3307ba3b0b8407c4001c44344115bc940495de32999428ccb6ba4c4d9" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.527750 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nbrf7" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.532903 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-r8tj5" Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.532929 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r8tj5" event={"ID":"98d83c0b-9e98-49c8-9764-1c4a5792d1b5","Type":"ContainerDied","Data":"84e87fa8ae8bc1090a027e5864076900e7b1dee0cb1bbd7df5293f53a47f8f79"} Oct 02 11:13:11 crc kubenswrapper[4766]: I1002 11:13:11.533005 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84e87fa8ae8bc1090a027e5864076900e7b1dee0cb1bbd7df5293f53a47f8f79" Oct 02 11:13:14 crc kubenswrapper[4766]: I1002 11:13:14.583654 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-5rwpr" event={"ID":"60439b73-f45d-40e6-a492-8135da66a9a4","Type":"ContainerDied","Data":"53f6938c56cac4c9b05be4153cab666bf234105167fdd09b7131756520561226"} Oct 02 11:13:14 crc kubenswrapper[4766]: I1002 11:13:14.584026 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53f6938c56cac4c9b05be4153cab666bf234105167fdd09b7131756520561226" Oct 02 11:13:14 crc kubenswrapper[4766]: I1002 11:13:14.608372 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-5rwpr" Oct 02 11:13:14 crc kubenswrapper[4766]: I1002 11:13:14.770772 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzpqz\" (UniqueName: \"kubernetes.io/projected/60439b73-f45d-40e6-a492-8135da66a9a4-kube-api-access-kzpqz\") pod \"60439b73-f45d-40e6-a492-8135da66a9a4\" (UID: \"60439b73-f45d-40e6-a492-8135da66a9a4\") " Oct 02 11:13:14 crc kubenswrapper[4766]: I1002 11:13:14.774636 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60439b73-f45d-40e6-a492-8135da66a9a4-kube-api-access-kzpqz" (OuterVolumeSpecName: "kube-api-access-kzpqz") pod "60439b73-f45d-40e6-a492-8135da66a9a4" (UID: "60439b73-f45d-40e6-a492-8135da66a9a4"). InnerVolumeSpecName "kube-api-access-kzpqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:14 crc kubenswrapper[4766]: I1002 11:13:14.893523 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzpqz\" (UniqueName: \"kubernetes.io/projected/60439b73-f45d-40e6-a492-8135da66a9a4-kube-api-access-kzpqz\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:15 crc kubenswrapper[4766]: I1002 11:13:15.011286 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-k4x48"] Oct 02 11:13:15 crc kubenswrapper[4766]: I1002 11:13:15.591759 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6xbv2" event={"ID":"bbf52703-5083-4a00-a732-864efe21269f","Type":"ContainerStarted","Data":"c261e4d0c71a2d44b8fccc74ed25a2644106f2f91983fb1b4e335f1366bcfec6"} Oct 02 11:13:15 crc kubenswrapper[4766]: I1002 11:13:15.596677 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerStarted","Data":"2d24232fdec7040ef6a5964e5689deb80c9416e2b9a5928b6c44a05db6a24a58"} Oct 02 11:13:15 crc kubenswrapper[4766]: I1002 11:13:15.596721 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerStarted","Data":"3647b01248aaadded5263e23d36e7d1f488a1f32c6f5c9c8f0363cd9c896464c"} Oct 02 11:13:15 crc kubenswrapper[4766]: I1002 11:13:15.596732 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerStarted","Data":"a6e14799401d44ada3dfee676cdeb9c70cb0d90eacc2dca91f8d1079c24c183c"} Oct 02 11:13:15 crc kubenswrapper[4766]: I1002 11:13:15.598113 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-5rwpr" Oct 02 11:13:15 crc kubenswrapper[4766]: I1002 11:13:15.604543 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-k4x48" event={"ID":"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae","Type":"ContainerStarted","Data":"0f04f1165e5f07cde4fef940ddaf54ee09af0ec060811699110af58b65ea69c6"} Oct 02 11:13:15 crc kubenswrapper[4766]: I1002 11:13:15.609258 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-6xbv2" podStartSLOduration=2.9901953199999998 podStartE2EDuration="8.609241004s" podCreationTimestamp="2025-10-02 11:13:07 +0000 UTC" firstStartedPulling="2025-10-02 11:13:09.082033969 +0000 UTC m=+1304.024904913" lastFinishedPulling="2025-10-02 11:13:14.701079653 +0000 UTC m=+1309.643950597" observedRunningTime="2025-10-02 11:13:15.605544127 +0000 UTC m=+1310.548415081" watchObservedRunningTime="2025-10-02 11:13:15.609241004 +0000 UTC m=+1310.552111948" Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.620359 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerStarted","Data":"2f0ac5d9172eee436579154fd4936b18259605ce7d7deaad27a10a2c5cdbf7f4"} Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.620704 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerStarted","Data":"37549a28195c14e158d5a907b7e78a1e6d69aa6e6efd13ac3ffc312e52ea12e6"} Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.666265 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-ce01-account-create-wr9nd"] Oct 02 11:13:17 crc kubenswrapper[4766]: E1002 11:13:17.666654 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60439b73-f45d-40e6-a492-8135da66a9a4" containerName="mariadb-database-create" Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.666671 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="60439b73-f45d-40e6-a492-8135da66a9a4" containerName="mariadb-database-create" Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.666834 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="60439b73-f45d-40e6-a492-8135da66a9a4" containerName="mariadb-database-create" Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.667736 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ce01-account-create-wr9nd" Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.676850 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.680619 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ce01-account-create-wr9nd"] Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.758893 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-891f-account-create-5stxg"] Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.761180 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-891f-account-create-5stxg" Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.763999 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.765049 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vv6x\" (UniqueName: \"kubernetes.io/projected/2e6935d6-839b-4f02-86f4-b79ad98cf891-kube-api-access-4vv6x\") pod \"barbican-ce01-account-create-wr9nd\" (UID: \"2e6935d6-839b-4f02-86f4-b79ad98cf891\") " pod="openstack/barbican-ce01-account-create-wr9nd" Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.769712 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-891f-account-create-5stxg"] Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.866334 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vv6x\" (UniqueName: \"kubernetes.io/projected/2e6935d6-839b-4f02-86f4-b79ad98cf891-kube-api-access-4vv6x\") pod \"barbican-ce01-account-create-wr9nd\" (UID: \"2e6935d6-839b-4f02-86f4-b79ad98cf891\") " pod="openstack/barbican-ce01-account-create-wr9nd" Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.866778 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvzw2\" (UniqueName: \"kubernetes.io/projected/858ff098-bb65-418c-832e-9cd9d8cd75d6-kube-api-access-lvzw2\") pod \"cinder-891f-account-create-5stxg\" (UID: \"858ff098-bb65-418c-832e-9cd9d8cd75d6\") " pod="openstack/cinder-891f-account-create-5stxg" Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.885269 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vv6x\" (UniqueName: \"kubernetes.io/projected/2e6935d6-839b-4f02-86f4-b79ad98cf891-kube-api-access-4vv6x\") pod \"barbican-ce01-account-create-wr9nd\" (UID: \"2e6935d6-839b-4f02-86f4-b79ad98cf891\") " pod="openstack/barbican-ce01-account-create-wr9nd" Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.977332 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvzw2\" (UniqueName: \"kubernetes.io/projected/858ff098-bb65-418c-832e-9cd9d8cd75d6-kube-api-access-lvzw2\") pod \"cinder-891f-account-create-5stxg\" (UID: \"858ff098-bb65-418c-832e-9cd9d8cd75d6\") " pod="openstack/cinder-891f-account-create-5stxg" Oct 02 11:13:17 crc kubenswrapper[4766]: I1002 11:13:17.993744 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvzw2\" (UniqueName: \"kubernetes.io/projected/858ff098-bb65-418c-832e-9cd9d8cd75d6-kube-api-access-lvzw2\") pod \"cinder-891f-account-create-5stxg\" (UID: \"858ff098-bb65-418c-832e-9cd9d8cd75d6\") " pod="openstack/cinder-891f-account-create-5stxg" Oct 02 11:13:18 crc kubenswrapper[4766]: I1002 11:13:18.117199 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ce01-account-create-wr9nd" Oct 02 11:13:18 crc kubenswrapper[4766]: I1002 11:13:18.136872 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-891f-account-create-5stxg" Oct 02 11:13:18 crc kubenswrapper[4766]: I1002 11:13:18.646321 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerStarted","Data":"711bf815d45f9226c3bfc08294214785c60a9b9bd2e213b6e5c41884b2a87710"} Oct 02 11:13:18 crc kubenswrapper[4766]: I1002 11:13:18.681737 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ce01-account-create-wr9nd"] Oct 02 11:13:18 crc kubenswrapper[4766]: I1002 11:13:18.735306 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-891f-account-create-5stxg"] Oct 02 11:13:18 crc kubenswrapper[4766]: W1002 11:13:18.740753 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod858ff098_bb65_418c_832e_9cd9d8cd75d6.slice/crio-f92ed44166cb9dba96c20054727a7a4018912073a73d93a2c3d681a8530bf3eb WatchSource:0}: Error finding container f92ed44166cb9dba96c20054727a7a4018912073a73d93a2c3d681a8530bf3eb: Status 404 returned error can't find the container with id f92ed44166cb9dba96c20054727a7a4018912073a73d93a2c3d681a8530bf3eb Oct 02 11:13:19 crc kubenswrapper[4766]: I1002 11:13:19.662905 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerStarted","Data":"018613b4ace3495529050cc51bbcbc25c762db064d56bf3f3b2fa7c0ac1213cf"} Oct 02 11:13:19 crc kubenswrapper[4766]: I1002 11:13:19.663350 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerStarted","Data":"48f8731a29f544c073845eb8fcd06b0efc46da3e9e5d54fb23e339018591d7f7"} Oct 02 11:13:19 crc kubenswrapper[4766]: I1002 11:13:19.663393 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerStarted","Data":"d4d2f3213d0d8347ad4fc0fda1819961e3e48ca84f6a1a08530c3e457a9145ee"} Oct 02 11:13:19 crc kubenswrapper[4766]: I1002 11:13:19.663408 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerStarted","Data":"1b4624fea8b62d613092718192f8a9e9faa6138904a93866528c86735b20b493"} Oct 02 11:13:19 crc kubenswrapper[4766]: I1002 11:13:19.664913 4766 generic.go:334] "Generic (PLEG): container finished" podID="858ff098-bb65-418c-832e-9cd9d8cd75d6" containerID="423fd22f502bcbb76fa7bee55fb87f790a0e24c2ab7604de639bcbd74611dd08" exitCode=0 Oct 02 11:13:19 crc kubenswrapper[4766]: I1002 11:13:19.665016 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-891f-account-create-5stxg" event={"ID":"858ff098-bb65-418c-832e-9cd9d8cd75d6","Type":"ContainerDied","Data":"423fd22f502bcbb76fa7bee55fb87f790a0e24c2ab7604de639bcbd74611dd08"} Oct 02 11:13:19 crc kubenswrapper[4766]: I1002 11:13:19.665034 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-891f-account-create-5stxg" event={"ID":"858ff098-bb65-418c-832e-9cd9d8cd75d6","Type":"ContainerStarted","Data":"f92ed44166cb9dba96c20054727a7a4018912073a73d93a2c3d681a8530bf3eb"} Oct 02 11:13:19 crc kubenswrapper[4766]: I1002 11:13:19.666868 4766 generic.go:334] "Generic (PLEG): container finished" podID="2e6935d6-839b-4f02-86f4-b79ad98cf891" 
containerID="412fce6dd8d64d547cc77de430881d5d15bf12a7aa0bc9988b3411682724ce9e" exitCode=0 Oct 02 11:13:19 crc kubenswrapper[4766]: I1002 11:13:19.666905 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ce01-account-create-wr9nd" event={"ID":"2e6935d6-839b-4f02-86f4-b79ad98cf891","Type":"ContainerDied","Data":"412fce6dd8d64d547cc77de430881d5d15bf12a7aa0bc9988b3411682724ce9e"} Oct 02 11:13:19 crc kubenswrapper[4766]: I1002 11:13:19.666930 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ce01-account-create-wr9nd" event={"ID":"2e6935d6-839b-4f02-86f4-b79ad98cf891","Type":"ContainerStarted","Data":"9ad5d048bf9321316268bfe6c4ffd50b272621f3b14298b004c94aa672de0356"} Oct 02 11:13:19 crc kubenswrapper[4766]: I1002 11:13:19.717419 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.356272685 podStartE2EDuration="46.717394695s" podCreationTimestamp="2025-10-02 11:12:33 +0000 UTC" firstStartedPulling="2025-10-02 11:13:06.860744446 +0000 UTC m=+1301.803615390" lastFinishedPulling="2025-10-02 11:13:17.221866456 +0000 UTC m=+1312.164737400" observedRunningTime="2025-10-02 11:13:19.692343317 +0000 UTC m=+1314.635214271" watchObservedRunningTime="2025-10-02 11:13:19.717394695 +0000 UTC m=+1314.660265629" Oct 02 11:13:19 crc kubenswrapper[4766]: I1002 11:13:19.945195 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vrk47"] Oct 02 11:13:19 crc kubenswrapper[4766]: I1002 11:13:19.946565 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:19 crc kubenswrapper[4766]: I1002 11:13:19.948209 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 02 11:13:19 crc kubenswrapper[4766]: I1002 11:13:19.954094 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vrk47"] Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.018601 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmbz9\" (UniqueName: \"kubernetes.io/projected/cee5abbb-6bd8-4f13-b224-bb3434877e72-kube-api-access-fmbz9\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.018756 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.018797 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.018828 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-dns-swift-storage-0\") pod 
\"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.019165 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-config\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.019234 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.120986 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.121043 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.121063 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.121119 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-config\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.121142 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.121168 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmbz9\" (UniqueName: \"kubernetes.io/projected/cee5abbb-6bd8-4f13-b224-bb3434877e72-kube-api-access-fmbz9\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.122145 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: 
\"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.122212 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.122395 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-config\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.123056 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.123823 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.139560 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmbz9\" (UniqueName: \"kubernetes.io/projected/cee5abbb-6bd8-4f13-b224-bb3434877e72-kube-api-access-fmbz9\") pod \"dnsmasq-dns-77585f5f8c-vrk47\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.264578 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:20 crc kubenswrapper[4766]: I1002 11:13:20.813170 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vrk47"] Oct 02 11:13:20 crc kubenswrapper[4766]: W1002 11:13:20.815193 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcee5abbb_6bd8_4f13_b224_bb3434877e72.slice/crio-64c1c014447d7078306625309a4e262149515dc6f443d0344bdaea19f1c28440 WatchSource:0}: Error finding container 64c1c014447d7078306625309a4e262149515dc6f443d0344bdaea19f1c28440: Status 404 returned error can't find the container with id 64c1c014447d7078306625309a4e262149515dc6f443d0344bdaea19f1c28440 Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.000193 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ce01-account-create-wr9nd" Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.066149 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-891f-account-create-5stxg" Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.146626 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvzw2\" (UniqueName: \"kubernetes.io/projected/858ff098-bb65-418c-832e-9cd9d8cd75d6-kube-api-access-lvzw2\") pod \"858ff098-bb65-418c-832e-9cd9d8cd75d6\" (UID: \"858ff098-bb65-418c-832e-9cd9d8cd75d6\") " Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.146771 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vv6x\" (UniqueName: \"kubernetes.io/projected/2e6935d6-839b-4f02-86f4-b79ad98cf891-kube-api-access-4vv6x\") pod \"2e6935d6-839b-4f02-86f4-b79ad98cf891\" (UID: \"2e6935d6-839b-4f02-86f4-b79ad98cf891\") " Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.152271 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e6935d6-839b-4f02-86f4-b79ad98cf891-kube-api-access-4vv6x" (OuterVolumeSpecName: "kube-api-access-4vv6x") pod "2e6935d6-839b-4f02-86f4-b79ad98cf891" (UID: "2e6935d6-839b-4f02-86f4-b79ad98cf891"). InnerVolumeSpecName "kube-api-access-4vv6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.159371 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/858ff098-bb65-418c-832e-9cd9d8cd75d6-kube-api-access-lvzw2" (OuterVolumeSpecName: "kube-api-access-lvzw2") pod "858ff098-bb65-418c-832e-9cd9d8cd75d6" (UID: "858ff098-bb65-418c-832e-9cd9d8cd75d6"). InnerVolumeSpecName "kube-api-access-lvzw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.252115 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vv6x\" (UniqueName: \"kubernetes.io/projected/2e6935d6-839b-4f02-86f4-b79ad98cf891-kube-api-access-4vv6x\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.252154 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvzw2\" (UniqueName: \"kubernetes.io/projected/858ff098-bb65-418c-832e-9cd9d8cd75d6-kube-api-access-lvzw2\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.708688 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-891f-account-create-5stxg" Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.708712 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-891f-account-create-5stxg" event={"ID":"858ff098-bb65-418c-832e-9cd9d8cd75d6","Type":"ContainerDied","Data":"f92ed44166cb9dba96c20054727a7a4018912073a73d93a2c3d681a8530bf3eb"} Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.708749 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f92ed44166cb9dba96c20054727a7a4018912073a73d93a2c3d681a8530bf3eb" Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.711698 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ce01-account-create-wr9nd" event={"ID":"2e6935d6-839b-4f02-86f4-b79ad98cf891","Type":"ContainerDied","Data":"9ad5d048bf9321316268bfe6c4ffd50b272621f3b14298b004c94aa672de0356"} Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.712084 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ad5d048bf9321316268bfe6c4ffd50b272621f3b14298b004c94aa672de0356" Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.712153 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ce01-account-create-wr9nd" Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.715464 4766 generic.go:334] "Generic (PLEG): container finished" podID="bbf52703-5083-4a00-a732-864efe21269f" containerID="c261e4d0c71a2d44b8fccc74ed25a2644106f2f91983fb1b4e335f1366bcfec6" exitCode=0 Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.715550 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6xbv2" event={"ID":"bbf52703-5083-4a00-a732-864efe21269f","Type":"ContainerDied","Data":"c261e4d0c71a2d44b8fccc74ed25a2644106f2f91983fb1b4e335f1366bcfec6"} Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.717349 4766 generic.go:334] "Generic (PLEG): container finished" podID="cee5abbb-6bd8-4f13-b224-bb3434877e72" containerID="75f02eafed39214ffa2fae29bc21ea5aedc9c11f99a3a84d457a57f87b46ddad" exitCode=0 Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.717393 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" event={"ID":"cee5abbb-6bd8-4f13-b224-bb3434877e72","Type":"ContainerDied","Data":"75f02eafed39214ffa2fae29bc21ea5aedc9c11f99a3a84d457a57f87b46ddad"} Oct 02 11:13:21 crc kubenswrapper[4766]: I1002 11:13:21.717422 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" event={"ID":"cee5abbb-6bd8-4f13-b224-bb3434877e72","Type":"ContainerStarted","Data":"64c1c014447d7078306625309a4e262149515dc6f443d0344bdaea19f1c28440"} Oct 02 11:13:24 crc kubenswrapper[4766]: I1002 11:13:24.431735 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:13:24 crc kubenswrapper[4766]: I1002 11:13:24.432118 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:13:24 crc 
kubenswrapper[4766]: I1002 11:13:24.432172 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 11:13:24 crc kubenswrapper[4766]: I1002 11:13:24.433006 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"586f742ea27e273779868792840bda390cd263c60dd6b64b6d933d49d83569e4"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:13:24 crc kubenswrapper[4766]: I1002 11:13:24.433072 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://586f742ea27e273779868792840bda390cd263c60dd6b64b6d933d49d83569e4" gracePeriod=600 Oct 02 11:13:26 crc kubenswrapper[4766]: I1002 11:13:26.767796 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="586f742ea27e273779868792840bda390cd263c60dd6b64b6d933d49d83569e4" exitCode=0 Oct 02 11:13:26 crc kubenswrapper[4766]: I1002 11:13:26.767964 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"586f742ea27e273779868792840bda390cd263c60dd6b64b6d933d49d83569e4"} Oct 02 11:13:26 crc kubenswrapper[4766]: I1002 11:13:26.768385 4766 scope.go:117] "RemoveContainer" containerID="e9d8027960aa5ff2fdb64c8c9c88c1508201265b3f2ec5d57d7c673e50cbb5eb" Oct 02 11:13:27 crc kubenswrapper[4766]: I1002 11:13:27.981557 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c7d8-account-create-mmkqm"] Oct 02 11:13:27 crc kubenswrapper[4766]: E1002 11:13:27.985160 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6935d6-839b-4f02-86f4-b79ad98cf891" containerName="mariadb-account-create" Oct 02 11:13:27 crc kubenswrapper[4766]: I1002 11:13:27.985242 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6935d6-839b-4f02-86f4-b79ad98cf891" containerName="mariadb-account-create" Oct 02 11:13:27 crc kubenswrapper[4766]: E1002 11:13:27.985304 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858ff098-bb65-418c-832e-9cd9d8cd75d6" containerName="mariadb-account-create" Oct 02 11:13:27 crc kubenswrapper[4766]: I1002 11:13:27.985354 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="858ff098-bb65-418c-832e-9cd9d8cd75d6" containerName="mariadb-account-create" Oct 02 11:13:27 crc kubenswrapper[4766]: I1002 11:13:27.985623 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6935d6-839b-4f02-86f4-b79ad98cf891" containerName="mariadb-account-create" Oct 02 11:13:27 crc kubenswrapper[4766]: I1002 11:13:27.985706 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="858ff098-bb65-418c-832e-9cd9d8cd75d6" containerName="mariadb-account-create" Oct 02 11:13:27 crc kubenswrapper[4766]: I1002 11:13:27.986304 4766 util.go:30] "No sandbox for pod can be found. 
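
The machine-config-daemon block here shows the complete liveness-failure path in order: the probe GET against 127.0.0.1:8798/health is refused, SyncLoop (probe) flags the pod unhealthy, kuberuntime kills container 586f742e... with gracePeriod=600, PLEG reports ContainerDied (exitCode 0, so the process handled SIGTERM), an older instance (e9d80279...) is pruned via RemoveContainer, and a replacement (3f9dff8c..., further down) starts. A minimal stand-in for the httpGet probe itself, using the endpoint from the log; health_ok is an illustrative helper and is only meaningful when run on the node:

```python
from urllib.request import urlopen

def health_ok(url="http://127.0.0.1:8798/health", timeout=1.0):
    """Approximation of an httpGet liveness probe: 2xx/3xx counts as success."""
    try:
        with urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except OSError:
        return False  # covers "connect: connection refused", as logged above
```
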
Need to start a new one" pod="openstack/neutron-c7d8-account-create-mmkqm" Oct 02 11:13:28 crc kubenswrapper[4766]: I1002 11:13:27.988735 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 02 11:13:28 crc kubenswrapper[4766]: I1002 11:13:27.992259 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c7d8-account-create-mmkqm"] Oct 02 11:13:28 crc kubenswrapper[4766]: I1002 11:13:28.027831 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z6gh\" (UniqueName: \"kubernetes.io/projected/d7037a9b-8f0f-4595-892e-3106080371d6-kube-api-access-4z6gh\") pod \"neutron-c7d8-account-create-mmkqm\" (UID: \"d7037a9b-8f0f-4595-892e-3106080371d6\") " pod="openstack/neutron-c7d8-account-create-mmkqm" Oct 02 11:13:28 crc kubenswrapper[4766]: I1002 11:13:28.129397 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z6gh\" (UniqueName: \"kubernetes.io/projected/d7037a9b-8f0f-4595-892e-3106080371d6-kube-api-access-4z6gh\") pod \"neutron-c7d8-account-create-mmkqm\" (UID: \"d7037a9b-8f0f-4595-892e-3106080371d6\") " pod="openstack/neutron-c7d8-account-create-mmkqm" Oct 02 11:13:28 crc kubenswrapper[4766]: I1002 11:13:28.149737 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z6gh\" (UniqueName: \"kubernetes.io/projected/d7037a9b-8f0f-4595-892e-3106080371d6-kube-api-access-4z6gh\") pod \"neutron-c7d8-account-create-mmkqm\" (UID: \"d7037a9b-8f0f-4595-892e-3106080371d6\") " pod="openstack/neutron-c7d8-account-create-mmkqm" Oct 02 11:13:28 crc kubenswrapper[4766]: I1002 11:13:28.351828 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c7d8-account-create-mmkqm" Oct 02 11:13:28 crc kubenswrapper[4766]: I1002 11:13:28.788906 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6xbv2" event={"ID":"bbf52703-5083-4a00-a732-864efe21269f","Type":"ContainerDied","Data":"8074dc812bcebe508647ac172920dc1f9320250710210232210cdcdf226c6a07"} Oct 02 11:13:28 crc kubenswrapper[4766]: I1002 11:13:28.789301 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8074dc812bcebe508647ac172920dc1f9320250710210232210cdcdf226c6a07" Oct 02 11:13:28 crc kubenswrapper[4766]: I1002 11:13:28.918301 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6xbv2" Oct 02 11:13:28 crc kubenswrapper[4766]: I1002 11:13:28.942749 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf52703-5083-4a00-a732-864efe21269f-config-data\") pod \"bbf52703-5083-4a00-a732-864efe21269f\" (UID: \"bbf52703-5083-4a00-a732-864efe21269f\") " Oct 02 11:13:28 crc kubenswrapper[4766]: I1002 11:13:28.984453 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf52703-5083-4a00-a732-864efe21269f-config-data" (OuterVolumeSpecName: "config-data") pod "bbf52703-5083-4a00-a732-864efe21269f" (UID: "bbf52703-5083-4a00-a732-864efe21269f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.044314 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cppl7\" (UniqueName: \"kubernetes.io/projected/bbf52703-5083-4a00-a732-864efe21269f-kube-api-access-cppl7\") pod \"bbf52703-5083-4a00-a732-864efe21269f\" (UID: \"bbf52703-5083-4a00-a732-864efe21269f\") " Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.044461 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf52703-5083-4a00-a732-864efe21269f-combined-ca-bundle\") pod \"bbf52703-5083-4a00-a732-864efe21269f\" (UID: \"bbf52703-5083-4a00-a732-864efe21269f\") " Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.044960 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbf52703-5083-4a00-a732-864efe21269f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.052106 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf52703-5083-4a00-a732-864efe21269f-kube-api-access-cppl7" (OuterVolumeSpecName: "kube-api-access-cppl7") pod "bbf52703-5083-4a00-a732-864efe21269f" (UID: "bbf52703-5083-4a00-a732-864efe21269f"). InnerVolumeSpecName "kube-api-access-cppl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.066192 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbf52703-5083-4a00-a732-864efe21269f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbf52703-5083-4a00-a732-864efe21269f" (UID: "bbf52703-5083-4a00-a732-864efe21269f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.146246 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbf52703-5083-4a00-a732-864efe21269f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.146289 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cppl7\" (UniqueName: \"kubernetes.io/projected/bbf52703-5083-4a00-a732-864efe21269f-kube-api-access-cppl7\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.207495 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c7d8-account-create-mmkqm"] Oct 02 11:13:29 crc kubenswrapper[4766]: W1002 11:13:29.217987 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7037a9b_8f0f_4595_892e_3106080371d6.slice/crio-14de686d1fcd64f3beecc39c79d4c98263da587e18955134e48856af3a5191eb WatchSource:0}: Error finding container 14de686d1fcd64f3beecc39c79d4c98263da587e18955134e48856af3a5191eb: Status 404 returned error can't find the container with id 14de686d1fcd64f3beecc39c79d4c98263da587e18955134e48856af3a5191eb Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.797492 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" event={"ID":"cee5abbb-6bd8-4f13-b224-bb3434877e72","Type":"ContainerStarted","Data":"ff5292127b48fcb9724826491a7a21d46faceeb71f76f7d288257a9a9352fb94"} Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.797909 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.800370 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e"} Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.802387 4766 generic.go:334] "Generic (PLEG): container finished" podID="d7037a9b-8f0f-4595-892e-3106080371d6" containerID="961541d1569b99b83eabdf54931372b404c77eeafc45c413f6b4937167c67180" exitCode=0 Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.802454 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c7d8-account-create-mmkqm" event={"ID":"d7037a9b-8f0f-4595-892e-3106080371d6","Type":"ContainerDied","Data":"961541d1569b99b83eabdf54931372b404c77eeafc45c413f6b4937167c67180"} Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.802479 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c7d8-account-create-mmkqm" event={"ID":"d7037a9b-8f0f-4595-892e-3106080371d6","Type":"ContainerStarted","Data":"14de686d1fcd64f3beecc39c79d4c98263da587e18955134e48856af3a5191eb"} Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.804256 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-6xbv2" Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.811184 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-k4x48" event={"ID":"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae","Type":"ContainerStarted","Data":"7b6649db0c3371deb879753f751b0d38b0339189eda5edb4f0fbad5ed847bc49"} Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.817351 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" podStartSLOduration=10.817330723 podStartE2EDuration="10.817330723s" podCreationTimestamp="2025-10-02 11:13:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:13:29.812722266 +0000 UTC m=+1324.755593240" watchObservedRunningTime="2025-10-02 11:13:29.817330723 +0000 UTC m=+1324.760201667" Oct 02 11:13:29 crc kubenswrapper[4766]: I1002 11:13:29.879424 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-k4x48" podStartSLOduration=4.885423772 podStartE2EDuration="18.879410241s" podCreationTimestamp="2025-10-02 11:13:11 +0000 UTC" firstStartedPulling="2025-10-02 11:13:15.025318902 +0000 UTC m=+1309.968189856" lastFinishedPulling="2025-10-02 11:13:29.019305371 +0000 UTC m=+1323.962176325" observedRunningTime="2025-10-02 11:13:29.875424764 +0000 UTC m=+1324.818295708" watchObservedRunningTime="2025-10-02 11:13:29.879410241 +0000 UTC m=+1324.822281185" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.160647 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vrk47"] Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.221241 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bs74v"] Oct 02 11:13:30 crc kubenswrapper[4766]: E1002 11:13:30.221767 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf52703-5083-4a00-a732-864efe21269f" containerName="keystone-db-sync" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.221791 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf52703-5083-4a00-a732-864efe21269f" containerName="keystone-db-sync" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.222089 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf52703-5083-4a00-a732-864efe21269f" containerName="keystone-db-sync" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.222836 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.227904 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.227921 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.228079 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kf9gx" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.228028 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.239839 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-hff7c"] Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.241395 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.252574 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bs74v"] Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.271304 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-fernet-keys\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.271344 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6xtf\" (UniqueName: \"kubernetes.io/projected/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-kube-api-access-c6xtf\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.271366 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-config-data\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.271383 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-credential-keys\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.271405 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.271425 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-combined-ca-bundle\") pod \"keystone-bootstrap-bs74v\" 
(UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.271444 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-dns-svc\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.271540 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-scripts\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.271566 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.271584 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvn4g\" (UniqueName: \"kubernetes.io/projected/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-kube-api-access-gvn4g\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.271605 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.271619 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-config\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.290933 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-hff7c"] Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.375168 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-scripts\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.375234 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.375273 4766 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-gvn4g\" (UniqueName: \"kubernetes.io/projected/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-kube-api-access-gvn4g\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.375306 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.375329 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-config\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.375387 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-fernet-keys\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.375408 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6xtf\" (UniqueName: \"kubernetes.io/projected/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-kube-api-access-c6xtf\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.375440 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-config-data\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.375908 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-credential-keys\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.375953 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.375992 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-combined-ca-bundle\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.376013 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-dns-svc\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.379397 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.380544 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-dns-svc\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.384313 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.384942 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.385114 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-config\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.386257 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-fernet-keys\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.387700 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-scripts\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.388160 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-combined-ca-bundle\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.392942 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-credential-keys\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " 
pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.409481 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6xtf\" (UniqueName: \"kubernetes.io/projected/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-kube-api-access-c6xtf\") pod \"dnsmasq-dns-55fff446b9-hff7c\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.414163 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-config-data\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.434427 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.440182 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.449996 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.450378 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.463285 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.475488 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-s5j64"] Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.493714 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvn4g\" (UniqueName: \"kubernetes.io/projected/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-kube-api-access-gvn4g\") pod \"keystone-bootstrap-bs74v\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.508775 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e987f27f-69d6-4f1e-a9a2-486638ab4505-run-httpd\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.508841 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.508880 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zmcx\" (UniqueName: \"kubernetes.io/projected/e987f27f-69d6-4f1e-a9a2-486638ab4505-kube-api-access-4zmcx\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.508942 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-scripts\") pod \"ceilometer-0\" (UID: 
\"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.508980 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e987f27f-69d6-4f1e-a9a2-486638ab4505-log-httpd\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.509026 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-config-data\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.509047 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.518328 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.530960 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.553618 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.559651 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9bdhh" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.560028 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.574363 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.613278 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-s5j64"] Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.627644 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m79rr\" (UniqueName: \"kubernetes.io/projected/12786f1e-db55-4668-8e43-afa080dc0fa2-kube-api-access-m79rr\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.627695 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.627725 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12786f1e-db55-4668-8e43-afa080dc0fa2-etc-machine-id\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.627748 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-db-sync-config-data\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.627766 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zmcx\" (UniqueName: \"kubernetes.io/projected/e987f27f-69d6-4f1e-a9a2-486638ab4505-kube-api-access-4zmcx\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.627978 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-scripts\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.630520 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e987f27f-69d6-4f1e-a9a2-486638ab4505-log-httpd\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.630601 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-combined-ca-bundle\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.630936 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-ztpl6"] Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.630977 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-scripts\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.631033 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-config-data\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.632454 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ztpl6" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.632844 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e987f27f-69d6-4f1e-a9a2-486638ab4505-log-httpd\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.633215 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.633275 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-config-data\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.633351 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e987f27f-69d6-4f1e-a9a2-486638ab4505-run-httpd\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.633722 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e987f27f-69d6-4f1e-a9a2-486638ab4505-run-httpd\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.640095 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.640488 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nw7pr" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.641147 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.641359 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " 
pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.646982 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-scripts\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.658784 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-hff7c"] Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.658939 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-config-data\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.667870 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ztpl6"] Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.673017 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zmcx\" (UniqueName: \"kubernetes.io/projected/e987f27f-69d6-4f1e-a9a2-486638ab4505-kube-api-access-4zmcx\") pod \"ceilometer-0\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.711119 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-wjzg5"] Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.712632 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wjzg5" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.721805 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-r9rtd" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.721958 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.722437 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.727288 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.734676 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m79rr\" (UniqueName: \"kubernetes.io/projected/12786f1e-db55-4668-8e43-afa080dc0fa2-kube-api-access-m79rr\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.734726 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12786f1e-db55-4668-8e43-afa080dc0fa2-etc-machine-id\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.734756 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6bhm\" (UniqueName: \"kubernetes.io/projected/58a480c5-a9e3-46da-b3df-4d73473d4b12-kube-api-access-j6bhm\") pod \"barbican-db-sync-ztpl6\" (UID: \"58a480c5-a9e3-46da-b3df-4d73473d4b12\") " pod="openstack/barbican-db-sync-ztpl6" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.734775 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-db-sync-config-data\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.734799 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-logs\") pod \"placement-db-sync-wjzg5\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " pod="openstack/placement-db-sync-wjzg5" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.734821 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-scripts\") pod \"placement-db-sync-wjzg5\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " pod="openstack/placement-db-sync-wjzg5" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.734837 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/58a480c5-a9e3-46da-b3df-4d73473d4b12-db-sync-config-data\") pod \"barbican-db-sync-ztpl6\" (UID: \"58a480c5-a9e3-46da-b3df-4d73473d4b12\") " pod="openstack/barbican-db-sync-ztpl6" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.734860 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-config-data\") pod \"placement-db-sync-wjzg5\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " pod="openstack/placement-db-sync-wjzg5" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.734876 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a480c5-a9e3-46da-b3df-4d73473d4b12-combined-ca-bundle\") pod \"barbican-db-sync-ztpl6\" (UID: \"58a480c5-a9e3-46da-b3df-4d73473d4b12\") " 
pod="openstack/barbican-db-sync-ztpl6" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.734904 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl7wb\" (UniqueName: \"kubernetes.io/projected/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-kube-api-access-hl7wb\") pod \"placement-db-sync-wjzg5\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " pod="openstack/placement-db-sync-wjzg5" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.734976 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-combined-ca-bundle\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.735024 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-scripts\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.735053 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-config-data\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.735070 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-combined-ca-bundle\") pod \"placement-db-sync-wjzg5\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " pod="openstack/placement-db-sync-wjzg5" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.735352 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12786f1e-db55-4668-8e43-afa080dc0fa2-etc-machine-id\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.739064 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-combined-ca-bundle\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.739115 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wjzg5"] Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.740037 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-scripts\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.741588 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-db-sync-config-data\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " 
pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.745580 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-config-data\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.767073 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m79rr\" (UniqueName: \"kubernetes.io/projected/12786f1e-db55-4668-8e43-afa080dc0fa2-kube-api-access-m79rr\") pod \"cinder-db-sync-s5j64\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.777488 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mnx84"] Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.780460 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.790656 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mnx84"] Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.838575 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl7wb\" (UniqueName: \"kubernetes.io/projected/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-kube-api-access-hl7wb\") pod \"placement-db-sync-wjzg5\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " pod="openstack/placement-db-sync-wjzg5" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.838727 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc8xg\" (UniqueName: \"kubernetes.io/projected/577aa82c-8c30-4044-9d1a-ff88cc3f390a-kube-api-access-tc8xg\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.838770 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.838816 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-combined-ca-bundle\") pod \"placement-db-sync-wjzg5\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " pod="openstack/placement-db-sync-wjzg5" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.838848 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.838942 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6bhm\" (UniqueName: \"kubernetes.io/projected/58a480c5-a9e3-46da-b3df-4d73473d4b12-kube-api-access-j6bhm\") 
pod \"barbican-db-sync-ztpl6\" (UID: \"58a480c5-a9e3-46da-b3df-4d73473d4b12\") " pod="openstack/barbican-db-sync-ztpl6" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.838998 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-config\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.839024 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-logs\") pod \"placement-db-sync-wjzg5\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " pod="openstack/placement-db-sync-wjzg5" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.839048 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.839078 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.839105 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-scripts\") pod \"placement-db-sync-wjzg5\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " pod="openstack/placement-db-sync-wjzg5" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.839128 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/58a480c5-a9e3-46da-b3df-4d73473d4b12-db-sync-config-data\") pod \"barbican-db-sync-ztpl6\" (UID: \"58a480c5-a9e3-46da-b3df-4d73473d4b12\") " pod="openstack/barbican-db-sync-ztpl6" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.839217 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-config-data\") pod \"placement-db-sync-wjzg5\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " pod="openstack/placement-db-sync-wjzg5" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.842348 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a480c5-a9e3-46da-b3df-4d73473d4b12-combined-ca-bundle\") pod \"barbican-db-sync-ztpl6\" (UID: \"58a480c5-a9e3-46da-b3df-4d73473d4b12\") " pod="openstack/barbican-db-sync-ztpl6" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.843984 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-logs\") pod \"placement-db-sync-wjzg5\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " pod="openstack/placement-db-sync-wjzg5" Oct 02 
11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.846524 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a480c5-a9e3-46da-b3df-4d73473d4b12-combined-ca-bundle\") pod \"barbican-db-sync-ztpl6\" (UID: \"58a480c5-a9e3-46da-b3df-4d73473d4b12\") " pod="openstack/barbican-db-sync-ztpl6" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.847353 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/58a480c5-a9e3-46da-b3df-4d73473d4b12-db-sync-config-data\") pod \"barbican-db-sync-ztpl6\" (UID: \"58a480c5-a9e3-46da-b3df-4d73473d4b12\") " pod="openstack/barbican-db-sync-ztpl6" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.851051 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-config-data\") pod \"placement-db-sync-wjzg5\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " pod="openstack/placement-db-sync-wjzg5" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.851362 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-scripts\") pod \"placement-db-sync-wjzg5\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " pod="openstack/placement-db-sync-wjzg5" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.859095 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-combined-ca-bundle\") pod \"placement-db-sync-wjzg5\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " pod="openstack/placement-db-sync-wjzg5" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.881356 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl7wb\" (UniqueName: \"kubernetes.io/projected/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-kube-api-access-hl7wb\") pod \"placement-db-sync-wjzg5\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " pod="openstack/placement-db-sync-wjzg5" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.890064 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6bhm\" (UniqueName: \"kubernetes.io/projected/58a480c5-a9e3-46da-b3df-4d73473d4b12-kube-api-access-j6bhm\") pod \"barbican-db-sync-ztpl6\" (UID: \"58a480c5-a9e3-46da-b3df-4d73473d4b12\") " pod="openstack/barbican-db-sync-ztpl6" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.950719 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc8xg\" (UniqueName: \"kubernetes.io/projected/577aa82c-8c30-4044-9d1a-ff88cc3f390a-kube-api-access-tc8xg\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.950769 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.950820 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.950983 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-config\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.951004 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.951028 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.952357 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-config\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.952877 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.953217 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.953386 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:30 crc kubenswrapper[4766]: I1002 11:13:30.954105 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.005153 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc8xg\" (UniqueName: \"kubernetes.io/projected/577aa82c-8c30-4044-9d1a-ff88cc3f390a-kube-api-access-tc8xg\") pod 
\"dnsmasq-dns-76fcf4b695-mnx84\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.057065 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s5j64" Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.073399 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ztpl6" Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.091370 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wjzg5" Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.100318 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.349435 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-hff7c"] Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.447276 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bs74v"] Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.629353 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.732968 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c7d8-account-create-mmkqm" Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.766437 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z6gh\" (UniqueName: \"kubernetes.io/projected/d7037a9b-8f0f-4595-892e-3106080371d6-kube-api-access-4z6gh\") pod \"d7037a9b-8f0f-4595-892e-3106080371d6\" (UID: \"d7037a9b-8f0f-4595-892e-3106080371d6\") " Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.774375 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7037a9b-8f0f-4595-892e-3106080371d6-kube-api-access-4z6gh" (OuterVolumeSpecName: "kube-api-access-4z6gh") pod "d7037a9b-8f0f-4595-892e-3106080371d6" (UID: "d7037a9b-8f0f-4595-892e-3106080371d6"). InnerVolumeSpecName "kube-api-access-4z6gh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.837455 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e987f27f-69d6-4f1e-a9a2-486638ab4505","Type":"ContainerStarted","Data":"bab85da1103e917e04501474b774a23c998e8a43dcdd1094b3e1b2d6059bba31"} Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.839260 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-hff7c" event={"ID":"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9","Type":"ContainerStarted","Data":"d95465ba2b042de63aeb8ba24dc9b1c1d8d945e5261b79f946bcc0cdb2af94c1"} Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.844629 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c7d8-account-create-mmkqm" event={"ID":"d7037a9b-8f0f-4595-892e-3106080371d6","Type":"ContainerDied","Data":"14de686d1fcd64f3beecc39c79d4c98263da587e18955134e48856af3a5191eb"} Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.844657 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14de686d1fcd64f3beecc39c79d4c98263da587e18955134e48856af3a5191eb" Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.844710 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c7d8-account-create-mmkqm" Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.849946 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bs74v" event={"ID":"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa","Type":"ContainerStarted","Data":"9cabb28ab7818c7cf0e3fed7afdd263c5dcbfba22be76f43af0e49d17cf31249"} Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.850108 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" podUID="cee5abbb-6bd8-4f13-b224-bb3434877e72" containerName="dnsmasq-dns" containerID="cri-o://ff5292127b48fcb9724826491a7a21d46faceeb71f76f7d288257a9a9352fb94" gracePeriod=10 Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.868719 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z6gh\" (UniqueName: \"kubernetes.io/projected/d7037a9b-8f0f-4595-892e-3106080371d6-kube-api-access-4z6gh\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.982464 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-s5j64"] Oct 02 11:13:31 crc kubenswrapper[4766]: I1002 11:13:31.992340 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wjzg5"] Oct 02 11:13:32 crc kubenswrapper[4766]: I1002 11:13:32.004722 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ztpl6"] Oct 02 11:13:32 crc kubenswrapper[4766]: I1002 11:13:32.021897 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mnx84"] Oct 02 11:13:32 crc kubenswrapper[4766]: W1002 11:13:32.076771 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c61ea8c_4cfc_4f0d_97eb_d33c62117db2.slice/crio-5a3d8ae6d77d6bc06c64380ec9f4d8859131be4a4a03317deb006306f4acd50d WatchSource:0}: Error finding container 5a3d8ae6d77d6bc06c64380ec9f4d8859131be4a4a03317deb006306f4acd50d: Status 404 returned error can't find the container with id 5a3d8ae6d77d6bc06c64380ec9f4d8859131be4a4a03317deb006306f4acd50d Oct 02 11:13:32 crc 
kubenswrapper[4766]: I1002 11:13:32.230242 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:13:32 crc kubenswrapper[4766]: E1002 11:13:32.410403 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda27e8f91_2df0_4f54_b6df_2eaf2187b5a9.slice/crio-conmon-2e18ee546248ef917f6fb04c28136f2e331fa1c97406bef7069ad883b797b249.scope\": RecentStats: unable to find data in memory cache]" Oct 02 11:13:32 crc kubenswrapper[4766]: I1002 11:13:32.875828 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bs74v" event={"ID":"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa","Type":"ContainerStarted","Data":"68d08e12b25020fbe30c5d0dfd77d7b48a170b4b187be4c4958856f980dbd584"} Oct 02 11:13:32 crc kubenswrapper[4766]: I1002 11:13:32.878228 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wjzg5" event={"ID":"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2","Type":"ContainerStarted","Data":"5a3d8ae6d77d6bc06c64380ec9f4d8859131be4a4a03317deb006306f4acd50d"} Oct 02 11:13:32 crc kubenswrapper[4766]: I1002 11:13:32.882861 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ztpl6" event={"ID":"58a480c5-a9e3-46da-b3df-4d73473d4b12","Type":"ContainerStarted","Data":"1a4f39d2e61a7af2ebbe64e3390ece1d3e90ce95f2e4470e501babbe65a8a0bd"} Oct 02 11:13:32 crc kubenswrapper[4766]: I1002 11:13:32.885556 4766 generic.go:334] "Generic (PLEG): container finished" podID="cee5abbb-6bd8-4f13-b224-bb3434877e72" containerID="ff5292127b48fcb9724826491a7a21d46faceeb71f76f7d288257a9a9352fb94" exitCode=0 Oct 02 11:13:32 crc kubenswrapper[4766]: I1002 11:13:32.885625 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" event={"ID":"cee5abbb-6bd8-4f13-b224-bb3434877e72","Type":"ContainerDied","Data":"ff5292127b48fcb9724826491a7a21d46faceeb71f76f7d288257a9a9352fb94"} Oct 02 11:13:32 crc kubenswrapper[4766]: I1002 11:13:32.890424 4766 generic.go:334] "Generic (PLEG): container finished" podID="a27e8f91-2df0-4f54-b6df-2eaf2187b5a9" containerID="2e18ee546248ef917f6fb04c28136f2e331fa1c97406bef7069ad883b797b249" exitCode=0 Oct 02 11:13:32 crc kubenswrapper[4766]: I1002 11:13:32.890488 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-hff7c" event={"ID":"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9","Type":"ContainerDied","Data":"2e18ee546248ef917f6fb04c28136f2e331fa1c97406bef7069ad883b797b249"} Oct 02 11:13:32 crc kubenswrapper[4766]: I1002 11:13:32.904475 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bs74v" podStartSLOduration=2.9044530379999998 podStartE2EDuration="2.904453038s" podCreationTimestamp="2025-10-02 11:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:13:32.900420399 +0000 UTC m=+1327.843291353" watchObservedRunningTime="2025-10-02 11:13:32.904453038 +0000 UTC m=+1327.847323982" Oct 02 11:13:32 crc kubenswrapper[4766]: I1002 11:13:32.906555 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s5j64" event={"ID":"12786f1e-db55-4668-8e43-afa080dc0fa2","Type":"ContainerStarted","Data":"17eb98bc53a15ec1cfa13fd3ee84be9351bd857e450c1f79a5d5f052cfe92cd6"} Oct 02 11:13:32 crc kubenswrapper[4766]: I1002 
11:13:32.910119 4766 generic.go:334] "Generic (PLEG): container finished" podID="577aa82c-8c30-4044-9d1a-ff88cc3f390a" containerID="10d7d4fd71d139575da833850c09e54e2286d6072e71abd8fda27847693d7b1e" exitCode=0 Oct 02 11:13:32 crc kubenswrapper[4766]: I1002 11:13:32.910149 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" event={"ID":"577aa82c-8c30-4044-9d1a-ff88cc3f390a","Type":"ContainerDied","Data":"10d7d4fd71d139575da833850c09e54e2286d6072e71abd8fda27847693d7b1e"} Oct 02 11:13:32 crc kubenswrapper[4766]: I1002 11:13:32.910165 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" event={"ID":"577aa82c-8c30-4044-9d1a-ff88cc3f390a","Type":"ContainerStarted","Data":"761a3baefc1408afe92b53cb7e2c36d012bb9fd45b4458579b58a7df3ce1a0f9"} Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.153631 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.209524 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-dns-svc\") pod \"cee5abbb-6bd8-4f13-b224-bb3434877e72\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.212868 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-ovsdbserver-sb\") pod \"cee5abbb-6bd8-4f13-b224-bb3434877e72\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.214202 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-ovsdbserver-nb\") pod \"cee5abbb-6bd8-4f13-b224-bb3434877e72\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.214242 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-config\") pod \"cee5abbb-6bd8-4f13-b224-bb3434877e72\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.214274 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-dns-swift-storage-0\") pod \"cee5abbb-6bd8-4f13-b224-bb3434877e72\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.214526 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmbz9\" (UniqueName: \"kubernetes.io/projected/cee5abbb-6bd8-4f13-b224-bb3434877e72-kube-api-access-fmbz9\") pod \"cee5abbb-6bd8-4f13-b224-bb3434877e72\" (UID: \"cee5abbb-6bd8-4f13-b224-bb3434877e72\") " Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.223080 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cee5abbb-6bd8-4f13-b224-bb3434877e72-kube-api-access-fmbz9" (OuterVolumeSpecName: "kube-api-access-fmbz9") pod "cee5abbb-6bd8-4f13-b224-bb3434877e72" (UID: "cee5abbb-6bd8-4f13-b224-bb3434877e72"). InnerVolumeSpecName "kube-api-access-fmbz9". 
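Annotation: the volume teardown around this point follows a fixed three-step pattern per volume: "operationExecutor.UnmountVolume started" (reconciler_common.go:159), "UnmountVolume.TearDown succeeded" naming the OuterVolumeSpecName (operation_generator.go:803), and finally "Volume detached" (reconciler_common.go:293). A minimal sketch for checking that pairing offline, assuming the journal excerpt has been saved to a plain-text file; the kubelet.log filename is hypothetical, and the state map is keyed by volume name only (a real tool would also key on pod UID):

```go
// pair_unmounts.go - sketch: pair "UnmountVolume started" entries with their
// "TearDown succeeded" / "Volume detached" counterparts in a saved kubelet log.
package main

import (
	"bufio"
	"fmt"
	"log"
	"os"
	"regexp"
)

var (
	// In the raw journal text the volume name appears with escaped quotes: \"name\".
	startRe    = regexp.MustCompile(`UnmountVolume started for volume \\"([^"\\]+)\\"`)
	teardownRe = regexp.MustCompile(`UnmountVolume\.TearDown succeeded .*\(OuterVolumeSpecName: "([^"]+)"\)`)
	detachRe   = regexp.MustCompile(`Volume detached for volume \\"([^"\\]+)\\"`)
)

func main() {
	f, err := os.Open("kubelet.log") // hypothetical filename
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	state := map[string]string{} // volume name -> last observed phase
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		line := sc.Text()
		if m := startRe.FindStringSubmatch(line); m != nil {
			state[m[1]] = "unmount started"
		}
		if m := teardownRe.FindStringSubmatch(line); m != nil {
			state[m[1]] = "teardown succeeded"
		}
		if m := detachRe.FindStringSubmatch(line); m != nil {
			state[m[1]] = "detached"
		}
	}
	if err := sc.Err(); err != nil {
		log.Fatal(err)
	}
	for vol, phase := range state {
		fmt.Printf("%-25s %s\n", vol, phase)
	}
}
```

Any volume left in a phase other than "detached" at the end of the excerpt would indicate a teardown that never completed.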
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.309561 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-xc8gb"] Oct 02 11:13:33 crc kubenswrapper[4766]: E1002 11:13:33.310009 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7037a9b-8f0f-4595-892e-3106080371d6" containerName="mariadb-account-create" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.310026 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7037a9b-8f0f-4595-892e-3106080371d6" containerName="mariadb-account-create" Oct 02 11:13:33 crc kubenswrapper[4766]: E1002 11:13:33.310043 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee5abbb-6bd8-4f13-b224-bb3434877e72" containerName="dnsmasq-dns" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.310051 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee5abbb-6bd8-4f13-b224-bb3434877e72" containerName="dnsmasq-dns" Oct 02 11:13:33 crc kubenswrapper[4766]: E1002 11:13:33.310064 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cee5abbb-6bd8-4f13-b224-bb3434877e72" containerName="init" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.310071 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee5abbb-6bd8-4f13-b224-bb3434877e72" containerName="init" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.310285 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7037a9b-8f0f-4595-892e-3106080371d6" containerName="mariadb-account-create" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.310302 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="cee5abbb-6bd8-4f13-b224-bb3434877e72" containerName="dnsmasq-dns" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.312101 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-xc8gb" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.317620 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.317837 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.318003 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmbz9\" (UniqueName: \"kubernetes.io/projected/cee5abbb-6bd8-4f13-b224-bb3434877e72-kube-api-access-fmbz9\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.318448 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-25rq9" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.320973 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.323877 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-xc8gb"] Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.419245 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-dns-swift-storage-0\") pod \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.420496 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6xtf\" (UniqueName: \"kubernetes.io/projected/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-kube-api-access-c6xtf\") pod \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.420933 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-ovsdbserver-sb\") pod \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.421061 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-config\") pod \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.421204 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-ovsdbserver-nb\") pod \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.421471 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-dns-svc\") pod \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\" (UID: \"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9\") " Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.421996 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkrdr\" (UniqueName: \"kubernetes.io/projected/fea98489-bbfa-4490-9e89-40a19bfb594f-kube-api-access-tkrdr\") pod \"neutron-db-sync-xc8gb\" (UID: \"fea98489-bbfa-4490-9e89-40a19bfb594f\") " pod="openstack/neutron-db-sync-xc8gb" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.422584 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fea98489-bbfa-4490-9e89-40a19bfb594f-config\") pod \"neutron-db-sync-xc8gb\" (UID: \"fea98489-bbfa-4490-9e89-40a19bfb594f\") " pod="openstack/neutron-db-sync-xc8gb" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.422716 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea98489-bbfa-4490-9e89-40a19bfb594f-combined-ca-bundle\") pod \"neutron-db-sync-xc8gb\" (UID: \"fea98489-bbfa-4490-9e89-40a19bfb594f\") " pod="openstack/neutron-db-sync-xc8gb" Oct 02 11:13:33 crc 
kubenswrapper[4766]: I1002 11:13:33.424305 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-config" (OuterVolumeSpecName: "config") pod "cee5abbb-6bd8-4f13-b224-bb3434877e72" (UID: "cee5abbb-6bd8-4f13-b224-bb3434877e72"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.426829 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-kube-api-access-c6xtf" (OuterVolumeSpecName: "kube-api-access-c6xtf") pod "a27e8f91-2df0-4f54-b6df-2eaf2187b5a9" (UID: "a27e8f91-2df0-4f54-b6df-2eaf2187b5a9"). InnerVolumeSpecName "kube-api-access-c6xtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.441108 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cee5abbb-6bd8-4f13-b224-bb3434877e72" (UID: "cee5abbb-6bd8-4f13-b224-bb3434877e72"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.444830 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cee5abbb-6bd8-4f13-b224-bb3434877e72" (UID: "cee5abbb-6bd8-4f13-b224-bb3434877e72"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.457304 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a27e8f91-2df0-4f54-b6df-2eaf2187b5a9" (UID: "a27e8f91-2df0-4f54-b6df-2eaf2187b5a9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.464268 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a27e8f91-2df0-4f54-b6df-2eaf2187b5a9" (UID: "a27e8f91-2df0-4f54-b6df-2eaf2187b5a9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.479158 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a27e8f91-2df0-4f54-b6df-2eaf2187b5a9" (UID: "a27e8f91-2df0-4f54-b6df-2eaf2187b5a9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.483818 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cee5abbb-6bd8-4f13-b224-bb3434877e72" (UID: "cee5abbb-6bd8-4f13-b224-bb3434877e72"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.486680 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cee5abbb-6bd8-4f13-b224-bb3434877e72" (UID: "cee5abbb-6bd8-4f13-b224-bb3434877e72"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.486989 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a27e8f91-2df0-4f54-b6df-2eaf2187b5a9" (UID: "a27e8f91-2df0-4f54-b6df-2eaf2187b5a9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.490551 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-config" (OuterVolumeSpecName: "config") pod "a27e8f91-2df0-4f54-b6df-2eaf2187b5a9" (UID: "a27e8f91-2df0-4f54-b6df-2eaf2187b5a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.523860 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fea98489-bbfa-4490-9e89-40a19bfb594f-config\") pod \"neutron-db-sync-xc8gb\" (UID: \"fea98489-bbfa-4490-9e89-40a19bfb594f\") " pod="openstack/neutron-db-sync-xc8gb" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.523914 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea98489-bbfa-4490-9e89-40a19bfb594f-combined-ca-bundle\") pod \"neutron-db-sync-xc8gb\" (UID: \"fea98489-bbfa-4490-9e89-40a19bfb594f\") " pod="openstack/neutron-db-sync-xc8gb" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.524234 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkrdr\" (UniqueName: \"kubernetes.io/projected/fea98489-bbfa-4490-9e89-40a19bfb594f-kube-api-access-tkrdr\") pod \"neutron-db-sync-xc8gb\" (UID: \"fea98489-bbfa-4490-9e89-40a19bfb594f\") " pod="openstack/neutron-db-sync-xc8gb" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.524328 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.524339 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.524347 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.524461 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 
02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.524497 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.524521 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.524534 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cee5abbb-6bd8-4f13-b224-bb3434877e72-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.524549 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6xtf\" (UniqueName: \"kubernetes.io/projected/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-kube-api-access-c6xtf\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.524563 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.524577 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.524588 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.528753 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea98489-bbfa-4490-9e89-40a19bfb594f-combined-ca-bundle\") pod \"neutron-db-sync-xc8gb\" (UID: \"fea98489-bbfa-4490-9e89-40a19bfb594f\") " pod="openstack/neutron-db-sync-xc8gb" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.534056 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fea98489-bbfa-4490-9e89-40a19bfb594f-config\") pod \"neutron-db-sync-xc8gb\" (UID: \"fea98489-bbfa-4490-9e89-40a19bfb594f\") " pod="openstack/neutron-db-sync-xc8gb" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.545456 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkrdr\" (UniqueName: \"kubernetes.io/projected/fea98489-bbfa-4490-9e89-40a19bfb594f-kube-api-access-tkrdr\") pod \"neutron-db-sync-xc8gb\" (UID: \"fea98489-bbfa-4490-9e89-40a19bfb594f\") " pod="openstack/neutron-db-sync-xc8gb" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.679041 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-xc8gb" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.966015 4766 util.go:48] "No ready sandbox for pod can be found. 
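Annotation: each secret mount for neutron-db-sync-xc8gb appears twice above, as "MountVolume started" (reconciler_common.go:218) and "MountVolume.SetUp succeeded" (operation_generator.go:637), each carrying a klog timestamp. A sketch measuring the gap per volume from a saved excerpt; the kubelet.log filename is hypothetical, and klog headers omit the year, which is fine for same-day deltas:

```go
// mount_latency.go - sketch: gap between "MountVolume started" and
// "MountVolume.SetUp succeeded" per volume in a saved kubelet log.
package main

import (
	"bufio"
	"fmt"
	"log"
	"os"
	"regexp"
	"time"
)

var (
	klogTime   = regexp.MustCompile(`[IWE](\d{4} \d{2}:\d{2}:\d{2}\.\d{6})`)
	mountStart = regexp.MustCompile(`MountVolume started for volume \\"([^"\\]+)\\"`)
	mountDone  = regexp.MustCompile(`MountVolume\.SetUp succeeded for volume \\"([^"\\]+)\\"`)
)

// stamp extracts the klog header time, e.g. "I1002 11:13:33.523860".
func stamp(line string) (time.Time, bool) {
	m := klogTime.FindStringSubmatch(line)
	if m == nil {
		return time.Time{}, false
	}
	t, err := time.Parse("0102 15:04:05.000000", m[1]) // year-less; deltas only
	return t, err == nil
}

func main() {
	f, err := os.Open("kubelet.log") // hypothetical filename
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()
	started := map[string]time.Time{}
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		line := sc.Text()
		t, ok := stamp(line)
		if !ok {
			continue
		}
		if m := mountStart.FindStringSubmatch(line); m != nil {
			started[m[1]] = t
		} else if m := mountDone.FindStringSubmatch(line); m != nil {
			if t0, seen := started[m[1]]; seen {
				fmt.Printf("%-20s mounted in %v\n", m[1], t.Sub(t0))
			}
		}
	}
	if err := sc.Err(); err != nil {
		log.Fatal(err)
	}
}
```

For the three neutron-db-sync-xc8gb volumes above, the gaps are on the order of 5-25 ms (e.g. config: .523860 to .534056).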
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.966024 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-vrk47" event={"ID":"cee5abbb-6bd8-4f13-b224-bb3434877e72","Type":"ContainerDied","Data":"64c1c014447d7078306625309a4e262149515dc6f443d0344bdaea19f1c28440"} Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.966751 4766 scope.go:117] "RemoveContainer" containerID="ff5292127b48fcb9724826491a7a21d46faceeb71f76f7d288257a9a9352fb94" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.976998 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-hff7c" event={"ID":"a27e8f91-2df0-4f54-b6df-2eaf2187b5a9","Type":"ContainerDied","Data":"d95465ba2b042de63aeb8ba24dc9b1c1d8d945e5261b79f946bcc0cdb2af94c1"} Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.977030 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-hff7c" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.988004 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" event={"ID":"577aa82c-8c30-4044-9d1a-ff88cc3f390a","Type":"ContainerStarted","Data":"b264c9db174673bc4ffbf1f294738343f7e8f59f993f08f7e6b0bb265cf2db85"} Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.988556 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:33 crc kubenswrapper[4766]: I1002 11:13:33.997029 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vrk47"] Oct 02 11:13:34 crc kubenswrapper[4766]: I1002 11:13:34.005335 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-vrk47"] Oct 02 11:13:34 crc kubenswrapper[4766]: I1002 11:13:34.012670 4766 scope.go:117] "RemoveContainer" containerID="75f02eafed39214ffa2fae29bc21ea5aedc9c11f99a3a84d457a57f87b46ddad" Oct 02 11:13:34 crc kubenswrapper[4766]: I1002 11:13:34.057332 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-hff7c"] Oct 02 11:13:34 crc kubenswrapper[4766]: I1002 11:13:34.065356 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-hff7c"] Oct 02 11:13:34 crc kubenswrapper[4766]: I1002 11:13:34.065696 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" podStartSLOduration=4.065676871 podStartE2EDuration="4.065676871s" podCreationTimestamp="2025-10-02 11:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:13:34.051688235 +0000 UTC m=+1328.994559179" watchObservedRunningTime="2025-10-02 11:13:34.065676871 +0000 UTC m=+1329.008547815" Oct 02 11:13:34 crc kubenswrapper[4766]: I1002 11:13:34.067638 4766 scope.go:117] "RemoveContainer" containerID="2e18ee546248ef917f6fb04c28136f2e331fa1c97406bef7069ad883b797b249" Oct 02 11:13:34 crc kubenswrapper[4766]: I1002 11:13:34.149203 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-xc8gb"] Oct 02 11:13:34 crc kubenswrapper[4766]: W1002 11:13:34.157161 4766 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfea98489_bbfa_4490_9e89_40a19bfb594f.slice/crio-dd66c1dd9497b9bd79d7b172ac20ef09890d52077ca50fd87192e43a1a24124c WatchSource:0}: Error finding container dd66c1dd9497b9bd79d7b172ac20ef09890d52077ca50fd87192e43a1a24124c: Status 404 returned error can't find the container with id dd66c1dd9497b9bd79d7b172ac20ef09890d52077ca50fd87192e43a1a24124c Oct 02 11:13:34 crc kubenswrapper[4766]: I1002 11:13:34.996588 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xc8gb" event={"ID":"fea98489-bbfa-4490-9e89-40a19bfb594f","Type":"ContainerStarted","Data":"edbb3da01856c2a41dd64eefb64dc0d38f95f5a264b444f1dbcac1d668a8635d"} Oct 02 11:13:34 crc kubenswrapper[4766]: I1002 11:13:34.996816 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xc8gb" event={"ID":"fea98489-bbfa-4490-9e89-40a19bfb594f","Type":"ContainerStarted","Data":"dd66c1dd9497b9bd79d7b172ac20ef09890d52077ca50fd87192e43a1a24124c"} Oct 02 11:13:35 crc kubenswrapper[4766]: I1002 11:13:35.016470 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-xc8gb" podStartSLOduration=2.016451778 podStartE2EDuration="2.016451778s" podCreationTimestamp="2025-10-02 11:13:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:13:35.011424939 +0000 UTC m=+1329.954295903" watchObservedRunningTime="2025-10-02 11:13:35.016451778 +0000 UTC m=+1329.959322722" Oct 02 11:13:35 crc kubenswrapper[4766]: I1002 11:13:35.893595 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a27e8f91-2df0-4f54-b6df-2eaf2187b5a9" path="/var/lib/kubelet/pods/a27e8f91-2df0-4f54-b6df-2eaf2187b5a9/volumes" Oct 02 11:13:35 crc kubenswrapper[4766]: I1002 11:13:35.894198 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cee5abbb-6bd8-4f13-b224-bb3434877e72" path="/var/lib/kubelet/pods/cee5abbb-6bd8-4f13-b224-bb3434877e72/volumes" Oct 02 11:13:41 crc kubenswrapper[4766]: I1002 11:13:41.101729 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:13:41 crc kubenswrapper[4766]: I1002 11:13:41.153876 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-t7x8d"] Oct 02 11:13:41 crc kubenswrapper[4766]: I1002 11:13:41.154146 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-t7x8d" podUID="f2569c54-b0a8-456b-b311-264d6605d4ed" containerName="dnsmasq-dns" containerID="cri-o://7d8c49753a9da23134fe46b471ef13f2409e29f8b10dba2b308f93ed7b837080" gracePeriod=10 Oct 02 11:13:42 crc kubenswrapper[4766]: I1002 11:13:42.061976 4766 generic.go:334] "Generic (PLEG): container finished" podID="f2569c54-b0a8-456b-b311-264d6605d4ed" containerID="7d8c49753a9da23134fe46b471ef13f2409e29f8b10dba2b308f93ed7b837080" exitCode=0 Oct 02 11:13:42 crc kubenswrapper[4766]: I1002 11:13:42.062022 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-t7x8d" event={"ID":"f2569c54-b0a8-456b-b311-264d6605d4ed","Type":"ContainerDied","Data":"7d8c49753a9da23134fe46b471ef13f2409e29f8b10dba2b308f93ed7b837080"} Oct 02 11:13:43 crc kubenswrapper[4766]: I1002 11:13:43.210998 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-t7x8d" 
podUID="f2569c54-b0a8-456b-b311-264d6605d4ed" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: connect: connection refused" Oct 02 11:13:48 crc kubenswrapper[4766]: I1002 11:13:48.211228 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-t7x8d" podUID="f2569c54-b0a8-456b-b311-264d6605d4ed" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: connect: connection refused" Oct 02 11:13:53 crc kubenswrapper[4766]: I1002 11:13:53.210838 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-t7x8d" podUID="f2569c54-b0a8-456b-b311-264d6605d4ed" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: connect: connection refused" Oct 02 11:13:53 crc kubenswrapper[4766]: I1002 11:13:53.211537 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:13:58 crc kubenswrapper[4766]: I1002 11:13:58.210873 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-t7x8d" podUID="f2569c54-b0a8-456b-b311-264d6605d4ed" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: connect: connection refused" Oct 02 11:14:03 crc kubenswrapper[4766]: I1002 11:14:03.211101 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-t7x8d" podUID="f2569c54-b0a8-456b-b311-264d6605d4ed" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: connect: connection refused" Oct 02 11:14:03 crc kubenswrapper[4766]: E1002 11:14:03.644798 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Oct 02 11:14:03 crc kubenswrapper[4766]: E1002 11:14:03.644999 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hl7wb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-wjzg5_openstack(3c61ea8c-4cfc-4f0d-97eb-d33c62117db2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:14:03 crc kubenswrapper[4766]: E1002 11:14:03.646208 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-wjzg5" podUID="3c61ea8c-4cfc-4f0d-97eb-d33c62117db2" Oct 02 11:14:04 crc kubenswrapper[4766]: E1002 11:14:04.242796 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-wjzg5" podUID="3c61ea8c-4cfc-4f0d-97eb-d33c62117db2" Oct 02 11:14:08 crc kubenswrapper[4766]: I1002 11:14:08.212234 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-t7x8d" podUID="f2569c54-b0a8-456b-b311-264d6605d4ed" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: connect: connection refused" Oct 02 11:14:10 crc kubenswrapper[4766]: I1002 11:14:10.291354 4766 generic.go:334] "Generic (PLEG): container finished" podID="8bddbcf6-aa73-4b9e-934f-1ca8d37188aa" containerID="68d08e12b25020fbe30c5d0dfd77d7b48a170b4b187be4c4958856f980dbd584" exitCode=0 Oct 02 11:14:10 crc kubenswrapper[4766]: I1002 11:14:10.291464 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bs74v" event={"ID":"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa","Type":"ContainerDied","Data":"68d08e12b25020fbe30c5d0dfd77d7b48a170b4b187be4c4958856f980dbd584"} Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.505528 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.591629 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-credential-keys\") pod \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.591695 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-config-data\") pod \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.591762 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-combined-ca-bundle\") pod \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.591796 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-fernet-keys\") pod \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.591922 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-scripts\") pod \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.591988 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvn4g\" (UniqueName: \"kubernetes.io/projected/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-kube-api-access-gvn4g\") pod \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\" (UID: \"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa\") " Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.602320 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8bddbcf6-aa73-4b9e-934f-1ca8d37188aa" (UID: "8bddbcf6-aa73-4b9e-934f-1ca8d37188aa"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.602358 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-kube-api-access-gvn4g" (OuterVolumeSpecName: "kube-api-access-gvn4g") pod "8bddbcf6-aa73-4b9e-934f-1ca8d37188aa" (UID: "8bddbcf6-aa73-4b9e-934f-1ca8d37188aa"). InnerVolumeSpecName "kube-api-access-gvn4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.602454 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-scripts" (OuterVolumeSpecName: "scripts") pod "8bddbcf6-aa73-4b9e-934f-1ca8d37188aa" (UID: "8bddbcf6-aa73-4b9e-934f-1ca8d37188aa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.603560 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8bddbcf6-aa73-4b9e-934f-1ca8d37188aa" (UID: "8bddbcf6-aa73-4b9e-934f-1ca8d37188aa"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.635609 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bddbcf6-aa73-4b9e-934f-1ca8d37188aa" (UID: "8bddbcf6-aa73-4b9e-934f-1ca8d37188aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.642811 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-config-data" (OuterVolumeSpecName: "config-data") pod "8bddbcf6-aa73-4b9e-934f-1ca8d37188aa" (UID: "8bddbcf6-aa73-4b9e-934f-1ca8d37188aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.695016 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.695090 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvn4g\" (UniqueName: \"kubernetes.io/projected/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-kube-api-access-gvn4g\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.695117 4766 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.695140 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.695160 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:12 crc kubenswrapper[4766]: I1002 11:14:12.695178 4766 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.210804 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-t7x8d" 
podUID="f2569c54-b0a8-456b-b311-264d6605d4ed" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.111:5353: connect: connection refused" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.324052 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bs74v" event={"ID":"8bddbcf6-aa73-4b9e-934f-1ca8d37188aa","Type":"ContainerDied","Data":"9cabb28ab7818c7cf0e3fed7afdd263c5dcbfba22be76f43af0e49d17cf31249"} Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.324093 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cabb28ab7818c7cf0e3fed7afdd263c5dcbfba22be76f43af0e49d17cf31249" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.324137 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bs74v" Oct 02 11:14:13 crc kubenswrapper[4766]: E1002 11:14:13.338187 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 02 11:14:13 crc kubenswrapper[4766]: E1002 11:14:13.338382 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-j6bhm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-ztpl6_openstack(58a480c5-a9e3-46da-b3df-4d73473d4b12): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:14:13 crc kubenswrapper[4766]: E1002 11:14:13.339540 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-ztpl6" podUID="58a480c5-a9e3-46da-b3df-4d73473d4b12" Oct 02 11:14:13 crc kubenswrapper[4766]: E1002 11:14:13.434332 4766 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bddbcf6_aa73_4b9e_934f_1ca8d37188aa.slice\": RecentStats: unable to find data in memory cache]" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.591814 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bs74v"] Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.599679 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bs74v"] Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.710160 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jzlb4"] Oct 02 11:14:13 crc kubenswrapper[4766]: E1002 11:14:13.710845 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27e8f91-2df0-4f54-b6df-2eaf2187b5a9" containerName="init" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.710874 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27e8f91-2df0-4f54-b6df-2eaf2187b5a9" containerName="init" Oct 02 11:14:13 crc kubenswrapper[4766]: E1002 11:14:13.710912 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bddbcf6-aa73-4b9e-934f-1ca8d37188aa" containerName="keystone-bootstrap" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.710920 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bddbcf6-aa73-4b9e-934f-1ca8d37188aa" containerName="keystone-bootstrap" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.711153 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a27e8f91-2df0-4f54-b6df-2eaf2187b5a9" containerName="init" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.711198 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bddbcf6-aa73-4b9e-934f-1ca8d37188aa" containerName="keystone-bootstrap" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.712137 4766 util.go:30] "No sandbox for pod can be found. 
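Annotation: the SyncLoop ADD with source="api" for openstack/keystone-bootstrap-jzlb4 above is the kubelet reacting to a pod created on the API server; the same lifecycle is observable from outside the node with a watch on the namespace. An illustrative client-go sketch, not anything this log implies was run; the KUBECONFIG-based config and the openstack namespace filter are assumptions:

```go
// watch_pods.go - sketch: observe, from the API side, the container state
// transitions the kubelet logs as PLEG ContainerStarted/ContainerDied events.
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
	if err != nil {
		log.Fatal(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	w, err := client.CoreV1().Pods("openstack").Watch(context.Background(), metav1.ListOptions{})
	if err != nil {
		log.Fatal(err)
	}
	defer w.Stop()
	for ev := range w.ResultChan() {
		pod, ok := ev.Object.(*corev1.Pod)
		if !ok {
			continue
		}
		for _, c := range pod.Status.ContainerStatuses {
			switch {
			case c.State.Running != nil:
				fmt.Printf("%s %s/%s running since %s\n", ev.Type, pod.Name, c.Name, c.State.Running.StartedAt)
			case c.State.Terminated != nil:
				fmt.Printf("%s %s/%s terminated exitCode=%d\n", ev.Type, pod.Name, c.Name, c.State.Terminated.ExitCode)
			case c.State.Waiting != nil:
				fmt.Printf("%s %s/%s waiting: %s\n", ev.Type, pod.Name, c.Name, c.State.Waiting.Reason)
			}
		}
	}
}
```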
Need to start a new one" pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.715762 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.715994 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kf9gx" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.716844 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.716913 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.726850 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jzlb4"] Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.813948 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgsgl\" (UniqueName: \"kubernetes.io/projected/d6918fd8-4c73-477e-bacd-ed09a36838e6-kube-api-access-bgsgl\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.814020 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-combined-ca-bundle\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.814055 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-config-data\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.814772 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-scripts\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.814958 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-credential-keys\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.815039 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-fernet-keys\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.894195 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bddbcf6-aa73-4b9e-934f-1ca8d37188aa" path="/var/lib/kubelet/pods/8bddbcf6-aa73-4b9e-934f-1ca8d37188aa/volumes" Oct 02 11:14:13 crc 
kubenswrapper[4766]: I1002 11:14:13.917657 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-scripts\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.917806 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-credential-keys\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.917859 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-fernet-keys\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.918068 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgsgl\" (UniqueName: \"kubernetes.io/projected/d6918fd8-4c73-477e-bacd-ed09a36838e6-kube-api-access-bgsgl\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.918197 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-combined-ca-bundle\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.918240 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-config-data\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.922945 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-scripts\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.923827 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-credential-keys\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.925015 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-config-data\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.934263 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-fernet-keys\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.937846 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgsgl\" (UniqueName: \"kubernetes.io/projected/d6918fd8-4c73-477e-bacd-ed09a36838e6-kube-api-access-bgsgl\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:13 crc kubenswrapper[4766]: I1002 11:14:13.943044 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-combined-ca-bundle\") pod \"keystone-bootstrap-jzlb4\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:14 crc kubenswrapper[4766]: I1002 11:14:14.040953 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:14 crc kubenswrapper[4766]: E1002 11:14:14.332053 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-ztpl6" podUID="58a480c5-a9e3-46da-b3df-4d73473d4b12" Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.113667 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.249893 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gflgq\" (UniqueName: \"kubernetes.io/projected/f2569c54-b0a8-456b-b311-264d6605d4ed-kube-api-access-gflgq\") pod \"f2569c54-b0a8-456b-b311-264d6605d4ed\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.249978 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-config\") pod \"f2569c54-b0a8-456b-b311-264d6605d4ed\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.250004 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-dns-svc\") pod \"f2569c54-b0a8-456b-b311-264d6605d4ed\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.250057 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-ovsdbserver-sb\") pod \"f2569c54-b0a8-456b-b311-264d6605d4ed\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.250258 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-ovsdbserver-nb\") pod \"f2569c54-b0a8-456b-b311-264d6605d4ed\" (UID: \"f2569c54-b0a8-456b-b311-264d6605d4ed\") " Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.256551 4766 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2569c54-b0a8-456b-b311-264d6605d4ed-kube-api-access-gflgq" (OuterVolumeSpecName: "kube-api-access-gflgq") pod "f2569c54-b0a8-456b-b311-264d6605d4ed" (UID: "f2569c54-b0a8-456b-b311-264d6605d4ed"). InnerVolumeSpecName "kube-api-access-gflgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.305335 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-config" (OuterVolumeSpecName: "config") pod "f2569c54-b0a8-456b-b311-264d6605d4ed" (UID: "f2569c54-b0a8-456b-b311-264d6605d4ed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.306009 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f2569c54-b0a8-456b-b311-264d6605d4ed" (UID: "f2569c54-b0a8-456b-b311-264d6605d4ed"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.318358 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f2569c54-b0a8-456b-b311-264d6605d4ed" (UID: "f2569c54-b0a8-456b-b311-264d6605d4ed"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.333940 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f2569c54-b0a8-456b-b311-264d6605d4ed" (UID: "f2569c54-b0a8-456b-b311-264d6605d4ed"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.341408 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-t7x8d" event={"ID":"f2569c54-b0a8-456b-b311-264d6605d4ed","Type":"ContainerDied","Data":"a8a2e591d3a8f764b2111381f79db9125f9da1be9e9446593f2c7c6eed6d8690"} Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.341470 4766 scope.go:117] "RemoveContainer" containerID="7d8c49753a9da23134fe46b471ef13f2409e29f8b10dba2b308f93ed7b837080" Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.341583 4766 util.go:48] "No ready sandbox for pod can be found. 
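Annotation: the repeated "Probe failed" entries for dnsmasq-dns-698758b865-t7x8d that preceded this teardown report "dial tcp 10.217.0.111:5353: connect: connection refused", which is exactly what a tcpSocket readiness probe yields once the dnsmasq container has exited: a successful connect is the entire check. A standalone approximation; the address is taken from the log, while the 1-second timeout is an assumption:

```go
// tcp_probe.go - approximation of a tcpSocket readiness probe; not kubelet code.
package main

import (
	"fmt"
	"net"
	"time"
)

func probeTCP(addr string, timeout time.Duration) error {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return err // e.g. "connect: connection refused" while nothing listens
	}
	return conn.Close() // connecting successfully is all the probe needs
}

func main() {
	if err := probeTCP("10.217.0.111:5353", time.Second); err != nil {
		fmt.Println("probe failed:", err)
		return
	}
	fmt.Println("probe succeeded")
}
```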
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-t7x8d" Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.352691 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.352723 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.352732 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.352744 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f2569c54-b0a8-456b-b311-264d6605d4ed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.352772 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gflgq\" (UniqueName: \"kubernetes.io/projected/f2569c54-b0a8-456b-b311-264d6605d4ed-kube-api-access-gflgq\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.375244 4766 scope.go:117] "RemoveContainer" containerID="ebe4e7a9575c91241d0e641d7a4188a1625c7bac14d55f996a3e6627b50b28b9" Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.412746 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-t7x8d"] Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.423364 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-t7x8d"] Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.432562 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jzlb4"] Oct 02 11:14:15 crc kubenswrapper[4766]: I1002 11:14:15.894130 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2569c54-b0a8-456b-b311-264d6605d4ed" path="/var/lib/kubelet/pods/f2569c54-b0a8-456b-b311-264d6605d4ed/volumes" Oct 02 11:14:16 crc kubenswrapper[4766]: E1002 11:14:16.313352 4766 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 02 11:14:16 crc kubenswrapper[4766]: E1002 11:14:16.313547 4766 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m79rr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-s5j64_openstack(12786f1e-db55-4668-8e43-afa080dc0fa2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 02 11:14:16 crc kubenswrapper[4766]: E1002 11:14:16.316983 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-s5j64" podUID="12786f1e-db55-4668-8e43-afa080dc0fa2" Oct 02 11:14:16 crc kubenswrapper[4766]: I1002 11:14:16.351382 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e987f27f-69d6-4f1e-a9a2-486638ab4505","Type":"ContainerStarted","Data":"2c054c8ea3985c615aeed264f3de67c7b80313a458aaeb66c1e5049eb169aa9a"} Oct 02 11:14:16 crc kubenswrapper[4766]: I1002 11:14:16.354010 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jzlb4" event={"ID":"d6918fd8-4c73-477e-bacd-ed09a36838e6","Type":"ContainerStarted","Data":"12eb8d333452acd192738223ca8f4fa9dc2c00bb7c7f6c98059bd4f4f3f58665"} Oct 02 11:14:16 crc kubenswrapper[4766]: I1002 11:14:16.354060 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jzlb4" 
event={"ID":"d6918fd8-4c73-477e-bacd-ed09a36838e6","Type":"ContainerStarted","Data":"6cc7f4d1dd5446c933608905ffba55124c68259e439f5f31c37298e08815e7ef"} Oct 02 11:14:16 crc kubenswrapper[4766]: E1002 11:14:16.355118 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-s5j64" podUID="12786f1e-db55-4668-8e43-afa080dc0fa2" Oct 02 11:14:16 crc kubenswrapper[4766]: I1002 11:14:16.418474 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jzlb4" podStartSLOduration=3.4184273 podStartE2EDuration="3.4184273s" podCreationTimestamp="2025-10-02 11:14:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:14:16.416682743 +0000 UTC m=+1371.359553687" watchObservedRunningTime="2025-10-02 11:14:16.4184273 +0000 UTC m=+1371.361298244" Oct 02 11:14:17 crc kubenswrapper[4766]: I1002 11:14:17.363620 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wjzg5" event={"ID":"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2","Type":"ContainerStarted","Data":"9e3131de3333db6b7ac4e841b0504597a46f9838934a930881def24fa734d3b6"} Oct 02 11:14:17 crc kubenswrapper[4766]: I1002 11:14:17.382719 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-wjzg5" podStartSLOduration=2.851108517 podStartE2EDuration="47.382704577s" podCreationTimestamp="2025-10-02 11:13:30 +0000 UTC" firstStartedPulling="2025-10-02 11:13:32.079459436 +0000 UTC m=+1327.022330380" lastFinishedPulling="2025-10-02 11:14:16.611055496 +0000 UTC m=+1371.553926440" observedRunningTime="2025-10-02 11:14:17.379903707 +0000 UTC m=+1372.322774651" watchObservedRunningTime="2025-10-02 11:14:17.382704577 +0000 UTC m=+1372.325575521" Oct 02 11:14:19 crc kubenswrapper[4766]: I1002 11:14:19.383957 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e987f27f-69d6-4f1e-a9a2-486638ab4505","Type":"ContainerStarted","Data":"3b99448d07e099bf2885bf4c7656f473d56a1ed9da62660d94ce7d444859b3c3"} Oct 02 11:14:25 crc kubenswrapper[4766]: I1002 11:14:25.436032 4766 generic.go:334] "Generic (PLEG): container finished" podID="d6918fd8-4c73-477e-bacd-ed09a36838e6" containerID="12eb8d333452acd192738223ca8f4fa9dc2c00bb7c7f6c98059bd4f4f3f58665" exitCode=0 Oct 02 11:14:25 crc kubenswrapper[4766]: I1002 11:14:25.436406 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jzlb4" event={"ID":"d6918fd8-4c73-477e-bacd-ed09a36838e6","Type":"ContainerDied","Data":"12eb8d333452acd192738223ca8f4fa9dc2c00bb7c7f6c98059bd4f4f3f58665"} Oct 02 11:14:26 crc kubenswrapper[4766]: I1002 11:14:26.859654 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:26 crc kubenswrapper[4766]: I1002 11:14:26.980268 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgsgl\" (UniqueName: \"kubernetes.io/projected/d6918fd8-4c73-477e-bacd-ed09a36838e6-kube-api-access-bgsgl\") pod \"d6918fd8-4c73-477e-bacd-ed09a36838e6\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " Oct 02 11:14:26 crc kubenswrapper[4766]: I1002 11:14:26.980420 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-credential-keys\") pod \"d6918fd8-4c73-477e-bacd-ed09a36838e6\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " Oct 02 11:14:26 crc kubenswrapper[4766]: I1002 11:14:26.980460 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-scripts\") pod \"d6918fd8-4c73-477e-bacd-ed09a36838e6\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " Oct 02 11:14:26 crc kubenswrapper[4766]: I1002 11:14:26.980541 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-config-data\") pod \"d6918fd8-4c73-477e-bacd-ed09a36838e6\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " Oct 02 11:14:26 crc kubenswrapper[4766]: I1002 11:14:26.980612 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-combined-ca-bundle\") pod \"d6918fd8-4c73-477e-bacd-ed09a36838e6\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " Oct 02 11:14:26 crc kubenswrapper[4766]: I1002 11:14:26.980668 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-fernet-keys\") pod \"d6918fd8-4c73-477e-bacd-ed09a36838e6\" (UID: \"d6918fd8-4c73-477e-bacd-ed09a36838e6\") " Oct 02 11:14:26 crc kubenswrapper[4766]: I1002 11:14:26.987243 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d6918fd8-4c73-477e-bacd-ed09a36838e6" (UID: "d6918fd8-4c73-477e-bacd-ed09a36838e6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:26 crc kubenswrapper[4766]: I1002 11:14:26.987319 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d6918fd8-4c73-477e-bacd-ed09a36838e6" (UID: "d6918fd8-4c73-477e-bacd-ed09a36838e6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:26 crc kubenswrapper[4766]: I1002 11:14:26.989641 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6918fd8-4c73-477e-bacd-ed09a36838e6-kube-api-access-bgsgl" (OuterVolumeSpecName: "kube-api-access-bgsgl") pod "d6918fd8-4c73-477e-bacd-ed09a36838e6" (UID: "d6918fd8-4c73-477e-bacd-ed09a36838e6"). InnerVolumeSpecName "kube-api-access-bgsgl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:26 crc kubenswrapper[4766]: I1002 11:14:26.997118 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-scripts" (OuterVolumeSpecName: "scripts") pod "d6918fd8-4c73-477e-bacd-ed09a36838e6" (UID: "d6918fd8-4c73-477e-bacd-ed09a36838e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.005208 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-config-data" (OuterVolumeSpecName: "config-data") pod "d6918fd8-4c73-477e-bacd-ed09a36838e6" (UID: "d6918fd8-4c73-477e-bacd-ed09a36838e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.006801 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6918fd8-4c73-477e-bacd-ed09a36838e6" (UID: "d6918fd8-4c73-477e-bacd-ed09a36838e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.082131 4766 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.082194 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.082205 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.082215 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.082226 4766 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d6918fd8-4c73-477e-bacd-ed09a36838e6-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.082234 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgsgl\" (UniqueName: \"kubernetes.io/projected/d6918fd8-4c73-477e-bacd-ed09a36838e6-kube-api-access-bgsgl\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.457797 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e987f27f-69d6-4f1e-a9a2-486638ab4505","Type":"ContainerStarted","Data":"d8f73b782fe717cf312b989671d64632b8ac18cda3ae18a553b5c9b2bf5080f9"} Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.460731 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jzlb4" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.460735 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jzlb4" event={"ID":"d6918fd8-4c73-477e-bacd-ed09a36838e6","Type":"ContainerDied","Data":"6cc7f4d1dd5446c933608905ffba55124c68259e439f5f31c37298e08815e7ef"} Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.460783 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cc7f4d1dd5446c933608905ffba55124c68259e439f5f31c37298e08815e7ef" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.462597 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ztpl6" event={"ID":"58a480c5-a9e3-46da-b3df-4d73473d4b12","Type":"ContainerStarted","Data":"e59b9e3e28f7f3521b5649e7c481c6db733ebb4c636e4f6e575b4857b80ca3bd"} Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.485456 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-ztpl6" podStartSLOduration=2.946138874 podStartE2EDuration="57.485437862s" podCreationTimestamp="2025-10-02 11:13:30 +0000 UTC" firstStartedPulling="2025-10-02 11:13:32.079946932 +0000 UTC m=+1327.022817876" lastFinishedPulling="2025-10-02 11:14:26.61924592 +0000 UTC m=+1381.562116864" observedRunningTime="2025-10-02 11:14:27.482671804 +0000 UTC m=+1382.425542758" watchObservedRunningTime="2025-10-02 11:14:27.485437862 +0000 UTC m=+1382.428308806" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.568945 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-576797c867-n7r4b"] Oct 02 11:14:27 crc kubenswrapper[4766]: E1002 11:14:27.569412 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2569c54-b0a8-456b-b311-264d6605d4ed" containerName="dnsmasq-dns" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.569434 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2569c54-b0a8-456b-b311-264d6605d4ed" containerName="dnsmasq-dns" Oct 02 11:14:27 crc kubenswrapper[4766]: E1002 11:14:27.569450 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6918fd8-4c73-477e-bacd-ed09a36838e6" containerName="keystone-bootstrap" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.569461 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6918fd8-4c73-477e-bacd-ed09a36838e6" containerName="keystone-bootstrap" Oct 02 11:14:27 crc kubenswrapper[4766]: E1002 11:14:27.569487 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2569c54-b0a8-456b-b311-264d6605d4ed" containerName="init" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.569512 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2569c54-b0a8-456b-b311-264d6605d4ed" containerName="init" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.569755 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6918fd8-4c73-477e-bacd-ed09a36838e6" containerName="keystone-bootstrap" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.569794 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2569c54-b0a8-456b-b311-264d6605d4ed" containerName="dnsmasq-dns" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.575282 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.578696 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.579316 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.579466 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-kf9gx" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.579716 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.579743 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.581479 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.587235 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-576797c867-n7r4b"] Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.693482 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-scripts\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.693865 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbqmz\" (UniqueName: \"kubernetes.io/projected/07e3dfd6-c718-4304-9770-edbbfaca9cf4-kube-api-access-dbqmz\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.694019 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-fernet-keys\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.694165 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-public-tls-certs\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.694362 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-internal-tls-certs\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.694488 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-config-data\") pod \"keystone-576797c867-n7r4b\" (UID: 
\"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.694629 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-credential-keys\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.694757 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-combined-ca-bundle\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.797302 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-scripts\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.797371 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbqmz\" (UniqueName: \"kubernetes.io/projected/07e3dfd6-c718-4304-9770-edbbfaca9cf4-kube-api-access-dbqmz\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.797401 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-fernet-keys\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.797445 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-public-tls-certs\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.797550 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-internal-tls-certs\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.797574 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-config-data\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.797606 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-credential-keys\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " 
pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.797627 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-combined-ca-bundle\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.803779 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-fernet-keys\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.804516 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-public-tls-certs\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.805128 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-internal-tls-certs\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.806271 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-credential-keys\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.807296 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-config-data\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.818647 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-combined-ca-bundle\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.820155 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-scripts\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.821632 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbqmz\" (UniqueName: \"kubernetes.io/projected/07e3dfd6-c718-4304-9770-edbbfaca9cf4-kube-api-access-dbqmz\") pod \"keystone-576797c867-n7r4b\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:27 crc kubenswrapper[4766]: I1002 11:14:27.891460 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:28 crc kubenswrapper[4766]: I1002 11:14:28.417794 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-576797c867-n7r4b"] Oct 02 11:14:28 crc kubenswrapper[4766]: I1002 11:14:28.472913 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-576797c867-n7r4b" event={"ID":"07e3dfd6-c718-4304-9770-edbbfaca9cf4","Type":"ContainerStarted","Data":"c9baccdf2aa205a53a557014f5fef109b7e2b59c5ead4f716fc4892c5edb23cd"} Oct 02 11:14:29 crc kubenswrapper[4766]: I1002 11:14:29.485756 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-576797c867-n7r4b" event={"ID":"07e3dfd6-c718-4304-9770-edbbfaca9cf4","Type":"ContainerStarted","Data":"e6c3d2e041b5f5a0a93635dedbb2e7ad90fbe97c81c3d583742c3fd4c3beb5a3"} Oct 02 11:14:29 crc kubenswrapper[4766]: I1002 11:14:29.487449 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:29 crc kubenswrapper[4766]: I1002 11:14:29.517152 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-576797c867-n7r4b" podStartSLOduration=2.517120304 podStartE2EDuration="2.517120304s" podCreationTimestamp="2025-10-02 11:14:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:14:29.508410545 +0000 UTC m=+1384.451281489" watchObservedRunningTime="2025-10-02 11:14:29.517120304 +0000 UTC m=+1384.459991248" Oct 02 11:14:32 crc kubenswrapper[4766]: I1002 11:14:32.521459 4766 generic.go:334] "Generic (PLEG): container finished" podID="3c61ea8c-4cfc-4f0d-97eb-d33c62117db2" containerID="9e3131de3333db6b7ac4e841b0504597a46f9838934a930881def24fa734d3b6" exitCode=0 Oct 02 11:14:32 crc kubenswrapper[4766]: I1002 11:14:32.521972 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wjzg5" event={"ID":"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2","Type":"ContainerDied","Data":"9e3131de3333db6b7ac4e841b0504597a46f9838934a930881def24fa734d3b6"} Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.222619 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wjzg5" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.404184 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-combined-ca-bundle\") pod \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.404244 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-config-data\") pod \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.404290 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl7wb\" (UniqueName: \"kubernetes.io/projected/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-kube-api-access-hl7wb\") pod \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.404324 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-scripts\") pod \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.404378 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-logs\") pod \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\" (UID: \"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2\") " Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.404665 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-logs" (OuterVolumeSpecName: "logs") pod "3c61ea8c-4cfc-4f0d-97eb-d33c62117db2" (UID: "3c61ea8c-4cfc-4f0d-97eb-d33c62117db2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.404929 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.411297 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-scripts" (OuterVolumeSpecName: "scripts") pod "3c61ea8c-4cfc-4f0d-97eb-d33c62117db2" (UID: "3c61ea8c-4cfc-4f0d-97eb-d33c62117db2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.411390 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-kube-api-access-hl7wb" (OuterVolumeSpecName: "kube-api-access-hl7wb") pod "3c61ea8c-4cfc-4f0d-97eb-d33c62117db2" (UID: "3c61ea8c-4cfc-4f0d-97eb-d33c62117db2"). InnerVolumeSpecName "kube-api-access-hl7wb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.434893 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-config-data" (OuterVolumeSpecName: "config-data") pod "3c61ea8c-4cfc-4f0d-97eb-d33c62117db2" (UID: "3c61ea8c-4cfc-4f0d-97eb-d33c62117db2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.438810 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c61ea8c-4cfc-4f0d-97eb-d33c62117db2" (UID: "3c61ea8c-4cfc-4f0d-97eb-d33c62117db2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.506455 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.506707 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.506724 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl7wb\" (UniqueName: \"kubernetes.io/projected/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-kube-api-access-hl7wb\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.506737 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.540633 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wjzg5" event={"ID":"3c61ea8c-4cfc-4f0d-97eb-d33c62117db2","Type":"ContainerDied","Data":"5a3d8ae6d77d6bc06c64380ec9f4d8859131be4a4a03317deb006306f4acd50d"} Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.540678 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wjzg5" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.540689 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a3d8ae6d77d6bc06c64380ec9f4d8859131be4a4a03317deb006306f4acd50d" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.650749 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-84bf49766d-bbf2p"] Oct 02 11:14:34 crc kubenswrapper[4766]: E1002 11:14:34.651309 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c61ea8c-4cfc-4f0d-97eb-d33c62117db2" containerName="placement-db-sync" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.651331 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c61ea8c-4cfc-4f0d-97eb-d33c62117db2" containerName="placement-db-sync" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.651548 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c61ea8c-4cfc-4f0d-97eb-d33c62117db2" containerName="placement-db-sync" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.652670 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.658458 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.658786 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.660378 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-r9rtd" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.660495 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.662943 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.665551 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-84bf49766d-bbf2p"] Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.811915 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-config-data\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.812644 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q84ss\" (UniqueName: \"kubernetes.io/projected/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-kube-api-access-q84ss\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.812788 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-public-tls-certs\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.812838 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-combined-ca-bundle\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.812862 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-scripts\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.812920 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-logs\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.812938 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-internal-tls-certs\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.915978 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-config-data\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.916026 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q84ss\" (UniqueName: \"kubernetes.io/projected/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-kube-api-access-q84ss\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.916069 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-public-tls-certs\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.916108 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-combined-ca-bundle\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.916146 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-scripts\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.916175 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-logs\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.917009 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-logs\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.917638 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-internal-tls-certs\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.920409 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-internal-tls-certs\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.920682 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-combined-ca-bundle\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.920837 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-public-tls-certs\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.921651 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-scripts\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.922083 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-config-data\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.940405 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q84ss\" (UniqueName: \"kubernetes.io/projected/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-kube-api-access-q84ss\") pod \"placement-84bf49766d-bbf2p\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:34 crc kubenswrapper[4766]: I1002 11:14:34.981458 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:38 crc kubenswrapper[4766]: I1002 11:14:38.645909 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-84bf49766d-bbf2p"] Oct 02 11:14:38 crc kubenswrapper[4766]: W1002 11:14:38.714099 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7eb84667_7ff3_441c_ab7c_ccc4fc9233ca.slice/crio-fc14bb35873b38c4cb42c7f4f961d971819f9c86d26a74108f67b78c72373197 WatchSource:0}: Error finding container fc14bb35873b38c4cb42c7f4f961d971819f9c86d26a74108f67b78c72373197: Status 404 returned error can't find the container with id fc14bb35873b38c4cb42c7f4f961d971819f9c86d26a74108f67b78c72373197 Oct 02 11:14:39 crc kubenswrapper[4766]: I1002 11:14:39.585697 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84bf49766d-bbf2p" event={"ID":"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca","Type":"ContainerStarted","Data":"0b8dae08b6fef80dba408e5df15861c9a4ec087115f964d50b2c13d7ce34c9a4"} Oct 02 11:14:39 crc kubenswrapper[4766]: I1002 11:14:39.586058 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84bf49766d-bbf2p" event={"ID":"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca","Type":"ContainerStarted","Data":"3d18a6249adb9873ab43d08de93aedc55c827c33e60c6d8d86226fc15f98872a"} Oct 02 11:14:39 crc kubenswrapper[4766]: I1002 11:14:39.586072 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84bf49766d-bbf2p" event={"ID":"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca","Type":"ContainerStarted","Data":"fc14bb35873b38c4cb42c7f4f961d971819f9c86d26a74108f67b78c72373197"} Oct 02 11:14:39 crc kubenswrapper[4766]: I1002 11:14:39.586125 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:39 crc kubenswrapper[4766]: I1002 11:14:39.586153 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:14:39 crc kubenswrapper[4766]: I1002 11:14:39.592103 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s5j64" event={"ID":"12786f1e-db55-4668-8e43-afa080dc0fa2","Type":"ContainerStarted","Data":"867efe65a7b16f3f4c20bd8fbe635248271d0736d8757d4cc1dbab820099a089"} Oct 02 11:14:39 crc kubenswrapper[4766]: I1002 11:14:39.595244 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e987f27f-69d6-4f1e-a9a2-486638ab4505","Type":"ContainerStarted","Data":"a38d42f134dd62b004b1c9af8a7dfab1f8b857b8e41a0c0e9598773f668d693b"} Oct 02 11:14:39 crc kubenswrapper[4766]: I1002 11:14:39.595397 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerName="ceilometer-central-agent" containerID="cri-o://2c054c8ea3985c615aeed264f3de67c7b80313a458aaeb66c1e5049eb169aa9a" gracePeriod=30 Oct 02 11:14:39 crc kubenswrapper[4766]: I1002 11:14:39.595599 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:14:39 crc kubenswrapper[4766]: I1002 11:14:39.595639 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerName="proxy-httpd" containerID="cri-o://a38d42f134dd62b004b1c9af8a7dfab1f8b857b8e41a0c0e9598773f668d693b" gracePeriod=30 Oct 02 11:14:39 crc 
kubenswrapper[4766]: I1002 11:14:39.595679 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerName="sg-core" containerID="cri-o://d8f73b782fe717cf312b989671d64632b8ac18cda3ae18a553b5c9b2bf5080f9" gracePeriod=30 Oct 02 11:14:39 crc kubenswrapper[4766]: I1002 11:14:39.595716 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerName="ceilometer-notification-agent" containerID="cri-o://3b99448d07e099bf2885bf4c7656f473d56a1ed9da62660d94ce7d444859b3c3" gracePeriod=30 Oct 02 11:14:39 crc kubenswrapper[4766]: I1002 11:14:39.616335 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-84bf49766d-bbf2p" podStartSLOduration=5.616313055 podStartE2EDuration="5.616313055s" podCreationTimestamp="2025-10-02 11:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:14:39.609723925 +0000 UTC m=+1394.552594869" watchObservedRunningTime="2025-10-02 11:14:39.616313055 +0000 UTC m=+1394.559183999" Oct 02 11:14:39 crc kubenswrapper[4766]: I1002 11:14:39.640461 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-s5j64" podStartSLOduration=2.862609527 podStartE2EDuration="1m9.640434457s" podCreationTimestamp="2025-10-02 11:13:30 +0000 UTC" firstStartedPulling="2025-10-02 11:13:31.985408651 +0000 UTC m=+1326.928279595" lastFinishedPulling="2025-10-02 11:14:38.763233591 +0000 UTC m=+1393.706104525" observedRunningTime="2025-10-02 11:14:39.632020368 +0000 UTC m=+1394.574891312" watchObservedRunningTime="2025-10-02 11:14:39.640434457 +0000 UTC m=+1394.583305441" Oct 02 11:14:39 crc kubenswrapper[4766]: I1002 11:14:39.653758 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.616438343 podStartE2EDuration="1m9.653739121s" podCreationTimestamp="2025-10-02 11:13:30 +0000 UTC" firstStartedPulling="2025-10-02 11:13:31.729182008 +0000 UTC m=+1326.672052972" lastFinishedPulling="2025-10-02 11:14:38.766482806 +0000 UTC m=+1393.709353750" observedRunningTime="2025-10-02 11:14:39.650108165 +0000 UTC m=+1394.592979119" watchObservedRunningTime="2025-10-02 11:14:39.653739121 +0000 UTC m=+1394.596610065" Oct 02 11:14:40 crc kubenswrapper[4766]: I1002 11:14:40.608012 4766 generic.go:334] "Generic (PLEG): container finished" podID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerID="a38d42f134dd62b004b1c9af8a7dfab1f8b857b8e41a0c0e9598773f668d693b" exitCode=0 Oct 02 11:14:40 crc kubenswrapper[4766]: I1002 11:14:40.608343 4766 generic.go:334] "Generic (PLEG): container finished" podID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerID="d8f73b782fe717cf312b989671d64632b8ac18cda3ae18a553b5c9b2bf5080f9" exitCode=2 Oct 02 11:14:40 crc kubenswrapper[4766]: I1002 11:14:40.608120 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e987f27f-69d6-4f1e-a9a2-486638ab4505","Type":"ContainerDied","Data":"a38d42f134dd62b004b1c9af8a7dfab1f8b857b8e41a0c0e9598773f668d693b"}
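The grace-period kills above pair with the PLEG "container finished" records: proxy-httpd (a38d42...) exits 0 and sg-core (d8f73b...) exits 2; the central agent's exit 0 follows below. A sketch mapping container IDs to exit codes, under the same assumptions as the earlier snippets:

```python
import re

# Hypothetical helper: extract (short container ID, exit code) pairs from
# the 'Generic (PLEG): container finished' entries.
FINISHED = re.compile(r'container finished.*containerID="([0-9a-f]+)".*exitCode=(\d+)')

def exit_codes(journal_text):
    for entry in journal_text.splitlines():
        if (m := FINISHED.search(entry)):
            # e.g. ("a38d42f134dd", 0) for proxy-httpd, ("d8f73b782fe7", 2) for sg-core
            yield m.group(1)[:12], int(m.group(2))
```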
event={"ID":"e987f27f-69d6-4f1e-a9a2-486638ab4505","Type":"ContainerDied","Data":"d8f73b782fe717cf312b989671d64632b8ac18cda3ae18a553b5c9b2bf5080f9"} Oct 02 11:14:40 crc kubenswrapper[4766]: I1002 11:14:40.608404 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e987f27f-69d6-4f1e-a9a2-486638ab4505","Type":"ContainerDied","Data":"2c054c8ea3985c615aeed264f3de67c7b80313a458aaeb66c1e5049eb169aa9a"} Oct 02 11:14:40 crc kubenswrapper[4766]: I1002 11:14:40.608354 4766 generic.go:334] "Generic (PLEG): container finished" podID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerID="2c054c8ea3985c615aeed264f3de67c7b80313a458aaeb66c1e5049eb169aa9a" exitCode=0 Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.462160 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.564462 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e987f27f-69d6-4f1e-a9a2-486638ab4505-log-httpd\") pod \"e987f27f-69d6-4f1e-a9a2-486638ab4505\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.564641 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e987f27f-69d6-4f1e-a9a2-486638ab4505-run-httpd\") pod \"e987f27f-69d6-4f1e-a9a2-486638ab4505\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.564755 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-sg-core-conf-yaml\") pod \"e987f27f-69d6-4f1e-a9a2-486638ab4505\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.564778 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-combined-ca-bundle\") pod \"e987f27f-69d6-4f1e-a9a2-486638ab4505\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.564802 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-config-data\") pod \"e987f27f-69d6-4f1e-a9a2-486638ab4505\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.564882 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zmcx\" (UniqueName: \"kubernetes.io/projected/e987f27f-69d6-4f1e-a9a2-486638ab4505-kube-api-access-4zmcx\") pod \"e987f27f-69d6-4f1e-a9a2-486638ab4505\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.565060 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-scripts\") pod \"e987f27f-69d6-4f1e-a9a2-486638ab4505\" (UID: \"e987f27f-69d6-4f1e-a9a2-486638ab4505\") " Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.565279 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e987f27f-69d6-4f1e-a9a2-486638ab4505-run-httpd" 
(OuterVolumeSpecName: "run-httpd") pod "e987f27f-69d6-4f1e-a9a2-486638ab4505" (UID: "e987f27f-69d6-4f1e-a9a2-486638ab4505"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.565398 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e987f27f-69d6-4f1e-a9a2-486638ab4505-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e987f27f-69d6-4f1e-a9a2-486638ab4505" (UID: "e987f27f-69d6-4f1e-a9a2-486638ab4505"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.565949 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e987f27f-69d6-4f1e-a9a2-486638ab4505-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.565972 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e987f27f-69d6-4f1e-a9a2-486638ab4505-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.578654 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-scripts" (OuterVolumeSpecName: "scripts") pod "e987f27f-69d6-4f1e-a9a2-486638ab4505" (UID: "e987f27f-69d6-4f1e-a9a2-486638ab4505"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.578844 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e987f27f-69d6-4f1e-a9a2-486638ab4505-kube-api-access-4zmcx" (OuterVolumeSpecName: "kube-api-access-4zmcx") pod "e987f27f-69d6-4f1e-a9a2-486638ab4505" (UID: "e987f27f-69d6-4f1e-a9a2-486638ab4505"). InnerVolumeSpecName "kube-api-access-4zmcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.599830 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e987f27f-69d6-4f1e-a9a2-486638ab4505" (UID: "e987f27f-69d6-4f1e-a9a2-486638ab4505"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.635806 4766 generic.go:334] "Generic (PLEG): container finished" podID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerID="3b99448d07e099bf2885bf4c7656f473d56a1ed9da62660d94ce7d444859b3c3" exitCode=0 Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.635878 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e987f27f-69d6-4f1e-a9a2-486638ab4505","Type":"ContainerDied","Data":"3b99448d07e099bf2885bf4c7656f473d56a1ed9da62660d94ce7d444859b3c3"} Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.635922 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e987f27f-69d6-4f1e-a9a2-486638ab4505","Type":"ContainerDied","Data":"bab85da1103e917e04501474b774a23c998e8a43dcdd1094b3e1b2d6059bba31"} Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.635947 4766 scope.go:117] "RemoveContainer" containerID="a38d42f134dd62b004b1c9af8a7dfab1f8b857b8e41a0c0e9598773f668d693b" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.636172 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.649661 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-config-data" (OuterVolumeSpecName: "config-data") pod "e987f27f-69d6-4f1e-a9a2-486638ab4505" (UID: "e987f27f-69d6-4f1e-a9a2-486638ab4505"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.654322 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e987f27f-69d6-4f1e-a9a2-486638ab4505" (UID: "e987f27f-69d6-4f1e-a9a2-486638ab4505"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.667568 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.667611 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.667621 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.667631 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zmcx\" (UniqueName: \"kubernetes.io/projected/e987f27f-69d6-4f1e-a9a2-486638ab4505-kube-api-access-4zmcx\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.667645 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e987f27f-69d6-4f1e-a9a2-486638ab4505-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.698214 4766 scope.go:117] "RemoveContainer" containerID="d8f73b782fe717cf312b989671d64632b8ac18cda3ae18a553b5c9b2bf5080f9" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.721724 4766 scope.go:117] "RemoveContainer" containerID="3b99448d07e099bf2885bf4c7656f473d56a1ed9da62660d94ce7d444859b3c3" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.742476 4766 scope.go:117] "RemoveContainer" containerID="2c054c8ea3985c615aeed264f3de67c7b80313a458aaeb66c1e5049eb169aa9a" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.765628 4766 scope.go:117] "RemoveContainer" containerID="a38d42f134dd62b004b1c9af8a7dfab1f8b857b8e41a0c0e9598773f668d693b" Oct 02 11:14:42 crc kubenswrapper[4766]: E1002 11:14:42.766265 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38d42f134dd62b004b1c9af8a7dfab1f8b857b8e41a0c0e9598773f668d693b\": container with ID starting with a38d42f134dd62b004b1c9af8a7dfab1f8b857b8e41a0c0e9598773f668d693b not found: ID does not exist" containerID="a38d42f134dd62b004b1c9af8a7dfab1f8b857b8e41a0c0e9598773f668d693b" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.766310 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38d42f134dd62b004b1c9af8a7dfab1f8b857b8e41a0c0e9598773f668d693b"} err="failed to get container status \"a38d42f134dd62b004b1c9af8a7dfab1f8b857b8e41a0c0e9598773f668d693b\": rpc error: code = NotFound desc = could not find container \"a38d42f134dd62b004b1c9af8a7dfab1f8b857b8e41a0c0e9598773f668d693b\": container with ID starting with a38d42f134dd62b004b1c9af8a7dfab1f8b857b8e41a0c0e9598773f668d693b not found: ID does not exist" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.766333 4766 scope.go:117] "RemoveContainer" containerID="d8f73b782fe717cf312b989671d64632b8ac18cda3ae18a553b5c9b2bf5080f9" Oct 02 11:14:42 crc kubenswrapper[4766]: E1002 11:14:42.766731 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d8f73b782fe717cf312b989671d64632b8ac18cda3ae18a553b5c9b2bf5080f9\": container with ID starting with d8f73b782fe717cf312b989671d64632b8ac18cda3ae18a553b5c9b2bf5080f9 not found: ID does not exist" containerID="d8f73b782fe717cf312b989671d64632b8ac18cda3ae18a553b5c9b2bf5080f9" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.766755 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8f73b782fe717cf312b989671d64632b8ac18cda3ae18a553b5c9b2bf5080f9"} err="failed to get container status \"d8f73b782fe717cf312b989671d64632b8ac18cda3ae18a553b5c9b2bf5080f9\": rpc error: code = NotFound desc = could not find container \"d8f73b782fe717cf312b989671d64632b8ac18cda3ae18a553b5c9b2bf5080f9\": container with ID starting with d8f73b782fe717cf312b989671d64632b8ac18cda3ae18a553b5c9b2bf5080f9 not found: ID does not exist" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.766773 4766 scope.go:117] "RemoveContainer" containerID="3b99448d07e099bf2885bf4c7656f473d56a1ed9da62660d94ce7d444859b3c3" Oct 02 11:14:42 crc kubenswrapper[4766]: E1002 11:14:42.767179 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b99448d07e099bf2885bf4c7656f473d56a1ed9da62660d94ce7d444859b3c3\": container with ID starting with 3b99448d07e099bf2885bf4c7656f473d56a1ed9da62660d94ce7d444859b3c3 not found: ID does not exist" containerID="3b99448d07e099bf2885bf4c7656f473d56a1ed9da62660d94ce7d444859b3c3" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.767224 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b99448d07e099bf2885bf4c7656f473d56a1ed9da62660d94ce7d444859b3c3"} err="failed to get container status \"3b99448d07e099bf2885bf4c7656f473d56a1ed9da62660d94ce7d444859b3c3\": rpc error: code = NotFound desc = could not find container \"3b99448d07e099bf2885bf4c7656f473d56a1ed9da62660d94ce7d444859b3c3\": container with ID starting with 3b99448d07e099bf2885bf4c7656f473d56a1ed9da62660d94ce7d444859b3c3 not found: ID does not exist" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.767251 4766 scope.go:117] "RemoveContainer" containerID="2c054c8ea3985c615aeed264f3de67c7b80313a458aaeb66c1e5049eb169aa9a" Oct 02 11:14:42 crc kubenswrapper[4766]: E1002 11:14:42.767707 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c054c8ea3985c615aeed264f3de67c7b80313a458aaeb66c1e5049eb169aa9a\": container with ID starting with 2c054c8ea3985c615aeed264f3de67c7b80313a458aaeb66c1e5049eb169aa9a not found: ID does not exist" containerID="2c054c8ea3985c615aeed264f3de67c7b80313a458aaeb66c1e5049eb169aa9a" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.767739 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c054c8ea3985c615aeed264f3de67c7b80313a458aaeb66c1e5049eb169aa9a"} err="failed to get container status \"2c054c8ea3985c615aeed264f3de67c7b80313a458aaeb66c1e5049eb169aa9a\": rpc error: code = NotFound desc = could not find container \"2c054c8ea3985c615aeed264f3de67c7b80313a458aaeb66c1e5049eb169aa9a\": container with ID starting with 2c054c8ea3985c615aeed264f3de67c7b80313a458aaeb66c1e5049eb169aa9a not found: ID does not exist" Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.966979 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:14:42 crc kubenswrapper[4766]: I1002 11:14:42.975771 
4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.012911 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:14:43 crc kubenswrapper[4766]: E1002 11:14:43.013360 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerName="ceilometer-central-agent" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.013379 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerName="ceilometer-central-agent" Oct 02 11:14:43 crc kubenswrapper[4766]: E1002 11:14:43.013408 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerName="proxy-httpd" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.013416 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerName="proxy-httpd" Oct 02 11:14:43 crc kubenswrapper[4766]: E1002 11:14:43.013440 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerName="ceilometer-notification-agent" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.013448 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerName="ceilometer-notification-agent" Oct 02 11:14:43 crc kubenswrapper[4766]: E1002 11:14:43.013462 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerName="sg-core" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.013470 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerName="sg-core" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.013680 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerName="ceilometer-central-agent" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.013697 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerName="proxy-httpd" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.013716 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerName="sg-core" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.013727 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" containerName="ceilometer-notification-agent" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.015592 4766 util.go:30] "No sandbox for pod can be found. 
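
ceilometer-0 is being recreated under a new UID, so when the replacement is ADDed the CPU and memory managers still hold checkpointed assignments for the old UID e987f27f-... and purge them ("RemoveStaleState: removing container" / "Deleted CPUSet assignment") before admitting the new pod. A toy model of that cleanup, types invented:

```go
package main

import "fmt"

// key identifies one checkpointed assignment, as the managers do.
type key struct{ podUID, container string }

// removeStaleState drops assignments whose pod UID is no longer active,
// mirroring cpu_manager.go:410 / memory_manager.go:354 in spirit.
func removeStaleState(assignments map[key]string, active map[string]bool) {
	for k := range assignments {
		if !active[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container %q podUID=%q\n",
				k.container, k.podUID)
			delete(assignments, k) // deleting during range is safe in Go
		}
	}
}

func main() {
	assignments := map[key]string{
		{"e987f27f", "sg-core"}:                  "cpus 0-1",
		{"e987f27f", "ceilometer-central-agent"}: "cpus 2-3",
	}
	// Only the replacement pod's UID is active now.
	removeStaleState(assignments, map[string]bool{"acb47101": true})
	fmt.Println("remaining assignments:", len(assignments)) // 0
}
```
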
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.024848 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.025103 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.037682 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.073045 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-config-data\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.073094 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m79j\" (UniqueName: \"kubernetes.io/projected/acb47101-638b-42ba-aca0-96a9f81c1443-kube-api-access-5m79j\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.073289 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.073455 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.073671 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acb47101-638b-42ba-aca0-96a9f81c1443-log-httpd\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.073772 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-scripts\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.073819 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acb47101-638b-42ba-aca0-96a9f81c1443-run-httpd\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.175211 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 
11:14:43.175311 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.175351 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acb47101-638b-42ba-aca0-96a9f81c1443-log-httpd\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.175401 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-scripts\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.175444 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acb47101-638b-42ba-aca0-96a9f81c1443-run-httpd\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.175484 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-config-data\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.175523 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m79j\" (UniqueName: \"kubernetes.io/projected/acb47101-638b-42ba-aca0-96a9f81c1443-kube-api-access-5m79j\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.177276 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acb47101-638b-42ba-aca0-96a9f81c1443-log-httpd\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.177325 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acb47101-638b-42ba-aca0-96a9f81c1443-run-httpd\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.180231 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.180618 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-scripts\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.180802 4766 operation_generator.go:637] "MountVolume.SetUp succeeded 
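
The mount side for the new ceilometer-0 (UID acb47101-...) mirrors the earlier teardown in reverse: "VerifyControllerAttachedVolume started" (reconciler_common.go:245) completes immediately for node-local volume types like secret, empty-dir and projected since there is no controller attach to wait for, then "MountVolume started" (reconciler_common.go:218) leads to "MountVolume.SetUp succeeded" (operation_generator.go:637) once the content exists under the pod directory. An illustrative sketch, names invented:

```go
package main

import (
	"fmt"
	"path/filepath"
)

// volumeToMount models one entry in the desired state of world.
type volumeToMount struct {
	plugin    string // on-disk form, e.g. "kubernetes.io~secret"
	podUID    string
	outerName string
}

// setUp stands in for the plugin SetUp that materializes the volume under
// /var/lib/kubelet/pods/<uid>/volumes/<plugin>/<name>.
func setUp(v volumeToMount) (string, error) {
	return filepath.Join("/var/lib/kubelet/pods", v.podUID, "volumes",
		v.plugin, v.outerName), nil
}

func main() {
	for _, v := range []volumeToMount{
		{"kubernetes.io~empty-dir", "acb47101-638b-42ba-aca0-96a9f81c1443", "run-httpd"},
		{"kubernetes.io~secret", "acb47101-638b-42ba-aca0-96a9f81c1443", "config-data"},
	} {
		dir, err := setUp(v)
		if err != nil {
			fmt.Println("kept in desired state, retried next sync:", err)
			continue
		}
		fmt.Printf("MountVolume.SetUp succeeded for volume %q at %s\n", v.outerName, dir)
	}
}
```
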
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.186141 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-config-data\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.197162 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m79j\" (UniqueName: \"kubernetes.io/projected/acb47101-638b-42ba-aca0-96a9f81c1443-kube-api-access-5m79j\") pod \"ceilometer-0\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") " pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.332710 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.645956 4766 generic.go:334] "Generic (PLEG): container finished" podID="58a480c5-a9e3-46da-b3df-4d73473d4b12" containerID="e59b9e3e28f7f3521b5649e7c481c6db733ebb4c636e4f6e575b4857b80ca3bd" exitCode=0 Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.646042 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ztpl6" event={"ID":"58a480c5-a9e3-46da-b3df-4d73473d4b12","Type":"ContainerDied","Data":"e59b9e3e28f7f3521b5649e7c481c6db733ebb4c636e4f6e575b4857b80ca3bd"} Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.770712 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:14:43 crc kubenswrapper[4766]: I1002 11:14:43.899006 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e987f27f-69d6-4f1e-a9a2-486638ab4505" path="/var/lib/kubelet/pods/e987f27f-69d6-4f1e-a9a2-486638ab4505/volumes" Oct 02 11:14:44 crc kubenswrapper[4766]: I1002 11:14:44.658460 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acb47101-638b-42ba-aca0-96a9f81c1443","Type":"ContainerStarted","Data":"fbf3d7476613b9ea2f684cc297139f531d6c7a3b770ed4e9c33f6c9d44f72172"} Oct 02 11:14:44 crc kubenswrapper[4766]: I1002 11:14:44.658850 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acb47101-638b-42ba-aca0-96a9f81c1443","Type":"ContainerStarted","Data":"bb86615d3d68699bdf7398bee61afe5b9d0ff618cdee7edfb16e75d98b5c909a"} Oct 02 11:14:44 crc kubenswrapper[4766]: I1002 11:14:44.965781 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ztpl6" Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.006685 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/58a480c5-a9e3-46da-b3df-4d73473d4b12-db-sync-config-data\") pod \"58a480c5-a9e3-46da-b3df-4d73473d4b12\" (UID: \"58a480c5-a9e3-46da-b3df-4d73473d4b12\") " Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.006824 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6bhm\" (UniqueName: \"kubernetes.io/projected/58a480c5-a9e3-46da-b3df-4d73473d4b12-kube-api-access-j6bhm\") pod \"58a480c5-a9e3-46da-b3df-4d73473d4b12\" (UID: \"58a480c5-a9e3-46da-b3df-4d73473d4b12\") " Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.006880 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a480c5-a9e3-46da-b3df-4d73473d4b12-combined-ca-bundle\") pod \"58a480c5-a9e3-46da-b3df-4d73473d4b12\" (UID: \"58a480c5-a9e3-46da-b3df-4d73473d4b12\") " Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.013056 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58a480c5-a9e3-46da-b3df-4d73473d4b12-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "58a480c5-a9e3-46da-b3df-4d73473d4b12" (UID: "58a480c5-a9e3-46da-b3df-4d73473d4b12"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.013082 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58a480c5-a9e3-46da-b3df-4d73473d4b12-kube-api-access-j6bhm" (OuterVolumeSpecName: "kube-api-access-j6bhm") pod "58a480c5-a9e3-46da-b3df-4d73473d4b12" (UID: "58a480c5-a9e3-46da-b3df-4d73473d4b12"). InnerVolumeSpecName "kube-api-access-j6bhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.030641 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58a480c5-a9e3-46da-b3df-4d73473d4b12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58a480c5-a9e3-46da-b3df-4d73473d4b12" (UID: "58a480c5-a9e3-46da-b3df-4d73473d4b12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.108967 4766 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/58a480c5-a9e3-46da-b3df-4d73473d4b12-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.109013 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6bhm\" (UniqueName: \"kubernetes.io/projected/58a480c5-a9e3-46da-b3df-4d73473d4b12-kube-api-access-j6bhm\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.109029 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58a480c5-a9e3-46da-b3df-4d73473d4b12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.669832 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acb47101-638b-42ba-aca0-96a9f81c1443","Type":"ContainerStarted","Data":"5576c520748d5953e012dcd26c50c4ce01e261b6dd06db08507d51fc9740dc1f"} Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.672433 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ztpl6" event={"ID":"58a480c5-a9e3-46da-b3df-4d73473d4b12","Type":"ContainerDied","Data":"1a4f39d2e61a7af2ebbe64e3390ece1d3e90ce95f2e4470e501babbe65a8a0bd"} Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.672459 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a4f39d2e61a7af2ebbe64e3390ece1d3e90ce95f2e4470e501babbe65a8a0bd" Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.672562 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ztpl6" Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.928717 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-664d98ccd8-hh5xk"] Oct 02 11:14:45 crc kubenswrapper[4766]: E1002 11:14:45.929119 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58a480c5-a9e3-46da-b3df-4d73473d4b12" containerName="barbican-db-sync" Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.929132 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a480c5-a9e3-46da-b3df-4d73473d4b12" containerName="barbican-db-sync" Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.929290 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="58a480c5-a9e3-46da-b3df-4d73473d4b12" containerName="barbican-db-sync" Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.930364 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.932911 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nw7pr" Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.933170 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.934441 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.938756 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-664d98ccd8-hh5xk"] Oct 02 11:14:45 crc kubenswrapper[4766]: I1002 11:14:45.999649 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-f5f68d797-k4qqv"] Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.001014 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.004346 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.019981 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-f5f68d797-k4qqv"] Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.080987 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7979dc8455-gmpjj"] Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.082714 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.094906 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7979dc8455-gmpjj"] Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.122538 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d2gp\" (UniqueName: \"kubernetes.io/projected/d9339929-4331-4cd9-89bc-8350ef2f55f5-kube-api-access-6d2gp\") pod \"barbican-keystone-listener-664d98ccd8-hh5xk\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.122599 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcgz9\" (UniqueName: \"kubernetes.io/projected/8d43eab0-4595-42fc-8489-38792e0c6e19-kube-api-access-wcgz9\") pod \"barbican-worker-f5f68d797-k4qqv\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.122625 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-config-data\") pod \"barbican-worker-f5f68d797-k4qqv\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.122834 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-config-data-custom\") 
pod \"barbican-worker-f5f68d797-k4qqv\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.122854 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-combined-ca-bundle\") pod \"barbican-worker-f5f68d797-k4qqv\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.122871 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d43eab0-4595-42fc-8489-38792e0c6e19-logs\") pod \"barbican-worker-f5f68d797-k4qqv\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.122892 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data\") pod \"barbican-keystone-listener-664d98ccd8-hh5xk\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.122910 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9339929-4331-4cd9-89bc-8350ef2f55f5-logs\") pod \"barbican-keystone-listener-664d98ccd8-hh5xk\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.122943 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data-custom\") pod \"barbican-keystone-listener-664d98ccd8-hh5xk\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.122971 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-combined-ca-bundle\") pod \"barbican-keystone-listener-664d98ccd8-hh5xk\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.142575 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6dd4f44f78-54h7c"] Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.145084 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.153811 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.174936 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6dd4f44f78-54h7c"] Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.225978 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data-custom\") pod \"barbican-keystone-listener-664d98ccd8-hh5xk\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226042 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-combined-ca-bundle\") pod \"barbican-keystone-listener-664d98ccd8-hh5xk\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226088 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-dns-swift-storage-0\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226112 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-dns-svc\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226139 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-config-data\") pod \"barbican-api-6dd4f44f78-54h7c\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226165 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-config\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226197 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a967c35-c726-4b3a-ad92-80a11601ecaa-logs\") pod \"barbican-api-6dd4f44f78-54h7c\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226225 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-combined-ca-bundle\") pod 
\"barbican-api-6dd4f44f78-54h7c\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226280 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffd9s\" (UniqueName: \"kubernetes.io/projected/e17e6f13-91a3-4632-9477-81fa3ca78af0-kube-api-access-ffd9s\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226306 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-ovsdbserver-sb\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226367 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgbhm\" (UniqueName: \"kubernetes.io/projected/9a967c35-c726-4b3a-ad92-80a11601ecaa-kube-api-access-lgbhm\") pod \"barbican-api-6dd4f44f78-54h7c\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226404 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d2gp\" (UniqueName: \"kubernetes.io/projected/d9339929-4331-4cd9-89bc-8350ef2f55f5-kube-api-access-6d2gp\") pod \"barbican-keystone-listener-664d98ccd8-hh5xk\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226445 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcgz9\" (UniqueName: \"kubernetes.io/projected/8d43eab0-4595-42fc-8489-38792e0c6e19-kube-api-access-wcgz9\") pod \"barbican-worker-f5f68d797-k4qqv\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226475 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-config-data\") pod \"barbican-worker-f5f68d797-k4qqv\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226519 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-config-data-custom\") pod \"barbican-worker-f5f68d797-k4qqv\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226547 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-combined-ca-bundle\") pod \"barbican-worker-f5f68d797-k4qqv\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226570 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d43eab0-4595-42fc-8489-38792e0c6e19-logs\") pod \"barbican-worker-f5f68d797-k4qqv\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226596 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-config-data-custom\") pod \"barbican-api-6dd4f44f78-54h7c\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226620 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data\") pod \"barbican-keystone-listener-664d98ccd8-hh5xk\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226643 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-ovsdbserver-nb\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.226669 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9339929-4331-4cd9-89bc-8350ef2f55f5-logs\") pod \"barbican-keystone-listener-664d98ccd8-hh5xk\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.227114 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9339929-4331-4cd9-89bc-8350ef2f55f5-logs\") pod \"barbican-keystone-listener-664d98ccd8-hh5xk\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.230793 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d43eab0-4595-42fc-8489-38792e0c6e19-logs\") pod \"barbican-worker-f5f68d797-k4qqv\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.240024 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data-custom\") pod \"barbican-keystone-listener-664d98ccd8-hh5xk\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.241162 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-config-data-custom\") pod \"barbican-worker-f5f68d797-k4qqv\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.247028 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data\") pod \"barbican-keystone-listener-664d98ccd8-hh5xk\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.247135 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-config-data\") pod \"barbican-worker-f5f68d797-k4qqv\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.248121 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d2gp\" (UniqueName: \"kubernetes.io/projected/d9339929-4331-4cd9-89bc-8350ef2f55f5-kube-api-access-6d2gp\") pod \"barbican-keystone-listener-664d98ccd8-hh5xk\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.251043 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcgz9\" (UniqueName: \"kubernetes.io/projected/8d43eab0-4595-42fc-8489-38792e0c6e19-kube-api-access-wcgz9\") pod \"barbican-worker-f5f68d797-k4qqv\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.253021 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-combined-ca-bundle\") pod \"barbican-worker-f5f68d797-k4qqv\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.258280 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-combined-ca-bundle\") pod \"barbican-keystone-listener-664d98ccd8-hh5xk\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.261607 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.330429 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgbhm\" (UniqueName: \"kubernetes.io/projected/9a967c35-c726-4b3a-ad92-80a11601ecaa-kube-api-access-lgbhm\") pod \"barbican-api-6dd4f44f78-54h7c\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.330524 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-config-data-custom\") pod \"barbican-api-6dd4f44f78-54h7c\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.330548 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-ovsdbserver-nb\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.330580 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-dns-swift-storage-0\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.330596 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-dns-svc\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.330613 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-config-data\") pod \"barbican-api-6dd4f44f78-54h7c\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.330630 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-config\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.330651 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a967c35-c726-4b3a-ad92-80a11601ecaa-logs\") pod \"barbican-api-6dd4f44f78-54h7c\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.330793 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-combined-ca-bundle\") pod \"barbican-api-6dd4f44f78-54h7c\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " 
pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.330913 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffd9s\" (UniqueName: \"kubernetes.io/projected/e17e6f13-91a3-4632-9477-81fa3ca78af0-kube-api-access-ffd9s\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.330948 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-ovsdbserver-sb\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.331243 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a967c35-c726-4b3a-ad92-80a11601ecaa-logs\") pod \"barbican-api-6dd4f44f78-54h7c\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.331871 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-dns-swift-storage-0\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.331910 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-dns-svc\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.331978 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-ovsdbserver-nb\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.332116 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-ovsdbserver-sb\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.332964 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-config\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.334727 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-config-data\") pod \"barbican-api-6dd4f44f78-54h7c\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.337222 4766 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.342256 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-combined-ca-bundle\") pod \"barbican-api-6dd4f44f78-54h7c\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.348074 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-config-data-custom\") pod \"barbican-api-6dd4f44f78-54h7c\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.353176 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgbhm\" (UniqueName: \"kubernetes.io/projected/9a967c35-c726-4b3a-ad92-80a11601ecaa-kube-api-access-lgbhm\") pod \"barbican-api-6dd4f44f78-54h7c\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.354286 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffd9s\" (UniqueName: \"kubernetes.io/projected/e17e6f13-91a3-4632-9477-81fa3ca78af0-kube-api-access-ffd9s\") pod \"dnsmasq-dns-7979dc8455-gmpjj\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.404842 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.470084 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.690728 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acb47101-638b-42ba-aca0-96a9f81c1443","Type":"ContainerStarted","Data":"533c21f74a053a585cf21d97ee7b64bc41253b81187145237393a28bd0d8db63"} Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.809951 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-664d98ccd8-hh5xk"] Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.868028 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7979dc8455-gmpjj"] Oct 02 11:14:46 crc kubenswrapper[4766]: I1002 11:14:46.945751 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-f5f68d797-k4qqv"] Oct 02 11:14:47 crc kubenswrapper[4766]: I1002 11:14:47.235128 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6dd4f44f78-54h7c"] Oct 02 11:14:47 crc kubenswrapper[4766]: W1002 11:14:47.298377 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a967c35_c726_4b3a_ad92_80a11601ecaa.slice/crio-31b27a5fd6fd71563624e1880621c31ea312ab1cbc0a547d121cbd0e8718686b WatchSource:0}: Error finding container 31b27a5fd6fd71563624e1880621c31ea312ab1cbc0a547d121cbd0e8718686b: Status 404 returned error can't find the container with id 31b27a5fd6fd71563624e1880621c31ea312ab1cbc0a547d121cbd0e8718686b Oct 02 11:14:47 crc kubenswrapper[4766]: I1002 11:14:47.700474 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" event={"ID":"d9339929-4331-4cd9-89bc-8350ef2f55f5","Type":"ContainerStarted","Data":"428cfb135d81844a70fbfa047eb311a462a87b6b4af97a2fdf7aac2011337530"} Oct 02 11:14:47 crc kubenswrapper[4766]: I1002 11:14:47.702541 4766 generic.go:334] "Generic (PLEG): container finished" podID="e17e6f13-91a3-4632-9477-81fa3ca78af0" containerID="d496429221b83e5bdb0bd9206db1b115a9ef1902cef3fbeb0245b43237182a7f" exitCode=0 Oct 02 11:14:47 crc kubenswrapper[4766]: I1002 11:14:47.702662 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" event={"ID":"e17e6f13-91a3-4632-9477-81fa3ca78af0","Type":"ContainerDied","Data":"d496429221b83e5bdb0bd9206db1b115a9ef1902cef3fbeb0245b43237182a7f"} Oct 02 11:14:47 crc kubenswrapper[4766]: I1002 11:14:47.702714 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" event={"ID":"e17e6f13-91a3-4632-9477-81fa3ca78af0","Type":"ContainerStarted","Data":"293ffaa790a41905a17e62b26a1b02bd2f2010f13d8dbe62f5d7cb6837e7995e"} Oct 02 11:14:47 crc kubenswrapper[4766]: I1002 11:14:47.703903 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f5f68d797-k4qqv" event={"ID":"8d43eab0-4595-42fc-8489-38792e0c6e19","Type":"ContainerStarted","Data":"fbb7736ca2d64df541ebcaaaf6a45b4ba4512d16015fd11481ec9d053bfea1cc"} Oct 02 11:14:47 crc kubenswrapper[4766]: I1002 11:14:47.708151 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd4f44f78-54h7c" event={"ID":"9a967c35-c726-4b3a-ad92-80a11601ecaa","Type":"ContainerStarted","Data":"f571c2a1d52270793fac8f73616909327551438103fbb8ba14f3598e3035521b"} Oct 02 11:14:47 crc kubenswrapper[4766]: I1002 11:14:47.708213 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-api-6dd4f44f78-54h7c" event={"ID":"9a967c35-c726-4b3a-ad92-80a11601ecaa","Type":"ContainerStarted","Data":"31b27a5fd6fd71563624e1880621c31ea312ab1cbc0a547d121cbd0e8718686b"} Oct 02 11:14:48 crc kubenswrapper[4766]: I1002 11:14:48.720837 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" event={"ID":"e17e6f13-91a3-4632-9477-81fa3ca78af0","Type":"ContainerStarted","Data":"d54c10723b7b6845713cb0b9bb5a326178cc22672b05226363388e5c0c74e0f4"} Oct 02 11:14:48 crc kubenswrapper[4766]: I1002 11:14:48.721452 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:48 crc kubenswrapper[4766]: I1002 11:14:48.724341 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd4f44f78-54h7c" event={"ID":"9a967c35-c726-4b3a-ad92-80a11601ecaa","Type":"ContainerStarted","Data":"af6392219af0c458e6a3ef494ba25c490c3ca60d53409167568566209b195c0f"} Oct 02 11:14:48 crc kubenswrapper[4766]: I1002 11:14:48.724460 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:48 crc kubenswrapper[4766]: I1002 11:14:48.724524 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:48 crc kubenswrapper[4766]: I1002 11:14:48.740974 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" podStartSLOduration=2.740932871 podStartE2EDuration="2.740932871s" podCreationTimestamp="2025-10-02 11:14:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:14:48.739069062 +0000 UTC m=+1403.681940006" watchObservedRunningTime="2025-10-02 11:14:48.740932871 +0000 UTC m=+1403.683803815" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.180599 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6dd4f44f78-54h7c" podStartSLOduration=3.180582642 podStartE2EDuration="3.180582642s" podCreationTimestamp="2025-10-02 11:14:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:14:48.772151829 +0000 UTC m=+1403.715022763" watchObservedRunningTime="2025-10-02 11:14:49.180582642 +0000 UTC m=+1404.123453586" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.186728 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-75997cdf8b-nnlzj"] Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.189019 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.190893 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.191423 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.192122 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94c8c5ed-b069-4112-ae71-d9071bc15ff2-logs\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.192174 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-internal-tls-certs\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.192200 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxsf9\" (UniqueName: \"kubernetes.io/projected/94c8c5ed-b069-4112-ae71-d9071bc15ff2-kube-api-access-sxsf9\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.192278 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-public-tls-certs\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.192384 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-combined-ca-bundle\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.192413 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-config-data\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.192448 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-config-data-custom\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.214290 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75997cdf8b-nnlzj"] Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.295123 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-public-tls-certs\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.295784 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-combined-ca-bundle\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.295811 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-config-data\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.295848 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-config-data-custom\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.295874 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94c8c5ed-b069-4112-ae71-d9071bc15ff2-logs\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.295903 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-internal-tls-certs\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.295927 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxsf9\" (UniqueName: \"kubernetes.io/projected/94c8c5ed-b069-4112-ae71-d9071bc15ff2-kube-api-access-sxsf9\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.296845 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94c8c5ed-b069-4112-ae71-d9071bc15ff2-logs\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.302393 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-internal-tls-certs\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.304127 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-combined-ca-bundle\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.304656 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-public-tls-certs\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.308913 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-config-data\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.313839 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-config-data-custom\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.321952 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxsf9\" (UniqueName: \"kubernetes.io/projected/94c8c5ed-b069-4112-ae71-d9071bc15ff2-kube-api-access-sxsf9\") pod \"barbican-api-75997cdf8b-nnlzj\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.450080 4766 util.go:30] "No sandbox for pod can be found. 
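
Note what distinguishes this second barbican-api ReplicaSet (75997cdf8b) from the one started at 11:14:46: it adds internal-tls-certs and public-tls-certs secret volumes, and the older 6dd4f44f78 pod is deleted at 11:15:01 below, which reads like a rollout enabling TLS. A small client-go sketch (assuming in-cluster config and RBAC to read pods in openstack) that cross-checks a pod's declared volumes against mount events like the ones above:

```go
// Cross-check: print each declared volume and its plugin kind for the pod,
// mirroring the VerifyControllerAttachedVolume -> MountVolume -> SetUp flow.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	pod, err := cs.CoreV1().Pods("openstack").Get(context.TODO(),
		"barbican-api-75997cdf8b-nnlzj", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, v := range pod.Spec.Volumes {
		switch {
		case v.Secret != nil:
			fmt.Printf("%s\tkubernetes.io/secret\n", v.Name)
		case v.ConfigMap != nil:
			fmt.Printf("%s\tkubernetes.io/configmap\n", v.Name)
		case v.EmptyDir != nil:
			fmt.Printf("%s\tkubernetes.io/empty-dir\n", v.Name)
		case v.Projected != nil:
			fmt.Printf("%s\tkubernetes.io/projected\n", v.Name)
		default:
			fmt.Printf("%s\t(other)\n", v.Name)
		}
	}
}
```
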
Need to start a new one" pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.807981 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acb47101-638b-42ba-aca0-96a9f81c1443","Type":"ContainerStarted","Data":"bd2f7b3a96aa8ee91d830307f72f8b4bd4f41acf88b4ea16eda48c6b3f2872a5"} Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.812330 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.860248 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" event={"ID":"d9339929-4331-4cd9-89bc-8350ef2f55f5","Type":"ContainerStarted","Data":"dc1353dac7e3a318b9bf88253e7621b0e0c300fbb3ed030d2c367fb3cffe1ca0"} Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.883157 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.824878676 podStartE2EDuration="7.883133205s" podCreationTimestamp="2025-10-02 11:14:42 +0000 UTC" firstStartedPulling="2025-10-02 11:14:43.77520602 +0000 UTC m=+1398.718076974" lastFinishedPulling="2025-10-02 11:14:47.833460559 +0000 UTC m=+1402.776331503" observedRunningTime="2025-10-02 11:14:49.873831127 +0000 UTC m=+1404.816702081" watchObservedRunningTime="2025-10-02 11:14:49.883133205 +0000 UTC m=+1404.826004149" Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.917259 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-75997cdf8b-nnlzj"] Oct 02 11:14:49 crc kubenswrapper[4766]: I1002 11:14:49.917305 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f5f68d797-k4qqv" event={"ID":"8d43eab0-4595-42fc-8489-38792e0c6e19","Type":"ContainerStarted","Data":"f1a0b9913341b1fffedbb9296ca3e24f9abbcd44a25a0528cebe5da010d355e1"} Oct 02 11:14:50 crc kubenswrapper[4766]: I1002 11:14:50.896927 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f5f68d797-k4qqv" event={"ID":"8d43eab0-4595-42fc-8489-38792e0c6e19","Type":"ContainerStarted","Data":"433ae393df6f772a4b0964a7a633bfcdca8d7d78296edfa4ca875b807cacbd06"} Oct 02 11:14:50 crc kubenswrapper[4766]: I1002 11:14:50.900174 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" event={"ID":"d9339929-4331-4cd9-89bc-8350ef2f55f5","Type":"ContainerStarted","Data":"973c651619479183947224e4242097f161cf673b8da2935544f44e1700d072b9"} Oct 02 11:14:50 crc kubenswrapper[4766]: I1002 11:14:50.903014 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75997cdf8b-nnlzj" event={"ID":"94c8c5ed-b069-4112-ae71-d9071bc15ff2","Type":"ContainerStarted","Data":"6cd538b1cdf3993f6cd959aebdda72d9055a7730e3cb15b1a555f7da8b9b1353"} Oct 02 11:14:50 crc kubenswrapper[4766]: I1002 11:14:50.903066 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75997cdf8b-nnlzj" event={"ID":"94c8c5ed-b069-4112-ae71-d9071bc15ff2","Type":"ContainerStarted","Data":"1739e7ba00b0bf9bb771dca9d289d586ed12ce58d682ed8f6ac6cf039138a291"} Oct 02 11:14:50 crc kubenswrapper[4766]: I1002 11:14:50.903077 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75997cdf8b-nnlzj" event={"ID":"94c8c5ed-b069-4112-ae71-d9071bc15ff2","Type":"ContainerStarted","Data":"bce8cff20a12b2c91f70e55f9d1b924210a91e2f479f037c37aad32da97c53ba"} Oct 
02 11:14:50 crc kubenswrapper[4766]: I1002 11:14:50.922730 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-f5f68d797-k4qqv" podStartSLOduration=3.8163931829999997 podStartE2EDuration="5.922712139s" podCreationTimestamp="2025-10-02 11:14:45 +0000 UTC" firstStartedPulling="2025-10-02 11:14:46.958537337 +0000 UTC m=+1401.901408281" lastFinishedPulling="2025-10-02 11:14:49.064856293 +0000 UTC m=+1404.007727237" observedRunningTime="2025-10-02 11:14:50.918006218 +0000 UTC m=+1405.860877162" watchObservedRunningTime="2025-10-02 11:14:50.922712139 +0000 UTC m=+1405.865583083" Oct 02 11:14:50 crc kubenswrapper[4766]: I1002 11:14:50.943691 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-75997cdf8b-nnlzj" podStartSLOduration=1.943669818 podStartE2EDuration="1.943669818s" podCreationTimestamp="2025-10-02 11:14:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:14:50.934549787 +0000 UTC m=+1405.877420731" watchObservedRunningTime="2025-10-02 11:14:50.943669818 +0000 UTC m=+1405.886540762" Oct 02 11:14:50 crc kubenswrapper[4766]: I1002 11:14:50.959177 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" podStartSLOduration=3.71337094 podStartE2EDuration="5.959151534s" podCreationTimestamp="2025-10-02 11:14:45 +0000 UTC" firstStartedPulling="2025-10-02 11:14:46.820429573 +0000 UTC m=+1401.763300517" lastFinishedPulling="2025-10-02 11:14:49.066210167 +0000 UTC m=+1404.009081111" observedRunningTime="2025-10-02 11:14:50.951937473 +0000 UTC m=+1405.894808427" watchObservedRunningTime="2025-10-02 11:14:50.959151534 +0000 UTC m=+1405.902022478" Oct 02 11:14:51 crc kubenswrapper[4766]: I1002 11:14:51.910175 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:51 crc kubenswrapper[4766]: I1002 11:14:51.910485 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:14:55 crc kubenswrapper[4766]: I1002 11:14:55.726529 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-6dd4f44f78-54h7c" podUID="9a967c35-c726-4b3a-ad92-80a11601ecaa" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 02 11:14:56 crc kubenswrapper[4766]: I1002 11:14:56.407118 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:14:56 crc kubenswrapper[4766]: I1002 11:14:56.499528 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mnx84"] Oct 02 11:14:56 crc kubenswrapper[4766]: I1002 11:14:56.500117 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" podUID="577aa82c-8c30-4044-9d1a-ff88cc3f390a" containerName="dnsmasq-dns" containerID="cri-o://b264c9db174673bc4ffbf1f294738343f7e8f59f993f08f7e6b0bb265cf2db85" gracePeriod=10 Oct 02 11:14:56 crc kubenswrapper[4766]: I1002 11:14:56.962105 4766 generic.go:334] "Generic (PLEG): container finished" podID="577aa82c-8c30-4044-9d1a-ff88cc3f390a" containerID="b264c9db174673bc4ffbf1f294738343f7e8f59f993f08f7e6b0bb265cf2db85" exitCode=0 Oct 02 11:14:56 crc kubenswrapper[4766]: I1002 11:14:56.962173 4766 kubelet.go:2453] 
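
The pod_startup_latency_tracker lines above decompose cleanly: podStartSLOduration is the end-to-end startup time minus the image-pull window, with the pull window taken from the monotonic m=+... clock readings; pods showing the zero timestamp "0001-01-01 00:00:00" pulled nothing, so SLO equals E2E. Reproducing the arithmetic with the numbers copied from the ceilometer-0 and barbican-worker entries (only the subtraction is mine):

```go
// podStartSLOduration = podStartE2EDuration - (lastFinishedPulling - firstStartedPulling)
package main

import "fmt"

func main() {
	// ceilometer-0
	e2e := 7.883133205                      // podStartE2EDuration, seconds
	pull := 1402.776331503 - 1398.718076974 // pull window from m=+... readings
	fmt.Printf("ceilometer-0 SLO: %.9f\n", e2e-pull) // 3.824878676, as logged

	// barbican-worker-f5f68d797-k4qqv; the log's 3.8163931829999997 is this
	// same value with float64 rounding noise.
	e2e = 5.922712139
	pull = 1404.007727237 - 1401.901408281
	fmt.Printf("barbican-worker SLO: %.9f\n", e2e-pull) // 3.816393183
}
```
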
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" event={"ID":"577aa82c-8c30-4044-9d1a-ff88cc3f390a","Type":"ContainerDied","Data":"b264c9db174673bc4ffbf1f294738343f7e8f59f993f08f7e6b0bb265cf2db85"} Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.501563 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.618078 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-config\") pod \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.618217 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-ovsdbserver-sb\") pod \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.619270 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-dns-swift-storage-0\") pod \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.619342 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-dns-svc\") pod \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.619454 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc8xg\" (UniqueName: \"kubernetes.io/projected/577aa82c-8c30-4044-9d1a-ff88cc3f390a-kube-api-access-tc8xg\") pod \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.619485 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-ovsdbserver-nb\") pod \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\" (UID: \"577aa82c-8c30-4044-9d1a-ff88cc3f390a\") " Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.631328 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/577aa82c-8c30-4044-9d1a-ff88cc3f390a-kube-api-access-tc8xg" (OuterVolumeSpecName: "kube-api-access-tc8xg") pod "577aa82c-8c30-4044-9d1a-ff88cc3f390a" (UID: "577aa82c-8c30-4044-9d1a-ff88cc3f390a"). InnerVolumeSpecName "kube-api-access-tc8xg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.673842 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "577aa82c-8c30-4044-9d1a-ff88cc3f390a" (UID: "577aa82c-8c30-4044-9d1a-ff88cc3f390a"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.685034 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "577aa82c-8c30-4044-9d1a-ff88cc3f390a" (UID: "577aa82c-8c30-4044-9d1a-ff88cc3f390a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.685857 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "577aa82c-8c30-4044-9d1a-ff88cc3f390a" (UID: "577aa82c-8c30-4044-9d1a-ff88cc3f390a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.686454 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "577aa82c-8c30-4044-9d1a-ff88cc3f390a" (UID: "577aa82c-8c30-4044-9d1a-ff88cc3f390a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.694599 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-config" (OuterVolumeSpecName: "config") pod "577aa82c-8c30-4044-9d1a-ff88cc3f390a" (UID: "577aa82c-8c30-4044-9d1a-ff88cc3f390a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.721955 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc8xg\" (UniqueName: \"kubernetes.io/projected/577aa82c-8c30-4044-9d1a-ff88cc3f390a-kube-api-access-tc8xg\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.722306 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.722323 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.722338 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.722347 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.722357 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/577aa82c-8c30-4044-9d1a-ff88cc3f390a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.974450 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" event={"ID":"577aa82c-8c30-4044-9d1a-ff88cc3f390a","Type":"ContainerDied","Data":"761a3baefc1408afe92b53cb7e2c36d012bb9fd45b4458579b58a7df3ce1a0f9"} Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.974565 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-mnx84" Oct 02 11:14:57 crc kubenswrapper[4766]: I1002 11:14:57.974859 4766 scope.go:117] "RemoveContainer" containerID="b264c9db174673bc4ffbf1f294738343f7e8f59f993f08f7e6b0bb265cf2db85" Oct 02 11:14:58 crc kubenswrapper[4766]: I1002 11:14:58.004793 4766 scope.go:117] "RemoveContainer" containerID="10d7d4fd71d139575da833850c09e54e2286d6072e71abd8fda27847693d7b1e" Oct 02 11:14:58 crc kubenswrapper[4766]: I1002 11:14:58.007146 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mnx84"] Oct 02 11:14:58 crc kubenswrapper[4766]: I1002 11:14:58.016783 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-mnx84"] Oct 02 11:14:58 crc kubenswrapper[4766]: I1002 11:14:58.182550 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:58 crc kubenswrapper[4766]: I1002 11:14:58.357010 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:14:59 crc kubenswrapper[4766]: I1002 11:14:59.677045 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:14:59 crc kubenswrapper[4766]: I1002 11:14:59.894696 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="577aa82c-8c30-4044-9d1a-ff88cc3f390a" path="/var/lib/kubelet/pods/577aa82c-8c30-4044-9d1a-ff88cc3f390a/volumes" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.132535 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr"] Oct 02 11:15:00 crc kubenswrapper[4766]: E1002 11:15:00.132996 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577aa82c-8c30-4044-9d1a-ff88cc3f390a" containerName="init" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.133015 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="577aa82c-8c30-4044-9d1a-ff88cc3f390a" containerName="init" Oct 02 11:15:00 crc kubenswrapper[4766]: E1002 11:15:00.133036 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="577aa82c-8c30-4044-9d1a-ff88cc3f390a" containerName="dnsmasq-dns" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.133043 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="577aa82c-8c30-4044-9d1a-ff88cc3f390a" containerName="dnsmasq-dns" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.133233 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="577aa82c-8c30-4044-9d1a-ff88cc3f390a" containerName="dnsmasq-dns" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.133882 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.135482 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.135940 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.140295 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr"] Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.169832 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27b54e2f-1607-453e-8a7a-cd9d111e7d24-config-volume\") pod \"collect-profiles-29323395-b4hvr\" (UID: \"27b54e2f-1607-453e-8a7a-cd9d111e7d24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.169911 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqtp9\" (UniqueName: \"kubernetes.io/projected/27b54e2f-1607-453e-8a7a-cd9d111e7d24-kube-api-access-lqtp9\") pod \"collect-profiles-29323395-b4hvr\" (UID: \"27b54e2f-1607-453e-8a7a-cd9d111e7d24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.170035 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27b54e2f-1607-453e-8a7a-cd9d111e7d24-secret-volume\") pod \"collect-profiles-29323395-b4hvr\" (UID: \"27b54e2f-1607-453e-8a7a-cd9d111e7d24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.272005 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27b54e2f-1607-453e-8a7a-cd9d111e7d24-secret-volume\") pod \"collect-profiles-29323395-b4hvr\" (UID: \"27b54e2f-1607-453e-8a7a-cd9d111e7d24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.272175 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27b54e2f-1607-453e-8a7a-cd9d111e7d24-config-volume\") pod \"collect-profiles-29323395-b4hvr\" (UID: \"27b54e2f-1607-453e-8a7a-cd9d111e7d24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.272217 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqtp9\" (UniqueName: \"kubernetes.io/projected/27b54e2f-1607-453e-8a7a-cd9d111e7d24-kube-api-access-lqtp9\") pod \"collect-profiles-29323395-b4hvr\" (UID: \"27b54e2f-1607-453e-8a7a-cd9d111e7d24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.273521 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27b54e2f-1607-453e-8a7a-cd9d111e7d24-config-volume\") pod 
\"collect-profiles-29323395-b4hvr\" (UID: \"27b54e2f-1607-453e-8a7a-cd9d111e7d24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.279965 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27b54e2f-1607-453e-8a7a-cd9d111e7d24-secret-volume\") pod \"collect-profiles-29323395-b4hvr\" (UID: \"27b54e2f-1607-453e-8a7a-cd9d111e7d24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.290667 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqtp9\" (UniqueName: \"kubernetes.io/projected/27b54e2f-1607-453e-8a7a-cd9d111e7d24-kube-api-access-lqtp9\") pod \"collect-profiles-29323395-b4hvr\" (UID: \"27b54e2f-1607-453e-8a7a-cd9d111e7d24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.460604 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr" Oct 02 11:15:00 crc kubenswrapper[4766]: I1002 11:15:00.930986 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr"] Oct 02 11:15:01 crc kubenswrapper[4766]: I1002 11:15:01.027191 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr" event={"ID":"27b54e2f-1607-453e-8a7a-cd9d111e7d24","Type":"ContainerStarted","Data":"c8e4affb70e8299446e2e314f0a623e4a53f40cb35ef67342b57fdb728a86bd0"} Oct 02 11:15:01 crc kubenswrapper[4766]: I1002 11:15:01.110284 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:15:01 crc kubenswrapper[4766]: I1002 11:15:01.145570 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:15:01 crc kubenswrapper[4766]: I1002 11:15:01.199617 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6dd4f44f78-54h7c"] Oct 02 11:15:01 crc kubenswrapper[4766]: I1002 11:15:01.199800 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6dd4f44f78-54h7c" podUID="9a967c35-c726-4b3a-ad92-80a11601ecaa" containerName="barbican-api-log" containerID="cri-o://f571c2a1d52270793fac8f73616909327551438103fbb8ba14f3598e3035521b" gracePeriod=30 Oct 02 11:15:01 crc kubenswrapper[4766]: I1002 11:15:01.199945 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6dd4f44f78-54h7c" podUID="9a967c35-c726-4b3a-ad92-80a11601ecaa" containerName="barbican-api" containerID="cri-o://af6392219af0c458e6a3ef494ba25c490c3ca60d53409167568566209b195c0f" gracePeriod=30 Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.038342 4766 generic.go:334] "Generic (PLEG): container finished" podID="9a967c35-c726-4b3a-ad92-80a11601ecaa" containerID="f571c2a1d52270793fac8f73616909327551438103fbb8ba14f3598e3035521b" exitCode=143 Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.038402 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd4f44f78-54h7c" 
event={"ID":"9a967c35-c726-4b3a-ad92-80a11601ecaa","Type":"ContainerDied","Data":"f571c2a1d52270793fac8f73616909327551438103fbb8ba14f3598e3035521b"} Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.040661 4766 generic.go:334] "Generic (PLEG): container finished" podID="27b54e2f-1607-453e-8a7a-cd9d111e7d24" containerID="90183a3a44b60d7ca321c396aec23d59060319163bf7c3bc6b2f96bc5fbbbc4f" exitCode=0 Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.040716 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr" event={"ID":"27b54e2f-1607-453e-8a7a-cd9d111e7d24","Type":"ContainerDied","Data":"90183a3a44b60d7ca321c396aec23d59060319163bf7c3bc6b2f96bc5fbbbc4f"} Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.174872 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.180196 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.185744 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.185744 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.191100 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-2ldv7" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.204898 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47szb\" (UniqueName: \"kubernetes.io/projected/8f861ddb-aafd-4a34-9796-313443c78050-kube-api-access-47szb\") pod \"openstackclient\" (UID: \"8f861ddb-aafd-4a34-9796-313443c78050\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.204965 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f861ddb-aafd-4a34-9796-313443c78050-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8f861ddb-aafd-4a34-9796-313443c78050\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.205015 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8f861ddb-aafd-4a34-9796-313443c78050-openstack-config\") pod \"openstackclient\" (UID: \"8f861ddb-aafd-4a34-9796-313443c78050\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.205115 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8f861ddb-aafd-4a34-9796-313443c78050-openstack-config-secret\") pod \"openstackclient\" (UID: \"8f861ddb-aafd-4a34-9796-313443c78050\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.207009 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.307163 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47szb\" (UniqueName: 
\"kubernetes.io/projected/8f861ddb-aafd-4a34-9796-313443c78050-kube-api-access-47szb\") pod \"openstackclient\" (UID: \"8f861ddb-aafd-4a34-9796-313443c78050\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.308296 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f861ddb-aafd-4a34-9796-313443c78050-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8f861ddb-aafd-4a34-9796-313443c78050\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.308382 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8f861ddb-aafd-4a34-9796-313443c78050-openstack-config\") pod \"openstackclient\" (UID: \"8f861ddb-aafd-4a34-9796-313443c78050\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.308601 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8f861ddb-aafd-4a34-9796-313443c78050-openstack-config-secret\") pod \"openstackclient\" (UID: \"8f861ddb-aafd-4a34-9796-313443c78050\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.309469 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8f861ddb-aafd-4a34-9796-313443c78050-openstack-config\") pod \"openstackclient\" (UID: \"8f861ddb-aafd-4a34-9796-313443c78050\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.316481 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f861ddb-aafd-4a34-9796-313443c78050-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8f861ddb-aafd-4a34-9796-313443c78050\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.318100 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8f861ddb-aafd-4a34-9796-313443c78050-openstack-config-secret\") pod \"openstackclient\" (UID: \"8f861ddb-aafd-4a34-9796-313443c78050\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.334115 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47szb\" (UniqueName: \"kubernetes.io/projected/8f861ddb-aafd-4a34-9796-313443c78050-kube-api-access-47szb\") pod \"openstackclient\" (UID: \"8f861ddb-aafd-4a34-9796-313443c78050\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.522467 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.534481 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.576794 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.594196 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.595700 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.612320 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d896308d-0b8a-4cfc-ad92-311521c2e417-openstack-config-secret\") pod \"openstackclient\" (UID: \"d896308d-0b8a-4cfc-ad92-311521c2e417\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.612432 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d896308d-0b8a-4cfc-ad92-311521c2e417-openstack-config\") pod \"openstackclient\" (UID: \"d896308d-0b8a-4cfc-ad92-311521c2e417\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.612475 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d896308d-0b8a-4cfc-ad92-311521c2e417-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d896308d-0b8a-4cfc-ad92-311521c2e417\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.612491 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvg8r\" (UniqueName: \"kubernetes.io/projected/d896308d-0b8a-4cfc-ad92-311521c2e417-kube-api-access-pvg8r\") pod \"openstackclient\" (UID: \"d896308d-0b8a-4cfc-ad92-311521c2e417\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.624426 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 11:15:02 crc kubenswrapper[4766]: E1002 11:15:02.627919 4766 log.go:32] "RunPodSandbox from runtime service failed" err=< Oct 02 11:15:02 crc kubenswrapper[4766]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_8f861ddb-aafd-4a34-9796-313443c78050_0(bb221e1ada9b8e0c03977d70eb45fdfb7e4572e2afe04f416a7deab00cdad03f): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"bb221e1ada9b8e0c03977d70eb45fdfb7e4572e2afe04f416a7deab00cdad03f" Netns:"/var/run/netns/f877c97a-466e-4f45-a452-1c90716cb162" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=bb221e1ada9b8e0c03977d70eb45fdfb7e4572e2afe04f416a7deab00cdad03f;K8S_POD_UID=8f861ddb-aafd-4a34-9796-313443c78050" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/8f861ddb-aafd-4a34-9796-313443c78050]: expected pod UID "8f861ddb-aafd-4a34-9796-313443c78050" but got "d896308d-0b8a-4cfc-ad92-311521c2e417" from Kube API Oct 02 11:15:02 crc kubenswrapper[4766]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 02 11:15:02 crc kubenswrapper[4766]: > Oct 02 11:15:02 crc kubenswrapper[4766]: E1002 11:15:02.627971 4766 kuberuntime_sandbox.go:72] 
"Failed to create sandbox for pod" err=< Oct 02 11:15:02 crc kubenswrapper[4766]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_8f861ddb-aafd-4a34-9796-313443c78050_0(bb221e1ada9b8e0c03977d70eb45fdfb7e4572e2afe04f416a7deab00cdad03f): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"bb221e1ada9b8e0c03977d70eb45fdfb7e4572e2afe04f416a7deab00cdad03f" Netns:"/var/run/netns/f877c97a-466e-4f45-a452-1c90716cb162" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=bb221e1ada9b8e0c03977d70eb45fdfb7e4572e2afe04f416a7deab00cdad03f;K8S_POD_UID=8f861ddb-aafd-4a34-9796-313443c78050" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/8f861ddb-aafd-4a34-9796-313443c78050]: expected pod UID "8f861ddb-aafd-4a34-9796-313443c78050" but got "d896308d-0b8a-4cfc-ad92-311521c2e417" from Kube API Oct 02 11:15:02 crc kubenswrapper[4766]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 02 11:15:02 crc kubenswrapper[4766]: > pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.714099 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d896308d-0b8a-4cfc-ad92-311521c2e417-openstack-config-secret\") pod \"openstackclient\" (UID: \"d896308d-0b8a-4cfc-ad92-311521c2e417\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.714223 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d896308d-0b8a-4cfc-ad92-311521c2e417-openstack-config\") pod \"openstackclient\" (UID: \"d896308d-0b8a-4cfc-ad92-311521c2e417\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.714261 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d896308d-0b8a-4cfc-ad92-311521c2e417-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d896308d-0b8a-4cfc-ad92-311521c2e417\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.714282 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvg8r\" (UniqueName: \"kubernetes.io/projected/d896308d-0b8a-4cfc-ad92-311521c2e417-kube-api-access-pvg8r\") pod \"openstackclient\" (UID: \"d896308d-0b8a-4cfc-ad92-311521c2e417\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.715326 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d896308d-0b8a-4cfc-ad92-311521c2e417-openstack-config\") pod \"openstackclient\" (UID: \"d896308d-0b8a-4cfc-ad92-311521c2e417\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.718369 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d896308d-0b8a-4cfc-ad92-311521c2e417-openstack-config-secret\") pod \"openstackclient\" (UID: \"d896308d-0b8a-4cfc-ad92-311521c2e417\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.719271 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d896308d-0b8a-4cfc-ad92-311521c2e417-combined-ca-bundle\") pod \"openstackclient\" (UID: \"d896308d-0b8a-4cfc-ad92-311521c2e417\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.735252 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvg8r\" (UniqueName: \"kubernetes.io/projected/d896308d-0b8a-4cfc-ad92-311521c2e417-kube-api-access-pvg8r\") pod \"openstackclient\" (UID: \"d896308d-0b8a-4cfc-ad92-311521c2e417\") " pod="openstack/openstackclient" Oct 02 11:15:02 crc kubenswrapper[4766]: I1002 11:15:02.987027 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.048790 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.054109 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8f861ddb-aafd-4a34-9796-313443c78050" podUID="d896308d-0b8a-4cfc-ad92-311521c2e417" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.067845 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.126686 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f861ddb-aafd-4a34-9796-313443c78050-combined-ca-bundle\") pod \"8f861ddb-aafd-4a34-9796-313443c78050\" (UID: \"8f861ddb-aafd-4a34-9796-313443c78050\") " Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.126835 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47szb\" (UniqueName: \"kubernetes.io/projected/8f861ddb-aafd-4a34-9796-313443c78050-kube-api-access-47szb\") pod \"8f861ddb-aafd-4a34-9796-313443c78050\" (UID: \"8f861ddb-aafd-4a34-9796-313443c78050\") " Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.126871 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8f861ddb-aafd-4a34-9796-313443c78050-openstack-config-secret\") pod \"8f861ddb-aafd-4a34-9796-313443c78050\" (UID: \"8f861ddb-aafd-4a34-9796-313443c78050\") " Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.126891 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8f861ddb-aafd-4a34-9796-313443c78050-openstack-config\") pod \"8f861ddb-aafd-4a34-9796-313443c78050\" (UID: \"8f861ddb-aafd-4a34-9796-313443c78050\") " Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.127860 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f861ddb-aafd-4a34-9796-313443c78050-openstack-config" (OuterVolumeSpecName: "openstack-config") pod 
"8f861ddb-aafd-4a34-9796-313443c78050" (UID: "8f861ddb-aafd-4a34-9796-313443c78050"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.131897 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f861ddb-aafd-4a34-9796-313443c78050-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f861ddb-aafd-4a34-9796-313443c78050" (UID: "8f861ddb-aafd-4a34-9796-313443c78050"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.135683 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f861ddb-aafd-4a34-9796-313443c78050-kube-api-access-47szb" (OuterVolumeSpecName: "kube-api-access-47szb") pod "8f861ddb-aafd-4a34-9796-313443c78050" (UID: "8f861ddb-aafd-4a34-9796-313443c78050"). InnerVolumeSpecName "kube-api-access-47szb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.136325 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f861ddb-aafd-4a34-9796-313443c78050-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "8f861ddb-aafd-4a34-9796-313443c78050" (UID: "8f861ddb-aafd-4a34-9796-313443c78050"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.229582 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47szb\" (UniqueName: \"kubernetes.io/projected/8f861ddb-aafd-4a34-9796-313443c78050-kube-api-access-47szb\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.229626 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8f861ddb-aafd-4a34-9796-313443c78050-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.229649 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8f861ddb-aafd-4a34-9796-313443c78050-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.229662 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f861ddb-aafd-4a34-9796-313443c78050-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.522559 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.609769 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 11:15:03 crc kubenswrapper[4766]: W1002 11:15:03.624044 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd896308d_0b8a_4cfc_ad92_311521c2e417.slice/crio-34d6cc2563722ea2b1a33dc39c1c3a11b981b8f14f3c5972239a75124ceb4cef WatchSource:0}: Error finding container 34d6cc2563722ea2b1a33dc39c1c3a11b981b8f14f3c5972239a75124ceb4cef: Status 404 returned error can't find the container with id 34d6cc2563722ea2b1a33dc39c1c3a11b981b8f14f3c5972239a75124ceb4cef Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.636270 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27b54e2f-1607-453e-8a7a-cd9d111e7d24-config-volume\") pod \"27b54e2f-1607-453e-8a7a-cd9d111e7d24\" (UID: \"27b54e2f-1607-453e-8a7a-cd9d111e7d24\") " Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.638289 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27b54e2f-1607-453e-8a7a-cd9d111e7d24-config-volume" (OuterVolumeSpecName: "config-volume") pod "27b54e2f-1607-453e-8a7a-cd9d111e7d24" (UID: "27b54e2f-1607-453e-8a7a-cd9d111e7d24"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.638608 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27b54e2f-1607-453e-8a7a-cd9d111e7d24-secret-volume\") pod \"27b54e2f-1607-453e-8a7a-cd9d111e7d24\" (UID: \"27b54e2f-1607-453e-8a7a-cd9d111e7d24\") " Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.639047 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqtp9\" (UniqueName: \"kubernetes.io/projected/27b54e2f-1607-453e-8a7a-cd9d111e7d24-kube-api-access-lqtp9\") pod \"27b54e2f-1607-453e-8a7a-cd9d111e7d24\" (UID: \"27b54e2f-1607-453e-8a7a-cd9d111e7d24\") " Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.639848 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27b54e2f-1607-453e-8a7a-cd9d111e7d24-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.643154 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27b54e2f-1607-453e-8a7a-cd9d111e7d24-kube-api-access-lqtp9" (OuterVolumeSpecName: "kube-api-access-lqtp9") pod "27b54e2f-1607-453e-8a7a-cd9d111e7d24" (UID: "27b54e2f-1607-453e-8a7a-cd9d111e7d24"). InnerVolumeSpecName "kube-api-access-lqtp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.644187 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b54e2f-1607-453e-8a7a-cd9d111e7d24-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "27b54e2f-1607-453e-8a7a-cd9d111e7d24" (UID: "27b54e2f-1607-453e-8a7a-cd9d111e7d24"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.741401 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/27b54e2f-1607-453e-8a7a-cd9d111e7d24-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.741437 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqtp9\" (UniqueName: \"kubernetes.io/projected/27b54e2f-1607-453e-8a7a-cd9d111e7d24-kube-api-access-lqtp9\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:03 crc kubenswrapper[4766]: I1002 11:15:03.895009 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f861ddb-aafd-4a34-9796-313443c78050" path="/var/lib/kubelet/pods/8f861ddb-aafd-4a34-9796-313443c78050/volumes" Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.059612 4766 generic.go:334] "Generic (PLEG): container finished" podID="12786f1e-db55-4668-8e43-afa080dc0fa2" containerID="867efe65a7b16f3f4c20bd8fbe635248271d0736d8757d4cc1dbab820099a089" exitCode=0 Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.059736 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s5j64" event={"ID":"12786f1e-db55-4668-8e43-afa080dc0fa2","Type":"ContainerDied","Data":"867efe65a7b16f3f4c20bd8fbe635248271d0736d8757d4cc1dbab820099a089"} Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.062721 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr" Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.062718 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr" event={"ID":"27b54e2f-1607-453e-8a7a-cd9d111e7d24","Type":"ContainerDied","Data":"c8e4affb70e8299446e2e314f0a623e4a53f40cb35ef67342b57fdb728a86bd0"} Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.062893 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8e4affb70e8299446e2e314f0a623e4a53f40cb35ef67342b57fdb728a86bd0" Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.064279 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d896308d-0b8a-4cfc-ad92-311521c2e417","Type":"ContainerStarted","Data":"34d6cc2563722ea2b1a33dc39c1c3a11b981b8f14f3c5972239a75124ceb4cef"} Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.064308 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.090802 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="8f861ddb-aafd-4a34-9796-313443c78050" podUID="d896308d-0b8a-4cfc-ad92-311521c2e417" Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.424160 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dd4f44f78-54h7c" podUID="9a967c35-c726-4b3a-ad92-80a11601ecaa" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.147:9311/healthcheck\": read tcp 10.217.0.2:46278->10.217.0.147:9311: read: connection reset by peer" Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.424193 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6dd4f44f78-54h7c" podUID="9a967c35-c726-4b3a-ad92-80a11601ecaa" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.147:9311/healthcheck\": read tcp 10.217.0.2:46276->10.217.0.147:9311: read: connection reset by peer" Oct 02 11:15:04 crc kubenswrapper[4766]: E1002 11:15:04.678198 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a967c35_c726_4b3a_ad92_80a11601ecaa.slice/crio-conmon-af6392219af0c458e6a3ef494ba25c490c3ca60d53409167568566209b195c0f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a967c35_c726_4b3a_ad92_80a11601ecaa.slice/crio-af6392219af0c458e6a3ef494ba25c490c3ca60d53409167568566209b195c0f.scope\": RecentStats: unable to find data in memory cache]" Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.934340 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.966416 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-combined-ca-bundle\") pod \"9a967c35-c726-4b3a-ad92-80a11601ecaa\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.966527 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-config-data-custom\") pod \"9a967c35-c726-4b3a-ad92-80a11601ecaa\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.966686 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-config-data\") pod \"9a967c35-c726-4b3a-ad92-80a11601ecaa\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.966725 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a967c35-c726-4b3a-ad92-80a11601ecaa-logs\") pod \"9a967c35-c726-4b3a-ad92-80a11601ecaa\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.966823 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgbhm\" (UniqueName: \"kubernetes.io/projected/9a967c35-c726-4b3a-ad92-80a11601ecaa-kube-api-access-lgbhm\") pod \"9a967c35-c726-4b3a-ad92-80a11601ecaa\" (UID: \"9a967c35-c726-4b3a-ad92-80a11601ecaa\") " Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.978430 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a967c35-c726-4b3a-ad92-80a11601ecaa-logs" (OuterVolumeSpecName: "logs") pod "9a967c35-c726-4b3a-ad92-80a11601ecaa" (UID: "9a967c35-c726-4b3a-ad92-80a11601ecaa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:04 crc kubenswrapper[4766]: I1002 11:15:04.988580 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a967c35-c726-4b3a-ad92-80a11601ecaa-kube-api-access-lgbhm" (OuterVolumeSpecName: "kube-api-access-lgbhm") pod "9a967c35-c726-4b3a-ad92-80a11601ecaa" (UID: "9a967c35-c726-4b3a-ad92-80a11601ecaa"). InnerVolumeSpecName "kube-api-access-lgbhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.014306 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9a967c35-c726-4b3a-ad92-80a11601ecaa" (UID: "9a967c35-c726-4b3a-ad92-80a11601ecaa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.019691 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a967c35-c726-4b3a-ad92-80a11601ecaa" (UID: "9a967c35-c726-4b3a-ad92-80a11601ecaa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.038335 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-config-data" (OuterVolumeSpecName: "config-data") pod "9a967c35-c726-4b3a-ad92-80a11601ecaa" (UID: "9a967c35-c726-4b3a-ad92-80a11601ecaa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.072890 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.072936 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a967c35-c726-4b3a-ad92-80a11601ecaa-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.072947 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgbhm\" (UniqueName: \"kubernetes.io/projected/9a967c35-c726-4b3a-ad92-80a11601ecaa-kube-api-access-lgbhm\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.072960 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.072968 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9a967c35-c726-4b3a-ad92-80a11601ecaa-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.077972 4766 generic.go:334] "Generic (PLEG): container finished" podID="9a967c35-c726-4b3a-ad92-80a11601ecaa" containerID="af6392219af0c458e6a3ef494ba25c490c3ca60d53409167568566209b195c0f" exitCode=0 Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.078267 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6dd4f44f78-54h7c" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.079122 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd4f44f78-54h7c" event={"ID":"9a967c35-c726-4b3a-ad92-80a11601ecaa","Type":"ContainerDied","Data":"af6392219af0c458e6a3ef494ba25c490c3ca60d53409167568566209b195c0f"} Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.079195 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6dd4f44f78-54h7c" event={"ID":"9a967c35-c726-4b3a-ad92-80a11601ecaa","Type":"ContainerDied","Data":"31b27a5fd6fd71563624e1880621c31ea312ab1cbc0a547d121cbd0e8718686b"} Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.079215 4766 scope.go:117] "RemoveContainer" containerID="af6392219af0c458e6a3ef494ba25c490c3ca60d53409167568566209b195c0f" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.121949 4766 scope.go:117] "RemoveContainer" containerID="f571c2a1d52270793fac8f73616909327551438103fbb8ba14f3598e3035521b" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.123604 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6dd4f44f78-54h7c"] Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.131988 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6dd4f44f78-54h7c"] Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.140358 4766 scope.go:117] "RemoveContainer" containerID="af6392219af0c458e6a3ef494ba25c490c3ca60d53409167568566209b195c0f" Oct 02 11:15:05 crc kubenswrapper[4766]: E1002 11:15:05.140873 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af6392219af0c458e6a3ef494ba25c490c3ca60d53409167568566209b195c0f\": container with ID starting with af6392219af0c458e6a3ef494ba25c490c3ca60d53409167568566209b195c0f not found: ID does not exist" containerID="af6392219af0c458e6a3ef494ba25c490c3ca60d53409167568566209b195c0f" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.140920 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af6392219af0c458e6a3ef494ba25c490c3ca60d53409167568566209b195c0f"} err="failed to get container status \"af6392219af0c458e6a3ef494ba25c490c3ca60d53409167568566209b195c0f\": rpc error: code = NotFound desc = could not find container \"af6392219af0c458e6a3ef494ba25c490c3ca60d53409167568566209b195c0f\": container with ID starting with af6392219af0c458e6a3ef494ba25c490c3ca60d53409167568566209b195c0f not found: ID does not exist" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.140951 4766 scope.go:117] "RemoveContainer" containerID="f571c2a1d52270793fac8f73616909327551438103fbb8ba14f3598e3035521b" Oct 02 11:15:05 crc kubenswrapper[4766]: E1002 11:15:05.141686 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f571c2a1d52270793fac8f73616909327551438103fbb8ba14f3598e3035521b\": container with ID starting with f571c2a1d52270793fac8f73616909327551438103fbb8ba14f3598e3035521b not found: ID does not exist" containerID="f571c2a1d52270793fac8f73616909327551438103fbb8ba14f3598e3035521b" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.141745 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f571c2a1d52270793fac8f73616909327551438103fbb8ba14f3598e3035521b"} err="failed to get container status 
\"f571c2a1d52270793fac8f73616909327551438103fbb8ba14f3598e3035521b\": rpc error: code = NotFound desc = could not find container \"f571c2a1d52270793fac8f73616909327551438103fbb8ba14f3598e3035521b\": container with ID starting with f571c2a1d52270793fac8f73616909327551438103fbb8ba14f3598e3035521b not found: ID does not exist" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.396458 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s5j64" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.480608 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-combined-ca-bundle\") pod \"12786f1e-db55-4668-8e43-afa080dc0fa2\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.480869 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12786f1e-db55-4668-8e43-afa080dc0fa2-etc-machine-id\") pod \"12786f1e-db55-4668-8e43-afa080dc0fa2\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.480905 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-db-sync-config-data\") pod \"12786f1e-db55-4668-8e43-afa080dc0fa2\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.480983 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-scripts\") pod \"12786f1e-db55-4668-8e43-afa080dc0fa2\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.481063 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m79rr\" (UniqueName: \"kubernetes.io/projected/12786f1e-db55-4668-8e43-afa080dc0fa2-kube-api-access-m79rr\") pod \"12786f1e-db55-4668-8e43-afa080dc0fa2\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.481116 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-config-data\") pod \"12786f1e-db55-4668-8e43-afa080dc0fa2\" (UID: \"12786f1e-db55-4668-8e43-afa080dc0fa2\") " Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.482265 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12786f1e-db55-4668-8e43-afa080dc0fa2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "12786f1e-db55-4668-8e43-afa080dc0fa2" (UID: "12786f1e-db55-4668-8e43-afa080dc0fa2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.486751 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "12786f1e-db55-4668-8e43-afa080dc0fa2" (UID: "12786f1e-db55-4668-8e43-afa080dc0fa2"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.486833 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12786f1e-db55-4668-8e43-afa080dc0fa2-kube-api-access-m79rr" (OuterVolumeSpecName: "kube-api-access-m79rr") pod "12786f1e-db55-4668-8e43-afa080dc0fa2" (UID: "12786f1e-db55-4668-8e43-afa080dc0fa2"). InnerVolumeSpecName "kube-api-access-m79rr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.487697 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-scripts" (OuterVolumeSpecName: "scripts") pod "12786f1e-db55-4668-8e43-afa080dc0fa2" (UID: "12786f1e-db55-4668-8e43-afa080dc0fa2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.510488 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12786f1e-db55-4668-8e43-afa080dc0fa2" (UID: "12786f1e-db55-4668-8e43-afa080dc0fa2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.554881 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-config-data" (OuterVolumeSpecName: "config-data") pod "12786f1e-db55-4668-8e43-afa080dc0fa2" (UID: "12786f1e-db55-4668-8e43-afa080dc0fa2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.583650 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.583688 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m79rr\" (UniqueName: \"kubernetes.io/projected/12786f1e-db55-4668-8e43-afa080dc0fa2-kube-api-access-m79rr\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.583703 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.583713 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.583724 4766 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12786f1e-db55-4668-8e43-afa080dc0fa2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.583733 4766 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/12786f1e-db55-4668-8e43-afa080dc0fa2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:05 crc kubenswrapper[4766]: I1002 11:15:05.894145 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9a967c35-c726-4b3a-ad92-80a11601ecaa" path="/var/lib/kubelet/pods/9a967c35-c726-4b3a-ad92-80a11601ecaa/volumes" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.093219 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-s5j64" event={"ID":"12786f1e-db55-4668-8e43-afa080dc0fa2","Type":"ContainerDied","Data":"17eb98bc53a15ec1cfa13fd3ee84be9351bd857e450c1f79a5d5f052cfe92cd6"} Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.094055 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17eb98bc53a15ec1cfa13fd3ee84be9351bd857e450c1f79a5d5f052cfe92cd6" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.094208 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-s5j64" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.168420 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-67bd9fd99f-qbp28"] Oct 02 11:15:06 crc kubenswrapper[4766]: E1002 11:15:06.168858 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a967c35-c726-4b3a-ad92-80a11601ecaa" containerName="barbican-api" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.168876 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a967c35-c726-4b3a-ad92-80a11601ecaa" containerName="barbican-api" Oct 02 11:15:06 crc kubenswrapper[4766]: E1002 11:15:06.168890 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12786f1e-db55-4668-8e43-afa080dc0fa2" containerName="cinder-db-sync" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.168901 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="12786f1e-db55-4668-8e43-afa080dc0fa2" containerName="cinder-db-sync" Oct 02 11:15:06 crc kubenswrapper[4766]: E1002 11:15:06.168936 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a967c35-c726-4b3a-ad92-80a11601ecaa" containerName="barbican-api-log" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.168945 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a967c35-c726-4b3a-ad92-80a11601ecaa" containerName="barbican-api-log" Oct 02 11:15:06 crc kubenswrapper[4766]: E1002 11:15:06.168961 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27b54e2f-1607-453e-8a7a-cd9d111e7d24" containerName="collect-profiles" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.168968 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b54e2f-1607-453e-8a7a-cd9d111e7d24" containerName="collect-profiles" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.170001 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="12786f1e-db55-4668-8e43-afa080dc0fa2" containerName="cinder-db-sync" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.170037 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a967c35-c726-4b3a-ad92-80a11601ecaa" containerName="barbican-api-log" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.170058 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a967c35-c726-4b3a-ad92-80a11601ecaa" containerName="barbican-api" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.170075 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="27b54e2f-1607-453e-8a7a-cd9d111e7d24" containerName="collect-profiles" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.177192 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.184382 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.184436 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.184866 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.195837 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-67bd9fd99f-qbp28"] Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.269227 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.307048 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/44893df1-77c5-494c-bae0-253447abc8f4-etc-swift\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.307492 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-internal-tls-certs\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.307568 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-combined-ca-bundle\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.307605 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-public-tls-certs\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.307645 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44893df1-77c5-494c-bae0-253447abc8f4-run-httpd\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.307697 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44893df1-77c5-494c-bae0-253447abc8f4-log-httpd\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.307743 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stv6j\" (UniqueName: 
\"kubernetes.io/projected/44893df1-77c5-494c-bae0-253447abc8f4-kube-api-access-stv6j\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.307864 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-config-data\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.388492 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.397053 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.397603 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.416173 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.416463 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.416622 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-config-data\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.416684 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9bdhh" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.416704 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/44893df1-77c5-494c-bae0-253447abc8f4-etc-swift\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.416726 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-internal-tls-certs\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.416756 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-combined-ca-bundle\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.416781 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-public-tls-certs\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 
11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.416802 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44893df1-77c5-494c-bae0-253447abc8f4-run-httpd\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.416822 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44893df1-77c5-494c-bae0-253447abc8f4-log-httpd\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.416850 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stv6j\" (UniqueName: \"kubernetes.io/projected/44893df1-77c5-494c-bae0-253447abc8f4-kube-api-access-stv6j\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.417777 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.420349 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44893df1-77c5-494c-bae0-253447abc8f4-run-httpd\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.420670 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44893df1-77c5-494c-bae0-253447abc8f4-log-httpd\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.437535 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/44893df1-77c5-494c-bae0-253447abc8f4-etc-swift\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.441793 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-777dc6f59c-knpng"] Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.444092 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.448958 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-config-data\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.449159 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-public-tls-certs\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.453685 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-internal-tls-certs\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.454872 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-combined-ca-bundle\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.470649 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stv6j\" (UniqueName: \"kubernetes.io/projected/44893df1-77c5-494c-bae0-253447abc8f4-kube-api-access-stv6j\") pod \"swift-proxy-67bd9fd99f-qbp28\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.504127 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.515258 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.524037 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.524150 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-dns-svc\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.524184 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-ovsdbserver-nb\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.524248 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-dns-swift-storage-0\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.524272 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-scripts\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.524304 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5wlj\" (UniqueName: \"kubernetes.io/projected/77f4ba15-529f-4e71-a7ea-74848f8bb55e-kube-api-access-t5wlj\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.524329 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-ovsdbserver-sb\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.524563 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxlm9\" (UniqueName: \"kubernetes.io/projected/da3f12a0-0986-43aa-9727-60efa7d4a1f8-kube-api-access-nxlm9\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.524584 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-config-data\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.524700 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-config\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.524772 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da3f12a0-0986-43aa-9727-60efa7d4a1f8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.524808 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.540542 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-777dc6f59c-knpng"] Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.626645 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-config\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.626724 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da3f12a0-0986-43aa-9727-60efa7d4a1f8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.626764 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.626822 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.626849 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-dns-svc\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:06 crc kubenswrapper[4766]: 
I1002 11:15:06.626872 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-ovsdbserver-nb\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.626923 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-dns-swift-storage-0\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.626944 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-scripts\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.626966 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5wlj\" (UniqueName: \"kubernetes.io/projected/77f4ba15-529f-4e71-a7ea-74848f8bb55e-kube-api-access-t5wlj\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.626990 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-ovsdbserver-sb\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.627050 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxlm9\" (UniqueName: \"kubernetes.io/projected/da3f12a0-0986-43aa-9727-60efa7d4a1f8-kube-api-access-nxlm9\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.627071 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-config-data\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.629832 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-ovsdbserver-nb\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.629909 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-ovsdbserver-sb\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.630355 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 
11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.630603 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-dns-swift-storage-0\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.630808 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da3f12a0-0986-43aa-9727-60efa7d4a1f8-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.630927 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-config\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.631517 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-dns-svc\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.632466 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.635573 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.637262 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.638152 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-scripts\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.642976 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-config-data\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.644146 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.652962 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxlm9\" (UniqueName: \"kubernetes.io/projected/da3f12a0-0986-43aa-9727-60efa7d4a1f8-kube-api-access-nxlm9\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.656363 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5wlj\" (UniqueName: \"kubernetes.io/projected/77f4ba15-529f-4e71-a7ea-74848f8bb55e-kube-api-access-t5wlj\") pod \"dnsmasq-dns-777dc6f59c-knpng\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " pod="openstack/dnsmasq-dns-777dc6f59c-knpng"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.660901 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " pod="openstack/cinder-scheduler-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.730825 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp9kl\" (UniqueName: \"kubernetes.io/projected/69a16b06-e649-4c66-94e9-7cda4fb8c135-kube-api-access-sp9kl\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.730899 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69a16b06-e649-4c66-94e9-7cda4fb8c135-logs\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.730960 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-scripts\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.731129 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-config-data-custom\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.731379 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/69a16b06-e649-4c66-94e9-7cda4fb8c135-etc-machine-id\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.731629 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.731741 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-config-data\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.836976 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.837474 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-config-data\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.837552 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp9kl\" (UniqueName: \"kubernetes.io/projected/69a16b06-e649-4c66-94e9-7cda4fb8c135-kube-api-access-sp9kl\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.837599 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69a16b06-e649-4c66-94e9-7cda4fb8c135-logs\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.837636 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-scripts\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.837670 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-config-data-custom\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.837766 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/69a16b06-e649-4c66-94e9-7cda4fb8c135-etc-machine-id\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.837897 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/69a16b06-e649-4c66-94e9-7cda4fb8c135-etc-machine-id\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.839283 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69a16b06-e649-4c66-94e9-7cda4fb8c135-logs\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.843091 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.844195 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-config-data\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.846985 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-scripts\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.852235 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-config-data-custom\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.856964 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-777dc6f59c-knpng"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.859230 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:06 crc kubenswrapper[4766]: I1002 11:15:06.871496 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp9kl\" (UniqueName: \"kubernetes.io/projected/69a16b06-e649-4c66-94e9-7cda4fb8c135-kube-api-access-sp9kl\") pod \"cinder-api-0\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") " pod="openstack/cinder-api-0"
Oct 02 11:15:07 crc kubenswrapper[4766]: I1002 11:15:07.047127 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Oct 02 11:15:07 crc kubenswrapper[4766]: I1002 11:15:07.260222 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-67bd9fd99f-qbp28"]
Oct 02 11:15:07 crc kubenswrapper[4766]: I1002 11:15:07.366409 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 11:15:07 crc kubenswrapper[4766]: W1002 11:15:07.381293 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda3f12a0_0986_43aa_9727_60efa7d4a1f8.slice/crio-f387f81b7cfa01b5813ca8df45f25272fb13c07c3c0a34158357a9a850e15e87 WatchSource:0}: Error finding container f387f81b7cfa01b5813ca8df45f25272fb13c07c3c0a34158357a9a850e15e87: Status 404 returned error can't find the container with id f387f81b7cfa01b5813ca8df45f25272fb13c07c3c0a34158357a9a850e15e87
Oct 02 11:15:07 crc kubenswrapper[4766]: I1002 11:15:07.509752 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-777dc6f59c-knpng"]
Oct 02 11:15:07 crc kubenswrapper[4766]: W1002 11:15:07.512820 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77f4ba15_529f_4e71_a7ea_74848f8bb55e.slice/crio-49ba44bdbcc716377a5922d2c00211ef23b87e7caf90ae4c736b09db7a555903 WatchSource:0}: Error finding container 49ba44bdbcc716377a5922d2c00211ef23b87e7caf90ae4c736b09db7a555903: Status 404 returned error can't find the container with id 49ba44bdbcc716377a5922d2c00211ef23b87e7caf90ae4c736b09db7a555903
Oct 02 11:15:07 crc kubenswrapper[4766]: I1002 11:15:07.645081 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.147241 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da3f12a0-0986-43aa-9727-60efa7d4a1f8","Type":"ContainerStarted","Data":"f387f81b7cfa01b5813ca8df45f25272fb13c07c3c0a34158357a9a850e15e87"}
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.150600 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"69a16b06-e649-4c66-94e9-7cda4fb8c135","Type":"ContainerStarted","Data":"cb42efd32d1dadebc5d5a2e6e75f09e166e0f167a399a64addec663fce33b19b"}
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.153597 4766 generic.go:334] "Generic (PLEG): container finished" podID="77f4ba15-529f-4e71-a7ea-74848f8bb55e" containerID="94582f2be827091009f5a7b6293e09610ca21ff4e1deac7163cfafc66e363ae3" exitCode=0
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.153724 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-777dc6f59c-knpng" event={"ID":"77f4ba15-529f-4e71-a7ea-74848f8bb55e","Type":"ContainerDied","Data":"94582f2be827091009f5a7b6293e09610ca21ff4e1deac7163cfafc66e363ae3"}
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.153759 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-777dc6f59c-knpng" event={"ID":"77f4ba15-529f-4e71-a7ea-74848f8bb55e","Type":"ContainerStarted","Data":"49ba44bdbcc716377a5922d2c00211ef23b87e7caf90ae4c736b09db7a555903"}
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.159206 4766 generic.go:334] "Generic (PLEG): container finished" podID="1a084ae6-94ba-4057-adcf-5d3d9b92c9ae" containerID="7b6649db0c3371deb879753f751b0d38b0339189eda5edb4f0fbad5ed847bc49" exitCode=0
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.159296 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-k4x48" event={"ID":"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae","Type":"ContainerDied","Data":"7b6649db0c3371deb879753f751b0d38b0339189eda5edb4f0fbad5ed847bc49"}
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.162717 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67bd9fd99f-qbp28" event={"ID":"44893df1-77c5-494c-bae0-253447abc8f4","Type":"ContainerStarted","Data":"15c74c8e7a8896b2165a6a68dd5ee3e1b21f3fcb860feba3289044c93dbf1f19"}
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.162750 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67bd9fd99f-qbp28" event={"ID":"44893df1-77c5-494c-bae0-253447abc8f4","Type":"ContainerStarted","Data":"ff59b3f5c87557d99a67228679782023c256d54647b29c27032c95dfbc2bd77b"}
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.162765 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67bd9fd99f-qbp28" event={"ID":"44893df1-77c5-494c-bae0-253447abc8f4","Type":"ContainerStarted","Data":"a6e259f88b512ed89d5740fea0631a92320aab896a4a089d15e2abdb5846627e"}
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.163395 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-67bd9fd99f-qbp28"
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.163431 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-67bd9fd99f-qbp28"
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.225335 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-67bd9fd99f-qbp28" podStartSLOduration=2.225310886 podStartE2EDuration="2.225310886s" podCreationTimestamp="2025-10-02 11:15:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:08.215669668 +0000 UTC m=+1423.158540612" watchObservedRunningTime="2025-10-02 11:15:08.225310886 +0000 UTC m=+1423.168181830"
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.642948 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.644806 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" containerName="ceilometer-central-agent" containerID="cri-o://fbf3d7476613b9ea2f684cc297139f531d6c7a3b770ed4e9c33f6c9d44f72172" gracePeriod=30
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.646937 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" containerName="proxy-httpd" containerID="cri-o://bd2f7b3a96aa8ee91d830307f72f8b4bd4f41acf88b4ea16eda48c6b3f2872a5" gracePeriod=30
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.647123 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" containerName="sg-core" containerID="cri-o://533c21f74a053a585cf21d97ee7b64bc41253b81187145237393a28bd0d8db63" gracePeriod=30
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.647254 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" containerName="ceilometer-notification-agent" containerID="cri-o://5576c520748d5953e012dcd26c50c4ce01e261b6dd06db08507d51fc9740dc1f" gracePeriod=30
Oct 02 11:15:08 crc kubenswrapper[4766]: I1002 11:15:08.654436 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 02 11:15:09 crc kubenswrapper[4766]: I1002 11:15:09.202406 4766 generic.go:334] "Generic (PLEG): container finished" podID="acb47101-638b-42ba-aca0-96a9f81c1443" containerID="bd2f7b3a96aa8ee91d830307f72f8b4bd4f41acf88b4ea16eda48c6b3f2872a5" exitCode=0
Oct 02 11:15:09 crc kubenswrapper[4766]: I1002 11:15:09.202753 4766 generic.go:334] "Generic (PLEG): container finished" podID="acb47101-638b-42ba-aca0-96a9f81c1443" containerID="533c21f74a053a585cf21d97ee7b64bc41253b81187145237393a28bd0d8db63" exitCode=2
Oct 02 11:15:09 crc kubenswrapper[4766]: I1002 11:15:09.202823 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acb47101-638b-42ba-aca0-96a9f81c1443","Type":"ContainerDied","Data":"bd2f7b3a96aa8ee91d830307f72f8b4bd4f41acf88b4ea16eda48c6b3f2872a5"}
Oct 02 11:15:09 crc kubenswrapper[4766]: I1002 11:15:09.202849 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acb47101-638b-42ba-aca0-96a9f81c1443","Type":"ContainerDied","Data":"533c21f74a053a585cf21d97ee7b64bc41253b81187145237393a28bd0d8db63"}
Oct 02 11:15:09 crc kubenswrapper[4766]: I1002 11:15:09.215044 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da3f12a0-0986-43aa-9727-60efa7d4a1f8","Type":"ContainerStarted","Data":"ada1411166ef19e5906600e666b606d7f0baa1825ff3982a02212ccfa8e31381"}
Oct 02 11:15:09 crc kubenswrapper[4766]: I1002 11:15:09.221775 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"69a16b06-e649-4c66-94e9-7cda4fb8c135","Type":"ContainerStarted","Data":"4c673cf07b5afbfa3104c07fa205845d5ee675d264763c597c7f6688aad6e342"}
Oct 02 11:15:09 crc kubenswrapper[4766]: I1002 11:15:09.224362 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-777dc6f59c-knpng" event={"ID":"77f4ba15-529f-4e71-a7ea-74848f8bb55e","Type":"ContainerStarted","Data":"39835df239f044b69e0720445d53580e2d8946352714d801c68ca413d3cbd38c"}
Oct 02 11:15:09 crc kubenswrapper[4766]: I1002 11:15:09.252451 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-777dc6f59c-knpng" podStartSLOduration=3.252431372 podStartE2EDuration="3.252431372s" podCreationTimestamp="2025-10-02 11:15:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:09.243976921 +0000 UTC m=+1424.186847865" watchObservedRunningTime="2025-10-02 11:15:09.252431372 +0000 UTC m=+1424.195302316"
Oct 02 11:15:09 crc kubenswrapper[4766]: I1002 11:15:09.652627 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.075958 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-k4x48"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.142549 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-combined-ca-bundle\") pod \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\" (UID: \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\") "
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.142606 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-db-sync-config-data\") pod \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\" (UID: \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\") "
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.142741 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nfjl\" (UniqueName: \"kubernetes.io/projected/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-kube-api-access-8nfjl\") pod \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\" (UID: \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\") "
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.142777 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-config-data\") pod \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\" (UID: \"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae\") "
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.156983 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1a084ae6-94ba-4057-adcf-5d3d9b92c9ae" (UID: "1a084ae6-94ba-4057-adcf-5d3d9b92c9ae"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.160899 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-kube-api-access-8nfjl" (OuterVolumeSpecName: "kube-api-access-8nfjl") pod "1a084ae6-94ba-4057-adcf-5d3d9b92c9ae" (UID: "1a084ae6-94ba-4057-adcf-5d3d9b92c9ae"). InnerVolumeSpecName "kube-api-access-8nfjl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.225746 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a084ae6-94ba-4057-adcf-5d3d9b92c9ae" (UID: "1a084ae6-94ba-4057-adcf-5d3d9b92c9ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.245449 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.245489 4766 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.245518 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nfjl\" (UniqueName: \"kubernetes.io/projected/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-kube-api-access-8nfjl\") on node \"crc\" DevicePath \"\""
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.247402 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"69a16b06-e649-4c66-94e9-7cda4fb8c135","Type":"ContainerStarted","Data":"72397d872ce507fa92e345865588eec799e5986148feeabfabfecfa3853620de"}
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.247707 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="69a16b06-e649-4c66-94e9-7cda4fb8c135" containerName="cinder-api-log" containerID="cri-o://4c673cf07b5afbfa3104c07fa205845d5ee675d264763c597c7f6688aad6e342" gracePeriod=30
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.248145 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="69a16b06-e649-4c66-94e9-7cda4fb8c135" containerName="cinder-api" containerID="cri-o://72397d872ce507fa92e345865588eec799e5986148feeabfabfecfa3853620de" gracePeriod=30
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.251855 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-config-data" (OuterVolumeSpecName: "config-data") pod "1a084ae6-94ba-4057-adcf-5d3d9b92c9ae" (UID: "1a084ae6-94ba-4057-adcf-5d3d9b92c9ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.257069 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-k4x48" event={"ID":"1a084ae6-94ba-4057-adcf-5d3d9b92c9ae","Type":"ContainerDied","Data":"0f04f1165e5f07cde4fef940ddaf54ee09af0ec060811699110af58b65ea69c6"}
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.257109 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f04f1165e5f07cde4fef940ddaf54ee09af0ec060811699110af58b65ea69c6"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.257160 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-k4x48"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.273425 4766 generic.go:334] "Generic (PLEG): container finished" podID="acb47101-638b-42ba-aca0-96a9f81c1443" containerID="fbf3d7476613b9ea2f684cc297139f531d6c7a3b770ed4e9c33f6c9d44f72172" exitCode=0
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.273480 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acb47101-638b-42ba-aca0-96a9f81c1443","Type":"ContainerDied","Data":"fbf3d7476613b9ea2f684cc297139f531d6c7a3b770ed4e9c33f6c9d44f72172"}
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.283072 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da3f12a0-0986-43aa-9727-60efa7d4a1f8","Type":"ContainerStarted","Data":"052955713ff3cdfa26c7f727ee98ce52ee226feaa68f7142ccdb9ee7508fe5f3"}
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.283330 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-777dc6f59c-knpng"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.283883 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.283867016 podStartE2EDuration="4.283867016s" podCreationTimestamp="2025-10-02 11:15:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:10.268900467 +0000 UTC m=+1425.211771431" watchObservedRunningTime="2025-10-02 11:15:10.283867016 +0000 UTC m=+1425.226737960"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.310089 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.580066984 podStartE2EDuration="4.310066204s" podCreationTimestamp="2025-10-02 11:15:06 +0000 UTC" firstStartedPulling="2025-10-02 11:15:07.388690278 +0000 UTC m=+1422.331561222" lastFinishedPulling="2025-10-02 11:15:08.118689498 +0000 UTC m=+1423.061560442" observedRunningTime="2025-10-02 11:15:10.306453798 +0000 UTC m=+1425.249324742" watchObservedRunningTime="2025-10-02 11:15:10.310066204 +0000 UTC m=+1425.252937148"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.348729 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.601717 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-777dc6f59c-knpng"]
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.627249 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-99bm5"]
Oct 02 11:15:10 crc kubenswrapper[4766]: E1002 11:15:10.629333 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a084ae6-94ba-4057-adcf-5d3d9b92c9ae" containerName="glance-db-sync"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.629362 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a084ae6-94ba-4057-adcf-5d3d9b92c9ae" containerName="glance-db-sync"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.629675 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a084ae6-94ba-4057-adcf-5d3d9b92c9ae" containerName="glance-db-sync"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.631162 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.674667 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-99bm5"]
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.762750 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.762838 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbhxb\" (UniqueName: \"kubernetes.io/projected/de3d498f-4d4b-453a-80ae-bf1456505ba3-kube-api-access-kbhxb\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.762894 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.762940 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.762970 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-config\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.763069 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.865291 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.865356 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.865408 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbhxb\" (UniqueName: \"kubernetes.io/projected/de3d498f-4d4b-453a-80ae-bf1456505ba3-kube-api-access-kbhxb\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.865467 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.865536 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.865557 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-config\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.866565 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-config\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.867175 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-ovsdbserver-sb\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.867694 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-ovsdbserver-nb\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.869071 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-dns-svc\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.869391 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-dns-swift-storage-0\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.892154 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbhxb\" (UniqueName: \"kubernetes.io/projected/de3d498f-4d4b-453a-80ae-bf1456505ba3-kube-api-access-kbhxb\") pod \"dnsmasq-dns-69c986f6d7-99bm5\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") " pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.983594 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 11:15:10 crc kubenswrapper[4766]: I1002 11:15:10.985230 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-99bm5"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.069191 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-scripts\") pod \"acb47101-638b-42ba-aca0-96a9f81c1443\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") "
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.069274 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-config-data\") pod \"acb47101-638b-42ba-aca0-96a9f81c1443\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") "
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.069410 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acb47101-638b-42ba-aca0-96a9f81c1443-run-httpd\") pod \"acb47101-638b-42ba-aca0-96a9f81c1443\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") "
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.069586 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-sg-core-conf-yaml\") pod \"acb47101-638b-42ba-aca0-96a9f81c1443\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") "
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.069618 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-combined-ca-bundle\") pod \"acb47101-638b-42ba-aca0-96a9f81c1443\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") "
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.069691 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acb47101-638b-42ba-aca0-96a9f81c1443-log-httpd\") pod \"acb47101-638b-42ba-aca0-96a9f81c1443\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") "
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.069773 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m79j\" (UniqueName: \"kubernetes.io/projected/acb47101-638b-42ba-aca0-96a9f81c1443-kube-api-access-5m79j\") pod \"acb47101-638b-42ba-aca0-96a9f81c1443\" (UID: \"acb47101-638b-42ba-aca0-96a9f81c1443\") "
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.074248 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acb47101-638b-42ba-aca0-96a9f81c1443-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "acb47101-638b-42ba-aca0-96a9f81c1443" (UID: "acb47101-638b-42ba-aca0-96a9f81c1443"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.075920 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acb47101-638b-42ba-aca0-96a9f81c1443-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "acb47101-638b-42ba-aca0-96a9f81c1443" (UID: "acb47101-638b-42ba-aca0-96a9f81c1443"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.076789 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb47101-638b-42ba-aca0-96a9f81c1443-kube-api-access-5m79j" (OuterVolumeSpecName: "kube-api-access-5m79j") pod "acb47101-638b-42ba-aca0-96a9f81c1443" (UID: "acb47101-638b-42ba-aca0-96a9f81c1443"). InnerVolumeSpecName "kube-api-access-5m79j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.079930 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-scripts" (OuterVolumeSpecName: "scripts") pod "acb47101-638b-42ba-aca0-96a9f81c1443" (UID: "acb47101-638b-42ba-aca0-96a9f81c1443"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.128329 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "acb47101-638b-42ba-aca0-96a9f81c1443" (UID: "acb47101-638b-42ba-aca0-96a9f81c1443"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.172339 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.172376 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acb47101-638b-42ba-aca0-96a9f81c1443-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.172390 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.172404 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/acb47101-638b-42ba-aca0-96a9f81c1443-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.172415 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m79j\" (UniqueName: \"kubernetes.io/projected/acb47101-638b-42ba-aca0-96a9f81c1443-kube-api-access-5m79j\") on node \"crc\" DevicePath \"\""
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.178470 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acb47101-638b-42ba-aca0-96a9f81c1443" (UID: "acb47101-638b-42ba-aca0-96a9f81c1443"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.225723 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-config-data" (OuterVolumeSpecName: "config-data") pod "acb47101-638b-42ba-aca0-96a9f81c1443" (UID: "acb47101-638b-42ba-aca0-96a9f81c1443"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.273821 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.273887 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acb47101-638b-42ba-aca0-96a9f81c1443-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.294451 4766 generic.go:334] "Generic (PLEG): container finished" podID="69a16b06-e649-4c66-94e9-7cda4fb8c135" containerID="4c673cf07b5afbfa3104c07fa205845d5ee675d264763c597c7f6688aad6e342" exitCode=143
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.294536 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"69a16b06-e649-4c66-94e9-7cda4fb8c135","Type":"ContainerDied","Data":"4c673cf07b5afbfa3104c07fa205845d5ee675d264763c597c7f6688aad6e342"}
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.298536 4766 generic.go:334] "Generic (PLEG): container finished" podID="acb47101-638b-42ba-aca0-96a9f81c1443" containerID="5576c520748d5953e012dcd26c50c4ce01e261b6dd06db08507d51fc9740dc1f" exitCode=0
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.299609 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.299813 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acb47101-638b-42ba-aca0-96a9f81c1443","Type":"ContainerDied","Data":"5576c520748d5953e012dcd26c50c4ce01e261b6dd06db08507d51fc9740dc1f"}
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.299853 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"acb47101-638b-42ba-aca0-96a9f81c1443","Type":"ContainerDied","Data":"bb86615d3d68699bdf7398bee61afe5b9d0ff618cdee7edfb16e75d98b5c909a"}
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.299875 4766 scope.go:117] "RemoveContainer" containerID="bd2f7b3a96aa8ee91d830307f72f8b4bd4f41acf88b4ea16eda48c6b3f2872a5"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.367360 4766 scope.go:117] "RemoveContainer" containerID="533c21f74a053a585cf21d97ee7b64bc41253b81187145237393a28bd0d8db63"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.380732 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.394981 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.403564 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.403697 4766 scope.go:117] "RemoveContainer" containerID="5576c520748d5953e012dcd26c50c4ce01e261b6dd06db08507d51fc9740dc1f"
Oct 02 11:15:11 crc kubenswrapper[4766]: E1002 11:15:11.403970 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" containerName="ceilometer-central-agent"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.403987 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" containerName="ceilometer-central-agent"
Oct 02 11:15:11 crc kubenswrapper[4766]: E1002 11:15:11.404014 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" containerName="sg-core"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.404020 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" containerName="sg-core"
Oct 02 11:15:11 crc kubenswrapper[4766]: E1002 11:15:11.404046 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" containerName="ceilometer-notification-agent"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.404052 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" containerName="ceilometer-notification-agent"
Oct 02 11:15:11 crc kubenswrapper[4766]: E1002 11:15:11.404059 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" containerName="proxy-httpd"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.404064 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" containerName="proxy-httpd"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.404218 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" containerName="proxy-httpd"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.404250 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" containerName="ceilometer-central-agent"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.404259 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" containerName="sg-core"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.404266 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" containerName="ceilometer-notification-agent"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.405872 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.411360 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.411657 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.414646 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.447098 4766 scope.go:117] "RemoveContainer" containerID="fbf3d7476613b9ea2f684cc297139f531d6c7a3b770ed4e9c33f6c9d44f72172"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.476683 4766 scope.go:117] "RemoveContainer" containerID="bd2f7b3a96aa8ee91d830307f72f8b4bd4f41acf88b4ea16eda48c6b3f2872a5"
Oct 02 11:15:11 crc kubenswrapper[4766]: E1002 11:15:11.480616 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd2f7b3a96aa8ee91d830307f72f8b4bd4f41acf88b4ea16eda48c6b3f2872a5\": container with ID starting with bd2f7b3a96aa8ee91d830307f72f8b4bd4f41acf88b4ea16eda48c6b3f2872a5 not found: ID does not exist" containerID="bd2f7b3a96aa8ee91d830307f72f8b4bd4f41acf88b4ea16eda48c6b3f2872a5"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.480689 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd2f7b3a96aa8ee91d830307f72f8b4bd4f41acf88b4ea16eda48c6b3f2872a5"} err="failed to get container status \"bd2f7b3a96aa8ee91d830307f72f8b4bd4f41acf88b4ea16eda48c6b3f2872a5\": rpc error: code = NotFound desc = could not find container \"bd2f7b3a96aa8ee91d830307f72f8b4bd4f41acf88b4ea16eda48c6b3f2872a5\": container with ID starting with bd2f7b3a96aa8ee91d830307f72f8b4bd4f41acf88b4ea16eda48c6b3f2872a5 not found: ID does not exist"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.480730 4766 scope.go:117] "RemoveContainer" containerID="533c21f74a053a585cf21d97ee7b64bc41253b81187145237393a28bd0d8db63"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.482760 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df305f8b-2b53-4032-875e-531accfd848e-run-httpd\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.482816 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.482875 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-config-data\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.482902 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvvbv\" (UniqueName: \"kubernetes.io/projected/df305f8b-2b53-4032-875e-531accfd848e-kube-api-access-tvvbv\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.482931 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df305f8b-2b53-4032-875e-531accfd848e-log-httpd\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: E1002 11:15:11.483076 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"533c21f74a053a585cf21d97ee7b64bc41253b81187145237393a28bd0d8db63\": container with ID starting with 533c21f74a053a585cf21d97ee7b64bc41253b81187145237393a28bd0d8db63 not found: ID does not exist" containerID="533c21f74a053a585cf21d97ee7b64bc41253b81187145237393a28bd0d8db63"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.483251 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.483370 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-scripts\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.483789 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533c21f74a053a585cf21d97ee7b64bc41253b81187145237393a28bd0d8db63"} err="failed to get container status \"533c21f74a053a585cf21d97ee7b64bc41253b81187145237393a28bd0d8db63\": rpc error: code = NotFound desc = could not find container \"533c21f74a053a585cf21d97ee7b64bc41253b81187145237393a28bd0d8db63\": container with ID starting with 533c21f74a053a585cf21d97ee7b64bc41253b81187145237393a28bd0d8db63 not found: ID does not exist"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.483835 4766 scope.go:117] "RemoveContainer" containerID="5576c520748d5953e012dcd26c50c4ce01e261b6dd06db08507d51fc9740dc1f"
Oct 02 11:15:11 crc kubenswrapper[4766]: E1002 11:15:11.484321 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5576c520748d5953e012dcd26c50c4ce01e261b6dd06db08507d51fc9740dc1f\": container with ID starting with 5576c520748d5953e012dcd26c50c4ce01e261b6dd06db08507d51fc9740dc1f not found: ID does not exist" containerID="5576c520748d5953e012dcd26c50c4ce01e261b6dd06db08507d51fc9740dc1f"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.484371 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5576c520748d5953e012dcd26c50c4ce01e261b6dd06db08507d51fc9740dc1f"} err="failed to get container status \"5576c520748d5953e012dcd26c50c4ce01e261b6dd06db08507d51fc9740dc1f\": rpc error: code = NotFound desc = could not find container \"5576c520748d5953e012dcd26c50c4ce01e261b6dd06db08507d51fc9740dc1f\": container with ID starting with 5576c520748d5953e012dcd26c50c4ce01e261b6dd06db08507d51fc9740dc1f not found: ID does not exist"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.484408 4766 scope.go:117] "RemoveContainer" containerID="fbf3d7476613b9ea2f684cc297139f531d6c7a3b770ed4e9c33f6c9d44f72172"
Oct 02 11:15:11 crc kubenswrapper[4766]: E1002 11:15:11.484673 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbf3d7476613b9ea2f684cc297139f531d6c7a3b770ed4e9c33f6c9d44f72172\": container with ID starting with fbf3d7476613b9ea2f684cc297139f531d6c7a3b770ed4e9c33f6c9d44f72172 not found: ID does not exist" containerID="fbf3d7476613b9ea2f684cc297139f531d6c7a3b770ed4e9c33f6c9d44f72172"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.484692 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbf3d7476613b9ea2f684cc297139f531d6c7a3b770ed4e9c33f6c9d44f72172"} err="failed to get container status \"fbf3d7476613b9ea2f684cc297139f531d6c7a3b770ed4e9c33f6c9d44f72172\": rpc error: code = NotFound desc = could not find container \"fbf3d7476613b9ea2f684cc297139f531d6c7a3b770ed4e9c33f6c9d44f72172\": container with ID starting with fbf3d7476613b9ea2f684cc297139f531d6c7a3b770ed4e9c33f6c9d44f72172 not found: ID does not exist"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.519411 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.521944 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.524296 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2ttzh"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.524804 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.529732 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.531730 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.585056 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-99bm5"]
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.586305 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df305f8b-2b53-4032-875e-531accfd848e-run-httpd\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.586358 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.586425 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-config-data\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.586459 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvvbv\" (UniqueName: \"kubernetes.io/projected/df305f8b-2b53-4032-875e-531accfd848e-kube-api-access-tvvbv\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.586491 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df305f8b-2b53-4032-875e-531accfd848e-log-httpd\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.586628 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.586664 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-scripts\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.589313 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df305f8b-2b53-4032-875e-531accfd848e-run-httpd\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.589890 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df305f8b-2b53-4032-875e-531accfd848e-log-httpd\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.595813 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-scripts\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.596392 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-config-data\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.596805 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.597087 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.611320 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvvbv\" (UniqueName: \"kubernetes.io/projected/df305f8b-2b53-4032-875e-531accfd848e-kube-api-access-tvvbv\") pod \"ceilometer-0\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.688830 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32309add-4057-4300-b193-ec17033ace3a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.688921 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfsjl\" (UniqueName: \"kubernetes.io/projected/32309add-4057-4300-b193-ec17033ace3a-kube-api-access-pfsjl\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.689085 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32309add-4057-4300-b193-ec17033ace3a-logs\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.689137 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.689185 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-scripts\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.689219 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-config-data\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.689258 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.741555 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.767290 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.768815 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.771701 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.781381 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.791106 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32309add-4057-4300-b193-ec17033ace3a-logs\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.791155 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.791202 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-scripts\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.791237 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-config-data\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.791273 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.791453 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32309add-4057-4300-b193-ec17033ace3a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.791857 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32309add-4057-4300-b193-ec17033ace3a-logs\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.792167 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfsjl\" (UniqueName: \"kubernetes.io/projected/32309add-4057-4300-b193-ec17033ace3a-kube-api-access-pfsjl\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0"
Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.792350 4766 operation_generator.go:580]
"MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.793684 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32309add-4057-4300-b193-ec17033ace3a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.802673 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.808760 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-config-data\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.815114 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-scripts\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.819262 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfsjl\" (UniqueName: \"kubernetes.io/projected/32309add-4057-4300-b193-ec17033ace3a-kube-api-access-pfsjl\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.853729 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.876461 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.908293 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-scripts\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.908467 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-config-data\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:11 crc 
kubenswrapper[4766]: I1002 11:15:11.908494 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc6zt\" (UniqueName: \"kubernetes.io/projected/19506e05-cbde-4b54-8875-4a6791011bae-kube-api-access-dc6zt\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.908580 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.908814 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19506e05-cbde-4b54-8875-4a6791011bae-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.908936 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.908980 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19506e05-cbde-4b54-8875-4a6791011bae-logs\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:11 crc kubenswrapper[4766]: I1002 11:15:11.926449 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acb47101-638b-42ba-aca0-96a9f81c1443" path="/var/lib/kubelet/pods/acb47101-638b-42ba-aca0-96a9f81c1443/volumes" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.023771 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-scripts\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.024170 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-config-data\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.024191 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc6zt\" (UniqueName: \"kubernetes.io/projected/19506e05-cbde-4b54-8875-4a6791011bae-kube-api-access-dc6zt\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.024213 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.024260 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19506e05-cbde-4b54-8875-4a6791011bae-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.024286 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.024306 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19506e05-cbde-4b54-8875-4a6791011bae-logs\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.025313 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19506e05-cbde-4b54-8875-4a6791011bae-logs\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.026343 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19506e05-cbde-4b54-8875-4a6791011bae-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.026439 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.031327 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-config-data\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.031661 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.033113 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.045915 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc6zt\" (UniqueName: \"kubernetes.io/projected/19506e05-cbde-4b54-8875-4a6791011bae-kube-api-access-dc6zt\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.048209 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.074942 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.147787 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.266293 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.289734 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.317786 4766 generic.go:334] "Generic (PLEG): container finished" podID="de3d498f-4d4b-453a-80ae-bf1456505ba3" containerID="acd8271431ed0cd3230257d6a30239e607a8fd1008099bf28ae24a2917d9dd2b" exitCode=0 Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.317853 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-99bm5" event={"ID":"de3d498f-4d4b-453a-80ae-bf1456505ba3","Type":"ContainerDied","Data":"acd8271431ed0cd3230257d6a30239e607a8fd1008099bf28ae24a2917d9dd2b"} Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.317879 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-99bm5" event={"ID":"de3d498f-4d4b-453a-80ae-bf1456505ba3","Type":"ContainerStarted","Data":"0c5ed28c002f76500f3fcd0862c8d11713e25a3793932cd7d556c9c05da42c6b"} Oct 02 11:15:12 crc kubenswrapper[4766]: I1002 11:15:12.323922 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-777dc6f59c-knpng" podUID="77f4ba15-529f-4e71-a7ea-74848f8bb55e" containerName="dnsmasq-dns" containerID="cri-o://39835df239f044b69e0720445d53580e2d8946352714d801c68ca413d3cbd38c" gracePeriod=10 Oct 02 11:15:13 crc kubenswrapper[4766]: I1002 11:15:13.340144 4766 generic.go:334] "Generic (PLEG): container finished" podID="77f4ba15-529f-4e71-a7ea-74848f8bb55e" containerID="39835df239f044b69e0720445d53580e2d8946352714d801c68ca413d3cbd38c" exitCode=0 Oct 02 11:15:13 crc kubenswrapper[4766]: I1002 11:15:13.340234 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-777dc6f59c-knpng" event={"ID":"77f4ba15-529f-4e71-a7ea-74848f8bb55e","Type":"ContainerDied","Data":"39835df239f044b69e0720445d53580e2d8946352714d801c68ca413d3cbd38c"} Oct 02 11:15:13 crc kubenswrapper[4766]: I1002 11:15:13.687705 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:15:13 crc kubenswrapper[4766]: I1002 
11:15:13.757261 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:15:16 crc kubenswrapper[4766]: I1002 11:15:16.376571 4766 generic.go:334] "Generic (PLEG): container finished" podID="fea98489-bbfa-4490-9e89-40a19bfb594f" containerID="edbb3da01856c2a41dd64eefb64dc0d38f95f5a264b444f1dbcac1d668a8635d" exitCode=0 Oct 02 11:15:16 crc kubenswrapper[4766]: I1002 11:15:16.376679 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xc8gb" event={"ID":"fea98489-bbfa-4490-9e89-40a19bfb594f","Type":"ContainerDied","Data":"edbb3da01856c2a41dd64eefb64dc0d38f95f5a264b444f1dbcac1d668a8635d"} Oct 02 11:15:16 crc kubenswrapper[4766]: I1002 11:15:16.512895 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:16 crc kubenswrapper[4766]: I1002 11:15:16.512945 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:15:16 crc kubenswrapper[4766]: I1002 11:15:16.858456 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-777dc6f59c-knpng" podUID="77f4ba15-529f-4e71-a7ea-74848f8bb55e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.154:5353: connect: connection refused" Oct 02 11:15:17 crc kubenswrapper[4766]: I1002 11:15:17.091437 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 02 11:15:17 crc kubenswrapper[4766]: I1002 11:15:17.139891 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:15:17 crc kubenswrapper[4766]: I1002 11:15:17.385549 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="da3f12a0-0986-43aa-9727-60efa7d4a1f8" containerName="cinder-scheduler" containerID="cri-o://ada1411166ef19e5906600e666b606d7f0baa1825ff3982a02212ccfa8e31381" gracePeriod=30 Oct 02 11:15:17 crc kubenswrapper[4766]: I1002 11:15:17.385549 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="da3f12a0-0986-43aa-9727-60efa7d4a1f8" containerName="probe" containerID="cri-o://052955713ff3cdfa26c7f727ee98ce52ee226feaa68f7142ccdb9ee7508fe5f3" gracePeriod=30 Oct 02 11:15:18 crc kubenswrapper[4766]: I1002 11:15:18.394399 4766 generic.go:334] "Generic (PLEG): container finished" podID="da3f12a0-0986-43aa-9727-60efa7d4a1f8" containerID="052955713ff3cdfa26c7f727ee98ce52ee226feaa68f7142ccdb9ee7508fe5f3" exitCode=0 Oct 02 11:15:18 crc kubenswrapper[4766]: I1002 11:15:18.394744 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da3f12a0-0986-43aa-9727-60efa7d4a1f8","Type":"ContainerDied","Data":"052955713ff3cdfa26c7f727ee98ce52ee226feaa68f7142ccdb9ee7508fe5f3"} Oct 02 11:15:18 crc kubenswrapper[4766]: W1002 11:15:18.458132 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf305f8b_2b53_4032_875e_531accfd848e.slice/crio-4086ea7d66729929e852dcecc41a64ec844392c1174bf398a4d072da1eac1b92 WatchSource:0}: Error finding container 4086ea7d66729929e852dcecc41a64ec844392c1174bf398a4d072da1eac1b92: Status 404 returned error can't find the container with id 4086ea7d66729929e852dcecc41a64ec844392c1174bf398a4d072da1eac1b92 Oct 02 11:15:18 crc kubenswrapper[4766]: I1002 11:15:18.707726 
4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-xc8gb" Oct 02 11:15:18 crc kubenswrapper[4766]: I1002 11:15:18.860326 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea98489-bbfa-4490-9e89-40a19bfb594f-combined-ca-bundle\") pod \"fea98489-bbfa-4490-9e89-40a19bfb594f\" (UID: \"fea98489-bbfa-4490-9e89-40a19bfb594f\") " Oct 02 11:15:18 crc kubenswrapper[4766]: I1002 11:15:18.863991 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkrdr\" (UniqueName: \"kubernetes.io/projected/fea98489-bbfa-4490-9e89-40a19bfb594f-kube-api-access-tkrdr\") pod \"fea98489-bbfa-4490-9e89-40a19bfb594f\" (UID: \"fea98489-bbfa-4490-9e89-40a19bfb594f\") " Oct 02 11:15:18 crc kubenswrapper[4766]: I1002 11:15:18.864053 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fea98489-bbfa-4490-9e89-40a19bfb594f-config\") pod \"fea98489-bbfa-4490-9e89-40a19bfb594f\" (UID: \"fea98489-bbfa-4490-9e89-40a19bfb594f\") " Oct 02 11:15:18 crc kubenswrapper[4766]: I1002 11:15:18.890431 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fea98489-bbfa-4490-9e89-40a19bfb594f-kube-api-access-tkrdr" (OuterVolumeSpecName: "kube-api-access-tkrdr") pod "fea98489-bbfa-4490-9e89-40a19bfb594f" (UID: "fea98489-bbfa-4490-9e89-40a19bfb594f"). InnerVolumeSpecName "kube-api-access-tkrdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:18 crc kubenswrapper[4766]: I1002 11:15:18.906317 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fea98489-bbfa-4490-9e89-40a19bfb594f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fea98489-bbfa-4490-9e89-40a19bfb594f" (UID: "fea98489-bbfa-4490-9e89-40a19bfb594f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:18 crc kubenswrapper[4766]: I1002 11:15:18.922836 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fea98489-bbfa-4490-9e89-40a19bfb594f-config" (OuterVolumeSpecName: "config") pod "fea98489-bbfa-4490-9e89-40a19bfb594f" (UID: "fea98489-bbfa-4490-9e89-40a19bfb594f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:18 crc kubenswrapper[4766]: I1002 11:15:18.970205 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:18 crc kubenswrapper[4766]: I1002 11:15:18.970722 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fea98489-bbfa-4490-9e89-40a19bfb594f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:18 crc kubenswrapper[4766]: I1002 11:15:18.970763 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkrdr\" (UniqueName: \"kubernetes.io/projected/fea98489-bbfa-4490-9e89-40a19bfb594f-kube-api-access-tkrdr\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:18 crc kubenswrapper[4766]: I1002 11:15:18.970773 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fea98489-bbfa-4490-9e89-40a19bfb594f-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.071841 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-ovsdbserver-sb\") pod \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.071972 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5wlj\" (UniqueName: \"kubernetes.io/projected/77f4ba15-529f-4e71-a7ea-74848f8bb55e-kube-api-access-t5wlj\") pod \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.072002 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-config\") pod \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.072060 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-dns-svc\") pod \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.072106 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-ovsdbserver-nb\") pod \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.072207 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-dns-swift-storage-0\") pod \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\" (UID: \"77f4ba15-529f-4e71-a7ea-74848f8bb55e\") " Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.075455 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f4ba15-529f-4e71-a7ea-74848f8bb55e-kube-api-access-t5wlj" (OuterVolumeSpecName: "kube-api-access-t5wlj") pod "77f4ba15-529f-4e71-a7ea-74848f8bb55e" (UID: "77f4ba15-529f-4e71-a7ea-74848f8bb55e"). InnerVolumeSpecName "kube-api-access-t5wlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.122900 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "77f4ba15-529f-4e71-a7ea-74848f8bb55e" (UID: "77f4ba15-529f-4e71-a7ea-74848f8bb55e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.126479 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "77f4ba15-529f-4e71-a7ea-74848f8bb55e" (UID: "77f4ba15-529f-4e71-a7ea-74848f8bb55e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.132644 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "77f4ba15-529f-4e71-a7ea-74848f8bb55e" (UID: "77f4ba15-529f-4e71-a7ea-74848f8bb55e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.137237 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "77f4ba15-529f-4e71-a7ea-74848f8bb55e" (UID: "77f4ba15-529f-4e71-a7ea-74848f8bb55e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.138470 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-config" (OuterVolumeSpecName: "config") pod "77f4ba15-529f-4e71-a7ea-74848f8bb55e" (UID: "77f4ba15-529f-4e71-a7ea-74848f8bb55e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.175840 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.175894 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5wlj\" (UniqueName: \"kubernetes.io/projected/77f4ba15-529f-4e71-a7ea-74848f8bb55e-kube-api-access-t5wlj\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.175913 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.175925 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.175935 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.175946 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77f4ba15-529f-4e71-a7ea-74848f8bb55e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.191713 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.407312 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"19506e05-cbde-4b54-8875-4a6791011bae","Type":"ContainerStarted","Data":"89269ce619857e6ab8b58e2e912011936221e0509a8201646e277164538de5d1"} Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.409000 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df305f8b-2b53-4032-875e-531accfd848e","Type":"ContainerStarted","Data":"4086ea7d66729929e852dcecc41a64ec844392c1174bf398a4d072da1eac1b92"} Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.414012 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-777dc6f59c-knpng" event={"ID":"77f4ba15-529f-4e71-a7ea-74848f8bb55e","Type":"ContainerDied","Data":"49ba44bdbcc716377a5922d2c00211ef23b87e7caf90ae4c736b09db7a555903"} Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.414072 4766 scope.go:117] "RemoveContainer" containerID="39835df239f044b69e0720445d53580e2d8946352714d801c68ca413d3cbd38c" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.414204 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-777dc6f59c-knpng" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.423457 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-xc8gb" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.424637 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-xc8gb" event={"ID":"fea98489-bbfa-4490-9e89-40a19bfb594f","Type":"ContainerDied","Data":"dd66c1dd9497b9bd79d7b172ac20ef09890d52077ca50fd87192e43a1a24124c"} Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.424704 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd66c1dd9497b9bd79d7b172ac20ef09890d52077ca50fd87192e43a1a24124c" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.514514 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-777dc6f59c-knpng"] Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.516899 4766 scope.go:117] "RemoveContainer" containerID="94582f2be827091009f5a7b6293e09610ca21ff4e1deac7163cfafc66e363ae3" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.529375 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-777dc6f59c-knpng"] Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.595062 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.941370 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77f4ba15-529f-4e71-a7ea-74848f8bb55e" path="/var/lib/kubelet/pods/77f4ba15-529f-4e71-a7ea-74848f8bb55e/volumes" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.947412 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-99bm5"] Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.964573 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-ljnft"] Oct 02 11:15:19 crc kubenswrapper[4766]: E1002 11:15:19.964987 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77f4ba15-529f-4e71-a7ea-74848f8bb55e" containerName="dnsmasq-dns" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.965009 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="77f4ba15-529f-4e71-a7ea-74848f8bb55e" containerName="dnsmasq-dns" Oct 02 11:15:19 crc kubenswrapper[4766]: E1002 11:15:19.965028 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77f4ba15-529f-4e71-a7ea-74848f8bb55e" containerName="init" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.965034 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="77f4ba15-529f-4e71-a7ea-74848f8bb55e" containerName="init" Oct 02 11:15:19 crc kubenswrapper[4766]: E1002 11:15:19.965045 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fea98489-bbfa-4490-9e89-40a19bfb594f" containerName="neutron-db-sync" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.965050 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fea98489-bbfa-4490-9e89-40a19bfb594f" containerName="neutron-db-sync" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.965236 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="77f4ba15-529f-4e71-a7ea-74848f8bb55e" containerName="dnsmasq-dns" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.965261 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="fea98489-bbfa-4490-9e89-40a19bfb594f" containerName="neutron-db-sync" Oct 02 11:15:19 crc kubenswrapper[4766]: I1002 11:15:19.966382 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.003311 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5d5d689cbb-b8wmb"] Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.005686 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.020054 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.020248 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.020349 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-25rq9" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.020560 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.021636 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-ljnft"] Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.033475 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d5d689cbb-b8wmb"] Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.080818 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.093486 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-combined-ca-bundle\") pod \"neutron-5d5d689cbb-b8wmb\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") " pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.093559 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snmnf\" (UniqueName: \"kubernetes.io/projected/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-kube-api-access-snmnf\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.093609 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-httpd-config\") pod \"neutron-5d5d689cbb-b8wmb\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") " pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.093635 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-config\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.093806 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-ovndb-tls-certs\") pod \"neutron-5d5d689cbb-b8wmb\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") " 
pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.093838 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.093867 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-config\") pod \"neutron-5d5d689cbb-b8wmb\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") " pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: W1002 11:15:20.093879 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32309add_4057_4300_b193_ec17033ace3a.slice/crio-0150f7369523d2d4d8ead809221c2f97ca984b426d8ba7a71ff0f7abc238282a WatchSource:0}: Error finding container 0150f7369523d2d4d8ead809221c2f97ca984b426d8ba7a71ff0f7abc238282a: Status 404 returned error can't find the container with id 0150f7369523d2d4d8ead809221c2f97ca984b426d8ba7a71ff0f7abc238282a Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.093934 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.093965 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47p5g\" (UniqueName: \"kubernetes.io/projected/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-kube-api-access-47p5g\") pod \"neutron-5d5d689cbb-b8wmb\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") " pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.094130 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-dns-svc\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.094186 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.196008 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-config\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.196308 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-ovndb-tls-certs\") pod \"neutron-5d5d689cbb-b8wmb\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") " pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.196327 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.196345 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-config\") pod \"neutron-5d5d689cbb-b8wmb\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") " pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.196380 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.196515 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47p5g\" (UniqueName: \"kubernetes.io/projected/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-kube-api-access-47p5g\") pod \"neutron-5d5d689cbb-b8wmb\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") " pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.196573 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-dns-svc\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.196621 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.196648 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-combined-ca-bundle\") pod \"neutron-5d5d689cbb-b8wmb\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") " pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.196671 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snmnf\" (UniqueName: \"kubernetes.io/projected/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-kube-api-access-snmnf\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.196717 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-httpd-config\") pod 
\"neutron-5d5d689cbb-b8wmb\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") " pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.198017 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.198644 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-config\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.199083 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.199756 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-dns-svc\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.200418 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.204723 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-config\") pod \"neutron-5d5d689cbb-b8wmb\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") " pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.205360 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-ovndb-tls-certs\") pod \"neutron-5d5d689cbb-b8wmb\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") " pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.205437 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-httpd-config\") pod \"neutron-5d5d689cbb-b8wmb\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") " pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.207073 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-combined-ca-bundle\") pod \"neutron-5d5d689cbb-b8wmb\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") " pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.215467 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47p5g\" (UniqueName: \"kubernetes.io/projected/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-kube-api-access-47p5g\") pod \"neutron-5d5d689cbb-b8wmb\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") " pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.218692 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snmnf\" (UniqueName: \"kubernetes.io/projected/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-kube-api-access-snmnf\") pod \"dnsmasq-dns-5784cf869f-ljnft\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.321642 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.347003 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.442517 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-99bm5" event={"ID":"de3d498f-4d4b-453a-80ae-bf1456505ba3","Type":"ContainerStarted","Data":"2259de9d8ca58008098b26cc61e357e3f7d5f3b09133b0ffed15372297e474f1"} Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.443635 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69c986f6d7-99bm5" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.452107 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32309add-4057-4300-b193-ec17033ace3a","Type":"ContainerStarted","Data":"0150f7369523d2d4d8ead809221c2f97ca984b426d8ba7a71ff0f7abc238282a"} Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.454776 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"19506e05-cbde-4b54-8875-4a6791011bae","Type":"ContainerStarted","Data":"52ffef158eecc7d362de49919fd86cf2eda24f5dbd0c6f16cf9f9d6c54857b84"} Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.456983 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df305f8b-2b53-4032-875e-531accfd848e","Type":"ContainerStarted","Data":"0aeb5be1e525e3f40e9ec10c464314f0ff3c45d280b7069dac2edf94d312453d"} Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.461696 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"d896308d-0b8a-4cfc-ad92-311521c2e417","Type":"ContainerStarted","Data":"3f59b786bb34a976dbad318bdbfed8db8ed47518b0fa788de2b6cb253622cf0d"} Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.469412 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69c986f6d7-99bm5" podStartSLOduration=10.469394418 podStartE2EDuration="10.469394418s" podCreationTimestamp="2025-10-02 11:15:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:20.464863163 +0000 UTC m=+1435.407734107" watchObservedRunningTime="2025-10-02 11:15:20.469394418 +0000 UTC m=+1435.412265362" Oct 02 11:15:20 crc kubenswrapper[4766]: I1002 11:15:20.492054 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" 
podStartSLOduration=2.516009591 podStartE2EDuration="18.492033772s" podCreationTimestamp="2025-10-02 11:15:02 +0000 UTC" firstStartedPulling="2025-10-02 11:15:03.627596457 +0000 UTC m=+1418.570467401" lastFinishedPulling="2025-10-02 11:15:19.603620638 +0000 UTC m=+1434.546491582" observedRunningTime="2025-10-02 11:15:20.480744331 +0000 UTC m=+1435.423615275" watchObservedRunningTime="2025-10-02 11:15:20.492033772 +0000 UTC m=+1435.434904716" Oct 02 11:15:21 crc kubenswrapper[4766]: W1002 11:15:21.006611 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4f37fd8_7e80_4c40_a7d2_7e7d78e60cb2.slice/crio-b8afa8f0e538fbf83f36c50f1f32969785af0f308a07b7113c9159f9e90f60b5 WatchSource:0}: Error finding container b8afa8f0e538fbf83f36c50f1f32969785af0f308a07b7113c9159f9e90f60b5: Status 404 returned error can't find the container with id b8afa8f0e538fbf83f36c50f1f32969785af0f308a07b7113c9159f9e90f60b5 Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.014775 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-ljnft"] Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.072709 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5d5d689cbb-b8wmb"] Oct 02 11:15:21 crc kubenswrapper[4766]: W1002 11:15:21.080680 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ab8b4f9_0502_4896_aee1_f5f96bd8d1cf.slice/crio-2b059259836e9be9e71ca1179c6a25d181ebf547605caab27caadf6723718f11 WatchSource:0}: Error finding container 2b059259836e9be9e71ca1179c6a25d181ebf547605caab27caadf6723718f11: Status 404 returned error can't find the container with id 2b059259836e9be9e71ca1179c6a25d181ebf547605caab27caadf6723718f11 Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.256587 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-85jtl"] Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.260420 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-85jtl" Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.276200 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-85jtl"] Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.334217 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnscn\" (UniqueName: \"kubernetes.io/projected/1a0d201d-b9b8-49e0-b51a-9e187d4b4441-kube-api-access-mnscn\") pod \"nova-api-db-create-85jtl\" (UID: \"1a0d201d-b9b8-49e0-b51a-9e187d4b4441\") " pod="openstack/nova-api-db-create-85jtl" Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.359579 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-lvlsh"] Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.360973 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-lvlsh" Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.405647 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-lvlsh"] Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.442456 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzshb\" (UniqueName: \"kubernetes.io/projected/3061df7e-4dd6-4340-88af-67b6d9b3a6b7-kube-api-access-bzshb\") pod \"nova-cell0-db-create-lvlsh\" (UID: \"3061df7e-4dd6-4340-88af-67b6d9b3a6b7\") " pod="openstack/nova-cell0-db-create-lvlsh" Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.443601 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnscn\" (UniqueName: \"kubernetes.io/projected/1a0d201d-b9b8-49e0-b51a-9e187d4b4441-kube-api-access-mnscn\") pod \"nova-api-db-create-85jtl\" (UID: \"1a0d201d-b9b8-49e0-b51a-9e187d4b4441\") " pod="openstack/nova-api-db-create-85jtl" Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.461569 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnscn\" (UniqueName: \"kubernetes.io/projected/1a0d201d-b9b8-49e0-b51a-9e187d4b4441-kube-api-access-mnscn\") pod \"nova-api-db-create-85jtl\" (UID: \"1a0d201d-b9b8-49e0-b51a-9e187d4b4441\") " pod="openstack/nova-api-db-create-85jtl" Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.525563 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-qhzpn"] Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.527028 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qhzpn" Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.532812 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32309add-4057-4300-b193-ec17033ace3a","Type":"ContainerStarted","Data":"8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449"} Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.546552 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qhzpn"] Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.547731 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzshb\" (UniqueName: \"kubernetes.io/projected/3061df7e-4dd6-4340-88af-67b6d9b3a6b7-kube-api-access-bzshb\") pod \"nova-cell0-db-create-lvlsh\" (UID: \"3061df7e-4dd6-4340-88af-67b6d9b3a6b7\") " pod="openstack/nova-cell0-db-create-lvlsh" Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.569092 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzshb\" (UniqueName: \"kubernetes.io/projected/3061df7e-4dd6-4340-88af-67b6d9b3a6b7-kube-api-access-bzshb\") pod \"nova-cell0-db-create-lvlsh\" (UID: \"3061df7e-4dd6-4340-88af-67b6d9b3a6b7\") " pod="openstack/nova-cell0-db-create-lvlsh" Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.578185 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d5d689cbb-b8wmb" event={"ID":"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf","Type":"ContainerStarted","Data":"a5d6b8038eeebfa5add3d86a4032fc2e6b96dcf152395518f5b6721b67d760e9"} Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.578231 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d5d689cbb-b8wmb" 
event={"ID":"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf","Type":"ContainerStarted","Data":"2b059259836e9be9e71ca1179c6a25d181ebf547605caab27caadf6723718f11"} Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.584546 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"19506e05-cbde-4b54-8875-4a6791011bae","Type":"ContainerStarted","Data":"75c97a315dd71b9a9ba10a8a05af31817b83a2b42d7c9c8ef21d71db710c08e5"} Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.585330 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="19506e05-cbde-4b54-8875-4a6791011bae" containerName="glance-httpd" containerID="cri-o://75c97a315dd71b9a9ba10a8a05af31817b83a2b42d7c9c8ef21d71db710c08e5" gracePeriod=30 Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.585967 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="19506e05-cbde-4b54-8875-4a6791011bae" containerName="glance-log" containerID="cri-o://52ffef158eecc7d362de49919fd86cf2eda24f5dbd0c6f16cf9f9d6c54857b84" gracePeriod=30 Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.614244 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=11.614228136 podStartE2EDuration="11.614228136s" podCreationTimestamp="2025-10-02 11:15:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:21.609601807 +0000 UTC m=+1436.552472761" watchObservedRunningTime="2025-10-02 11:15:21.614228136 +0000 UTC m=+1436.557099080" Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.622808 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df305f8b-2b53-4032-875e-531accfd848e","Type":"ContainerStarted","Data":"689985d395b0444577ec22ae046e000e380b42e62bce27e8d7f73375918a7d5e"} Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.636171 4766 generic.go:334] "Generic (PLEG): container finished" podID="e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2" containerID="ec31f616b968fc330e876b786269e916f1f32e69467c6916e8777e3aeaaeeff2" exitCode=0 Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.637771 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-ljnft" event={"ID":"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2","Type":"ContainerDied","Data":"ec31f616b968fc330e876b786269e916f1f32e69467c6916e8777e3aeaaeeff2"} Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.637811 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-ljnft" event={"ID":"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2","Type":"ContainerStarted","Data":"b8afa8f0e538fbf83f36c50f1f32969785af0f308a07b7113c9159f9e90f60b5"} Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.637959 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69c986f6d7-99bm5" podUID="de3d498f-4d4b-453a-80ae-bf1456505ba3" containerName="dnsmasq-dns" containerID="cri-o://2259de9d8ca58008098b26cc61e357e3f7d5f3b09133b0ffed15372297e474f1" gracePeriod=10 Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.652682 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nv6tn\" (UniqueName: 
\"kubernetes.io/projected/33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9-kube-api-access-nv6tn\") pod \"nova-cell1-db-create-qhzpn\" (UID: \"33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9\") " pod="openstack/nova-cell1-db-create-qhzpn" Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.654787 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-85jtl" Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.698010 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lvlsh" Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.754286 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nv6tn\" (UniqueName: \"kubernetes.io/projected/33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9-kube-api-access-nv6tn\") pod \"nova-cell1-db-create-qhzpn\" (UID: \"33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9\") " pod="openstack/nova-cell1-db-create-qhzpn" Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.782039 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nv6tn\" (UniqueName: \"kubernetes.io/projected/33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9-kube-api-access-nv6tn\") pod \"nova-cell1-db-create-qhzpn\" (UID: \"33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9\") " pod="openstack/nova-cell1-db-create-qhzpn" Oct 02 11:15:21 crc kubenswrapper[4766]: I1002 11:15:21.953806 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qhzpn" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.354186 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.471250 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-scripts\") pod \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.471668 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da3f12a0-0986-43aa-9727-60efa7d4a1f8-etc-machine-id\") pod \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.471756 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-combined-ca-bundle\") pod \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.471857 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxlm9\" (UniqueName: \"kubernetes.io/projected/da3f12a0-0986-43aa-9727-60efa7d4a1f8-kube-api-access-nxlm9\") pod \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.471896 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-config-data\") pod \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 
11:15:22.471949 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-config-data-custom\") pod \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\" (UID: \"da3f12a0-0986-43aa-9727-60efa7d4a1f8\") " Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.472943 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/da3f12a0-0986-43aa-9727-60efa7d4a1f8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "da3f12a0-0986-43aa-9727-60efa7d4a1f8" (UID: "da3f12a0-0986-43aa-9727-60efa7d4a1f8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.482655 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-scripts" (OuterVolumeSpecName: "scripts") pod "da3f12a0-0986-43aa-9727-60efa7d4a1f8" (UID: "da3f12a0-0986-43aa-9727-60efa7d4a1f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.482714 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da3f12a0-0986-43aa-9727-60efa7d4a1f8-kube-api-access-nxlm9" (OuterVolumeSpecName: "kube-api-access-nxlm9") pod "da3f12a0-0986-43aa-9727-60efa7d4a1f8" (UID: "da3f12a0-0986-43aa-9727-60efa7d4a1f8"). InnerVolumeSpecName "kube-api-access-nxlm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.515390 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "da3f12a0-0986-43aa-9727-60efa7d4a1f8" (UID: "da3f12a0-0986-43aa-9727-60efa7d4a1f8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.578746 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxlm9\" (UniqueName: \"kubernetes.io/projected/da3f12a0-0986-43aa-9727-60efa7d4a1f8-kube-api-access-nxlm9\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.578795 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.578804 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.578815 4766 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da3f12a0-0986-43aa-9727-60efa7d4a1f8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.654882 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da3f12a0-0986-43aa-9727-60efa7d4a1f8" (UID: "da3f12a0-0986-43aa-9727-60efa7d4a1f8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.659290 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-config-data" (OuterVolumeSpecName: "config-data") pod "da3f12a0-0986-43aa-9727-60efa7d4a1f8" (UID: "da3f12a0-0986-43aa-9727-60efa7d4a1f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.664454 4766 generic.go:334] "Generic (PLEG): container finished" podID="da3f12a0-0986-43aa-9727-60efa7d4a1f8" containerID="ada1411166ef19e5906600e666b606d7f0baa1825ff3982a02212ccfa8e31381" exitCode=0 Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.664651 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da3f12a0-0986-43aa-9727-60efa7d4a1f8","Type":"ContainerDied","Data":"ada1411166ef19e5906600e666b606d7f0baa1825ff3982a02212ccfa8e31381"} Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.664684 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da3f12a0-0986-43aa-9727-60efa7d4a1f8","Type":"ContainerDied","Data":"f387f81b7cfa01b5813ca8df45f25272fb13c07c3c0a34158357a9a850e15e87"} Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.664701 4766 scope.go:117] "RemoveContainer" containerID="052955713ff3cdfa26c7f727ee98ce52ee226feaa68f7142ccdb9ee7508fe5f3" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.666659 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.680392 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.680673 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da3f12a0-0986-43aa-9727-60efa7d4a1f8-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.693760 4766 generic.go:334] "Generic (PLEG): container finished" podID="19506e05-cbde-4b54-8875-4a6791011bae" containerID="75c97a315dd71b9a9ba10a8a05af31817b83a2b42d7c9c8ef21d71db710c08e5" exitCode=0 Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.693801 4766 generic.go:334] "Generic (PLEG): container finished" podID="19506e05-cbde-4b54-8875-4a6791011bae" containerID="52ffef158eecc7d362de49919fd86cf2eda24f5dbd0c6f16cf9f9d6c54857b84" exitCode=143 Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.693872 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"19506e05-cbde-4b54-8875-4a6791011bae","Type":"ContainerDied","Data":"75c97a315dd71b9a9ba10a8a05af31817b83a2b42d7c9c8ef21d71db710c08e5"} Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.693900 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"19506e05-cbde-4b54-8875-4a6791011bae","Type":"ContainerDied","Data":"52ffef158eecc7d362de49919fd86cf2eda24f5dbd0c6f16cf9f9d6c54857b84"} Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.718026 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"df305f8b-2b53-4032-875e-531accfd848e","Type":"ContainerStarted","Data":"e9a0593064a950c84f269116f2d77bb9eee0ad07be18d76c825c76a66b2f0d61"} Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.721719 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-85jtl"] Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.724653 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-ljnft" event={"ID":"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2","Type":"ContainerStarted","Data":"20b9b7aff7b3a88aadfaa1c77608b6649cd13f567278c0707c1122d97f8a9f6d"} Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.725706 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.727905 4766 generic.go:334] "Generic (PLEG): container finished" podID="de3d498f-4d4b-453a-80ae-bf1456505ba3" containerID="2259de9d8ca58008098b26cc61e357e3f7d5f3b09133b0ffed15372297e474f1" exitCode=0 Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.728203 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-99bm5" event={"ID":"de3d498f-4d4b-453a-80ae-bf1456505ba3","Type":"ContainerDied","Data":"2259de9d8ca58008098b26cc61e357e3f7d5f3b09133b0ffed15372297e474f1"} Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.728238 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c986f6d7-99bm5" event={"ID":"de3d498f-4d4b-453a-80ae-bf1456505ba3","Type":"ContainerDied","Data":"0c5ed28c002f76500f3fcd0862c8d11713e25a3793932cd7d556c9c05da42c6b"} Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.728251 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c5ed28c002f76500f3fcd0862c8d11713e25a3793932cd7d556c9c05da42c6b" Oct 02 11:15:22 crc kubenswrapper[4766]: W1002 11:15:22.729084 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a0d201d_b9b8_49e0_b51a_9e187d4b4441.slice/crio-a18ca8b1c0d65c00d248c511b28ef1be749cd7e45de0489f1709a10fbdce100f WatchSource:0}: Error finding container a18ca8b1c0d65c00d248c511b28ef1be749cd7e45de0489f1709a10fbdce100f: Status 404 returned error can't find the container with id a18ca8b1c0d65c00d248c511b28ef1be749cd7e45de0489f1709a10fbdce100f Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.746153 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32309add-4057-4300-b193-ec17033ace3a","Type":"ContainerStarted","Data":"47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef"} Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.746487 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="32309add-4057-4300-b193-ec17033ace3a" containerName="glance-log" containerID="cri-o://8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449" gracePeriod=30 Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.746733 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="32309add-4057-4300-b193-ec17033ace3a" containerName="glance-httpd" containerID="cri-o://47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef" gracePeriod=30 Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.748720 4766 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-ljnft" podStartSLOduration=3.748058242 podStartE2EDuration="3.748058242s" podCreationTimestamp="2025-10-02 11:15:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:22.745252313 +0000 UTC m=+1437.688123257" watchObservedRunningTime="2025-10-02 11:15:22.748058242 +0000 UTC m=+1437.690929186" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.757611 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d5d689cbb-b8wmb" event={"ID":"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf","Type":"ContainerStarted","Data":"ddf1db2050ce314b5c6aa724ee6afdaca35e577bdbc4f9b2b93cd43a6a850308"} Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.757988 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.765104 4766 scope.go:117] "RemoveContainer" containerID="ada1411166ef19e5906600e666b606d7f0baa1825ff3982a02212ccfa8e31381" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.769763 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-99bm5" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.782586 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-qhzpn"] Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.792429 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.838580 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.862694 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:15:22 crc kubenswrapper[4766]: E1002 11:15:22.863116 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da3f12a0-0986-43aa-9727-60efa7d4a1f8" containerName="cinder-scheduler" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.863130 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3f12a0-0986-43aa-9727-60efa7d4a1f8" containerName="cinder-scheduler" Oct 02 11:15:22 crc kubenswrapper[4766]: E1002 11:15:22.863144 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da3f12a0-0986-43aa-9727-60efa7d4a1f8" containerName="probe" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.863151 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="da3f12a0-0986-43aa-9727-60efa7d4a1f8" containerName="probe" Oct 02 11:15:22 crc kubenswrapper[4766]: E1002 11:15:22.863171 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3d498f-4d4b-453a-80ae-bf1456505ba3" containerName="init" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.863177 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3d498f-4d4b-453a-80ae-bf1456505ba3" containerName="init" Oct 02 11:15:22 crc kubenswrapper[4766]: E1002 11:15:22.863192 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de3d498f-4d4b-453a-80ae-bf1456505ba3" containerName="dnsmasq-dns" Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.863199 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="de3d498f-4d4b-453a-80ae-bf1456505ba3" containerName="dnsmasq-dns" Oct 02 11:15:22 crc 
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.863390 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="da3f12a0-0986-43aa-9727-60efa7d4a1f8" containerName="cinder-scheduler"
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.863414 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="de3d498f-4d4b-453a-80ae-bf1456505ba3" containerName="dnsmasq-dns"
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.864339 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.864353 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5d5d689cbb-b8wmb" podStartSLOduration=3.8643404180000003 podStartE2EDuration="3.864340418s" podCreationTimestamp="2025-10-02 11:15:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:22.810055964 +0000 UTC m=+1437.752926918" watchObservedRunningTime="2025-10-02 11:15:22.864340418 +0000 UTC m=+1437.807211362"
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.871139 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.887707 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-ovsdbserver-sb\") pod \"de3d498f-4d4b-453a-80ae-bf1456505ba3\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") "
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.887928 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-dns-svc\") pod \"de3d498f-4d4b-453a-80ae-bf1456505ba3\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") "
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.888065 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-ovsdbserver-nb\") pod \"de3d498f-4d4b-453a-80ae-bf1456505ba3\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") "
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.888301 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-dns-swift-storage-0\") pod \"de3d498f-4d4b-453a-80ae-bf1456505ba3\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") "
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.888414 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-config\") pod \"de3d498f-4d4b-453a-80ae-bf1456505ba3\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") "
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.888546 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbhxb\" (UniqueName: \"kubernetes.io/projected/de3d498f-4d4b-453a-80ae-bf1456505ba3-kube-api-access-kbhxb\") pod \"de3d498f-4d4b-453a-80ae-bf1456505ba3\" (UID: \"de3d498f-4d4b-453a-80ae-bf1456505ba3\") "
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.926420 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de3d498f-4d4b-453a-80ae-bf1456505ba3-kube-api-access-kbhxb" (OuterVolumeSpecName: "kube-api-access-kbhxb") pod "de3d498f-4d4b-453a-80ae-bf1456505ba3" (UID: "de3d498f-4d4b-453a-80ae-bf1456505ba3"). InnerVolumeSpecName "kube-api-access-kbhxb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.975022 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.982543 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=12.982521375 podStartE2EDuration="12.982521375s" podCreationTimestamp="2025-10-02 11:15:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:22.847086787 +0000 UTC m=+1437.789957731" watchObservedRunningTime="2025-10-02 11:15:22.982521375 +0000 UTC m=+1437.925392319"
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.989005 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "de3d498f-4d4b-453a-80ae-bf1456505ba3" (UID: "de3d498f-4d4b-453a-80ae-bf1456505ba3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.990312 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0"
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.991368 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-config-data\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0"
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.991518 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0"
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.992750 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7vls\" (UniqueName: \"kubernetes.io/projected/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-kube-api-access-n7vls\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0"
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.995667 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0"
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.997587 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-scripts\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0"
Oct 02 11:15:22 crc kubenswrapper[4766]: I1002 11:15:22.999671 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbhxb\" (UniqueName: \"kubernetes.io/projected/de3d498f-4d4b-453a-80ae-bf1456505ba3-kube-api-access-kbhxb\") on node \"crc\" DevicePath \"\""
Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.014203 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.039111 4766 scope.go:117] "RemoveContainer" containerID="052955713ff3cdfa26c7f727ee98ce52ee226feaa68f7142ccdb9ee7508fe5f3"
Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.041816 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-lvlsh"]
Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.044775 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "de3d498f-4d4b-453a-80ae-bf1456505ba3" (UID: "de3d498f-4d4b-453a-80ae-bf1456505ba3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: E1002 11:15:23.053979 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"052955713ff3cdfa26c7f727ee98ce52ee226feaa68f7142ccdb9ee7508fe5f3\": container with ID starting with 052955713ff3cdfa26c7f727ee98ce52ee226feaa68f7142ccdb9ee7508fe5f3 not found: ID does not exist" containerID="052955713ff3cdfa26c7f727ee98ce52ee226feaa68f7142ccdb9ee7508fe5f3" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.054028 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"052955713ff3cdfa26c7f727ee98ce52ee226feaa68f7142ccdb9ee7508fe5f3"} err="failed to get container status \"052955713ff3cdfa26c7f727ee98ce52ee226feaa68f7142ccdb9ee7508fe5f3\": rpc error: code = NotFound desc = could not find container \"052955713ff3cdfa26c7f727ee98ce52ee226feaa68f7142ccdb9ee7508fe5f3\": container with ID starting with 052955713ff3cdfa26c7f727ee98ce52ee226feaa68f7142ccdb9ee7508fe5f3 not found: ID does not exist" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.054058 4766 scope.go:117] "RemoveContainer" containerID="ada1411166ef19e5906600e666b606d7f0baa1825ff3982a02212ccfa8e31381" Oct 02 11:15:23 crc kubenswrapper[4766]: E1002 11:15:23.058270 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ada1411166ef19e5906600e666b606d7f0baa1825ff3982a02212ccfa8e31381\": container with ID starting with ada1411166ef19e5906600e666b606d7f0baa1825ff3982a02212ccfa8e31381 not found: ID does not exist" containerID="ada1411166ef19e5906600e666b606d7f0baa1825ff3982a02212ccfa8e31381" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.058337 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ada1411166ef19e5906600e666b606d7f0baa1825ff3982a02212ccfa8e31381"} err="failed to get container status \"ada1411166ef19e5906600e666b606d7f0baa1825ff3982a02212ccfa8e31381\": rpc error: code = NotFound desc = could not find container \"ada1411166ef19e5906600e666b606d7f0baa1825ff3982a02212ccfa8e31381\": container with ID starting with ada1411166ef19e5906600e666b606d7f0baa1825ff3982a02212ccfa8e31381 not found: ID does not exist" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.089622 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-config" (OuterVolumeSpecName: "config") pod "de3d498f-4d4b-453a-80ae-bf1456505ba3" (UID: "de3d498f-4d4b-453a-80ae-bf1456505ba3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.108234 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "de3d498f-4d4b-453a-80ae-bf1456505ba3" (UID: "de3d498f-4d4b-453a-80ae-bf1456505ba3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.108610 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "de3d498f-4d4b-453a-80ae-bf1456505ba3" (UID: "de3d498f-4d4b-453a-80ae-bf1456505ba3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.120643 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-scripts\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.120710 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.120736 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-config-data\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.120763 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.120835 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7vls\" (UniqueName: \"kubernetes.io/projected/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-kube-api-access-n7vls\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.120857 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.120942 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.120962 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.120972 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: 
I1002 11:15:23.120981 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de3d498f-4d4b-453a-80ae-bf1456505ba3-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.122124 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.126919 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-config-data\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.127640 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.128727 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.143035 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-scripts\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.152423 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7vls\" (UniqueName: \"kubernetes.io/projected/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-kube-api-access-n7vls\") pod \"cinder-scheduler-0\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.165616 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.238531 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.334706 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-combined-ca-bundle\") pod \"19506e05-cbde-4b54-8875-4a6791011bae\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.334811 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-config-data\") pod \"19506e05-cbde-4b54-8875-4a6791011bae\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.334838 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"19506e05-cbde-4b54-8875-4a6791011bae\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.334891 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc6zt\" (UniqueName: \"kubernetes.io/projected/19506e05-cbde-4b54-8875-4a6791011bae-kube-api-access-dc6zt\") pod \"19506e05-cbde-4b54-8875-4a6791011bae\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.334992 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-scripts\") pod \"19506e05-cbde-4b54-8875-4a6791011bae\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.335021 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19506e05-cbde-4b54-8875-4a6791011bae-httpd-run\") pod \"19506e05-cbde-4b54-8875-4a6791011bae\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.335101 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19506e05-cbde-4b54-8875-4a6791011bae-logs\") pod \"19506e05-cbde-4b54-8875-4a6791011bae\" (UID: \"19506e05-cbde-4b54-8875-4a6791011bae\") " Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.336249 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19506e05-cbde-4b54-8875-4a6791011bae-logs" (OuterVolumeSpecName: "logs") pod "19506e05-cbde-4b54-8875-4a6791011bae" (UID: "19506e05-cbde-4b54-8875-4a6791011bae"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.340322 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19506e05-cbde-4b54-8875-4a6791011bae-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "19506e05-cbde-4b54-8875-4a6791011bae" (UID: "19506e05-cbde-4b54-8875-4a6791011bae"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.346597 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19506e05-cbde-4b54-8875-4a6791011bae-kube-api-access-dc6zt" (OuterVolumeSpecName: "kube-api-access-dc6zt") pod "19506e05-cbde-4b54-8875-4a6791011bae" (UID: "19506e05-cbde-4b54-8875-4a6791011bae"). InnerVolumeSpecName "kube-api-access-dc6zt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.349089 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-scripts" (OuterVolumeSpecName: "scripts") pod "19506e05-cbde-4b54-8875-4a6791011bae" (UID: "19506e05-cbde-4b54-8875-4a6791011bae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.366837 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "19506e05-cbde-4b54-8875-4a6791011bae" (UID: "19506e05-cbde-4b54-8875-4a6791011bae"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.376934 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19506e05-cbde-4b54-8875-4a6791011bae" (UID: "19506e05-cbde-4b54-8875-4a6791011bae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.434276 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-config-data" (OuterVolumeSpecName: "config-data") pod "19506e05-cbde-4b54-8875-4a6791011bae" (UID: "19506e05-cbde-4b54-8875-4a6791011bae"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.437571 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/19506e05-cbde-4b54-8875-4a6791011bae-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.437598 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.437614 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.437643 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.437653 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc6zt\" (UniqueName: \"kubernetes.io/projected/19506e05-cbde-4b54-8875-4a6791011bae-kube-api-access-dc6zt\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.437664 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19506e05-cbde-4b54-8875-4a6791011bae-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.437673 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/19506e05-cbde-4b54-8875-4a6791011bae-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.518229 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.539356 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.645821 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.747175 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32309add-4057-4300-b193-ec17033ace3a-logs\") pod \"32309add-4057-4300-b193-ec17033ace3a\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.747244 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32309add-4057-4300-b193-ec17033ace3a-httpd-run\") pod \"32309add-4057-4300-b193-ec17033ace3a\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.747325 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-combined-ca-bundle\") pod \"32309add-4057-4300-b193-ec17033ace3a\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.747367 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-config-data\") pod \"32309add-4057-4300-b193-ec17033ace3a\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.747458 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfsjl\" (UniqueName: \"kubernetes.io/projected/32309add-4057-4300-b193-ec17033ace3a-kube-api-access-pfsjl\") pod \"32309add-4057-4300-b193-ec17033ace3a\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.747537 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"32309add-4057-4300-b193-ec17033ace3a\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.747608 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-scripts\") pod \"32309add-4057-4300-b193-ec17033ace3a\" (UID: \"32309add-4057-4300-b193-ec17033ace3a\") " Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.761871 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32309add-4057-4300-b193-ec17033ace3a-logs" (OuterVolumeSpecName: "logs") pod "32309add-4057-4300-b193-ec17033ace3a" (UID: "32309add-4057-4300-b193-ec17033ace3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.762123 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32309add-4057-4300-b193-ec17033ace3a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "32309add-4057-4300-b193-ec17033ace3a" (UID: "32309add-4057-4300-b193-ec17033ace3a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.775685 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32309add-4057-4300-b193-ec17033ace3a-kube-api-access-pfsjl" (OuterVolumeSpecName: "kube-api-access-pfsjl") pod "32309add-4057-4300-b193-ec17033ace3a" (UID: "32309add-4057-4300-b193-ec17033ace3a"). InnerVolumeSpecName "kube-api-access-pfsjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.780684 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-scripts" (OuterVolumeSpecName: "scripts") pod "32309add-4057-4300-b193-ec17033ace3a" (UID: "32309add-4057-4300-b193-ec17033ace3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.783019 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "32309add-4057-4300-b193-ec17033ace3a" (UID: "32309add-4057-4300-b193-ec17033ace3a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.836668 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32309add-4057-4300-b193-ec17033ace3a" (UID: "32309add-4057-4300-b193-ec17033ace3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.837021 4766 generic.go:334] "Generic (PLEG): container finished" podID="1a0d201d-b9b8-49e0-b51a-9e187d4b4441" containerID="bad133e51d3973fb304b20194be3bf5efca1233e15ed4a89282b8c538aa01f90" exitCode=0 Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.837355 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-85jtl" event={"ID":"1a0d201d-b9b8-49e0-b51a-9e187d4b4441","Type":"ContainerDied","Data":"bad133e51d3973fb304b20194be3bf5efca1233e15ed4a89282b8c538aa01f90"} Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.837419 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-85jtl" event={"ID":"1a0d201d-b9b8-49e0-b51a-9e187d4b4441","Type":"ContainerStarted","Data":"a18ca8b1c0d65c00d248c511b28ef1be749cd7e45de0489f1709a10fbdce100f"} Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.849771 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfsjl\" (UniqueName: \"kubernetes.io/projected/32309add-4057-4300-b193-ec17033ace3a-kube-api-access-pfsjl\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.849814 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.849823 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.849832 4766 reconciler_common.go:293] 
"Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32309add-4057-4300-b193-ec17033ace3a-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.849840 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32309add-4057-4300-b193-ec17033ace3a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.849848 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.860783 4766 generic.go:334] "Generic (PLEG): container finished" podID="32309add-4057-4300-b193-ec17033ace3a" containerID="47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef" exitCode=143 Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.860828 4766 generic.go:334] "Generic (PLEG): container finished" podID="32309add-4057-4300-b193-ec17033ace3a" containerID="8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449" exitCode=143 Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.860905 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32309add-4057-4300-b193-ec17033ace3a","Type":"ContainerDied","Data":"47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef"} Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.860939 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32309add-4057-4300-b193-ec17033ace3a","Type":"ContainerDied","Data":"8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449"} Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.860952 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32309add-4057-4300-b193-ec17033ace3a","Type":"ContainerDied","Data":"0150f7369523d2d4d8ead809221c2f97ca984b426d8ba7a71ff0f7abc238282a"} Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.860969 4766 scope.go:117] "RemoveContainer" containerID="47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.861135 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.876355 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.910388 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da3f12a0-0986-43aa-9727-60efa7d4a1f8" path="/var/lib/kubelet/pods/da3f12a0-0986-43aa-9727-60efa7d4a1f8/volumes" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.926117 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-config-data" (OuterVolumeSpecName: "config-data") pod "32309add-4057-4300-b193-ec17033ace3a" (UID: "32309add-4057-4300-b193-ec17033ace3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.928167 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.932568 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c986f6d7-99bm5" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.944724 4766 scope.go:117] "RemoveContainer" containerID="8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.953203 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32309add-4057-4300-b193-ec17033ace3a-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.953244 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.959797 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-lvlsh" podStartSLOduration=2.9597746579999997 podStartE2EDuration="2.959774658s" podCreationTimestamp="2025-10-02 11:15:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:23.919280423 +0000 UTC m=+1438.862151367" watchObservedRunningTime="2025-10-02 11:15:23.959774658 +0000 UTC m=+1438.902645602" Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.996029 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lvlsh" event={"ID":"3061df7e-4dd6-4340-88af-67b6d9b3a6b7","Type":"ContainerStarted","Data":"ea0c10f7b96fd5133417a93349e2d33afb1ade7b529354f115840f9c68087314"} Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.996096 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lvlsh" event={"ID":"3061df7e-4dd6-4340-88af-67b6d9b3a6b7","Type":"ContainerStarted","Data":"72cf3f2732d453e28515d8b980cc429fec4a5b67c1c83731e2ba1747f55de347"} Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.996112 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"19506e05-cbde-4b54-8875-4a6791011bae","Type":"ContainerDied","Data":"89269ce619857e6ab8b58e2e912011936221e0509a8201646e277164538de5d1"} Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.996132 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qhzpn" event={"ID":"33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9","Type":"ContainerStarted","Data":"c9d72b05ecbb9e359c79bf03ad7d011f837f63b6ad4298ee304c7cc1c4031928"} Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.996177 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qhzpn" event={"ID":"33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9","Type":"ContainerStarted","Data":"c461271b00c2ca2d0634ee5946ae8700435f62bae47adef20bfdcc8420c05865"} Oct 02 11:15:23 crc kubenswrapper[4766]: I1002 11:15:23.996192 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.058168 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-99bm5"] Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.061755 4766 scope.go:117] "RemoveContainer" 
containerID="47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef" Oct 02 11:15:24 crc kubenswrapper[4766]: E1002 11:15:24.063375 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef\": container with ID starting with 47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef not found: ID does not exist" containerID="47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.063411 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef"} err="failed to get container status \"47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef\": rpc error: code = NotFound desc = could not find container \"47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef\": container with ID starting with 47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef not found: ID does not exist" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.063433 4766 scope.go:117] "RemoveContainer" containerID="8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449" Oct 02 11:15:24 crc kubenswrapper[4766]: E1002 11:15:24.069930 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449\": container with ID starting with 8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449 not found: ID does not exist" containerID="8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.069982 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449"} err="failed to get container status \"8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449\": rpc error: code = NotFound desc = could not find container \"8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449\": container with ID starting with 8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449 not found: ID does not exist" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.070015 4766 scope.go:117] "RemoveContainer" containerID="47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.075169 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef"} err="failed to get container status \"47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef\": rpc error: code = NotFound desc = could not find container \"47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef\": container with ID starting with 47d2e5b141e6f31a323a9516a71f91e427b079d419ada71eee79e6bbe057b7ef not found: ID does not exist" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.075258 4766 scope.go:117] "RemoveContainer" containerID="8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.082656 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449"} err="failed to get container status \"8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449\": rpc error: code = NotFound desc = could not find container \"8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449\": container with ID starting with 8fb5bb9f2a15530bbd0a9c53eef8a17c762be6eb488bdaba52837093edab5449 not found: ID does not exist" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.082708 4766 scope.go:117] "RemoveContainer" containerID="75c97a315dd71b9a9ba10a8a05af31817b83a2b42d7c9c8ef21d71db710c08e5" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.087393 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69c986f6d7-99bm5"] Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.133632 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.137768 4766 scope.go:117] "RemoveContainer" containerID="52ffef158eecc7d362de49919fd86cf2eda24f5dbd0c6f16cf9f9d6c54857b84" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.147228 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.160711 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:15:24 crc kubenswrapper[4766]: E1002 11:15:24.161134 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19506e05-cbde-4b54-8875-4a6791011bae" containerName="glance-httpd" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.161152 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="19506e05-cbde-4b54-8875-4a6791011bae" containerName="glance-httpd" Oct 02 11:15:24 crc kubenswrapper[4766]: E1002 11:15:24.161170 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19506e05-cbde-4b54-8875-4a6791011bae" containerName="glance-log" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.161177 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="19506e05-cbde-4b54-8875-4a6791011bae" containerName="glance-log" Oct 02 11:15:24 crc kubenswrapper[4766]: E1002 11:15:24.161196 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32309add-4057-4300-b193-ec17033ace3a" containerName="glance-log" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.161201 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="32309add-4057-4300-b193-ec17033ace3a" containerName="glance-log" Oct 02 11:15:24 crc kubenswrapper[4766]: E1002 11:15:24.161226 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32309add-4057-4300-b193-ec17033ace3a" containerName="glance-httpd" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.161232 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="32309add-4057-4300-b193-ec17033ace3a" containerName="glance-httpd" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.161415 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="19506e05-cbde-4b54-8875-4a6791011bae" containerName="glance-log" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.161447 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="32309add-4057-4300-b193-ec17033ace3a" containerName="glance-log" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.161457 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="32309add-4057-4300-b193-ec17033ace3a" containerName="glance-httpd" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.161467 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="19506e05-cbde-4b54-8875-4a6791011bae" containerName="glance-httpd" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.162420 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.168056 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.168238 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.172172 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.236732 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.266244 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.277168 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.278975 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.280979 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.281253 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.290961 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.291063 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.291111 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.291188 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.291351 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.291476 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.291542 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5lfs\" (UniqueName: \"kubernetes.io/projected/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-kube-api-access-t5lfs\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.291583 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.304188 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.395109 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dabeeea6-b022-49f4-b3db-3a7d83f29e51-logs\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.395200 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-scripts\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.395248 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.395280 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g58gq\" (UniqueName: \"kubernetes.io/projected/dabeeea6-b022-49f4-b3db-3a7d83f29e51-kube-api-access-g58gq\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.395631 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.397389 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.398224 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5lfs\" (UniqueName: \"kubernetes.io/projected/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-kube-api-access-t5lfs\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.399219 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.399334 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-config-data\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.399486 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.399577 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.399656 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dabeeea6-b022-49f4-b3db-3a7d83f29e51-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.399959 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.399978 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.400063 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.400193 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.400558 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-logs\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.401552 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.401662 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-logs\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.421447 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5lfs\" (UniqueName: \"kubernetes.io/projected/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-kube-api-access-t5lfs\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.422226 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.434524 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.435547 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.448134 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.470232 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.488047 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.503566 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-config-data\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.503708 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.504549 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.504599 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dabeeea6-b022-49f4-b3db-3a7d83f29e51-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.504862 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dabeeea6-b022-49f4-b3db-3a7d83f29e51-logs\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.505169 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-scripts\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.505205 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.505248 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g58gq\" (UniqueName: \"kubernetes.io/projected/dabeeea6-b022-49f4-b3db-3a7d83f29e51-kube-api-access-g58gq\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.507656 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.508882 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dabeeea6-b022-49f4-b3db-3a7d83f29e51-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.508973 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dabeeea6-b022-49f4-b3db-3a7d83f29e51-logs\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.515648 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.520938 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-config-data\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.527809 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.531864 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g58gq\" (UniqueName: \"kubernetes.io/projected/dabeeea6-b022-49f4-b3db-3a7d83f29e51-kube-api-access-g58gq\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.570183 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-scripts\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.585883 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-55f8b9d7c-hfdcr"] Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.633121 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.636900 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.638717 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55f8b9d7c-hfdcr"] Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.639768 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.653244 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.827584 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-combined-ca-bundle\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.827923 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-internal-tls-certs\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.827944 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-httpd-config\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.827975 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-config\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.827994 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9svq9\" (UniqueName: \"kubernetes.io/projected/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-kube-api-access-9svq9\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.828018 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-ovndb-tls-certs\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.828098 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-public-tls-certs\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.902186 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.930151 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-public-tls-certs\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.930300 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-combined-ca-bundle\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.930336 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-internal-tls-certs\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.930361 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-httpd-config\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.930397 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-config\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.930423 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9svq9\" (UniqueName: \"kubernetes.io/projected/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-kube-api-access-9svq9\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.930457 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-ovndb-tls-certs\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.938105 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-ovndb-tls-certs\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.940680 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-config\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.943414 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-internal-tls-certs\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.948535 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-combined-ca-bundle\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.950443 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-httpd-config\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.954246 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9svq9\" (UniqueName: \"kubernetes.io/projected/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-kube-api-access-9svq9\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.961908 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-public-tls-certs\") pod \"neutron-55f8b9d7c-hfdcr\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.973012 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6","Type":"ContainerStarted","Data":"4b127a06d475d3d8b01682ab934e5c8debe86e90c48741e8f524ec9e9717ce67"} Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.978601 4766 generic.go:334] "Generic (PLEG): container finished" podID="3061df7e-4dd6-4340-88af-67b6d9b3a6b7" containerID="ea0c10f7b96fd5133417a93349e2d33afb1ade7b529354f115840f9c68087314" exitCode=0 Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.978699 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lvlsh" event={"ID":"3061df7e-4dd6-4340-88af-67b6d9b3a6b7","Type":"ContainerDied","Data":"ea0c10f7b96fd5133417a93349e2d33afb1ade7b529354f115840f9c68087314"} Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.983341 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"df305f8b-2b53-4032-875e-531accfd848e","Type":"ContainerStarted","Data":"174d21f546fbd57c86e95a7c8a925bf9b67f2f3123647b46a85db7d928818a01"} Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.984253 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.985384 4766 generic.go:334] "Generic (PLEG): container finished" podID="33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9" containerID="c9d72b05ecbb9e359c79bf03ad7d011f837f63b6ad4298ee304c7cc1c4031928" exitCode=0 Oct 02 11:15:24 crc kubenswrapper[4766]: I1002 11:15:24.985853 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qhzpn" event={"ID":"33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9","Type":"ContainerDied","Data":"c9d72b05ecbb9e359c79bf03ad7d011f837f63b6ad4298ee304c7cc1c4031928"} Oct 02 11:15:25 crc kubenswrapper[4766]: I1002 11:15:25.031745 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=8.336883593 podStartE2EDuration="14.031719886s" podCreationTimestamp="2025-10-02 11:15:11 +0000 UTC" firstStartedPulling="2025-10-02 11:15:18.480438582 +0000 UTC m=+1433.423309526" lastFinishedPulling="2025-10-02 11:15:24.175274875 +0000 UTC m=+1439.118145819" observedRunningTime="2025-10-02 11:15:25.019833146 +0000 UTC m=+1439.962704100" watchObservedRunningTime="2025-10-02 11:15:25.031719886 +0000 UTC m=+1439.974590830" Oct 02 11:15:25 crc kubenswrapper[4766]: I1002 11:15:25.036478 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:25 crc kubenswrapper[4766]: I1002 11:15:25.220391 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:15:25 crc kubenswrapper[4766]: W1002 11:15:25.242627 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae9f1734_5cfa_41f0_baaf_0b4344a2a14a.slice/crio-ff52492dca70c0637a46784a4a50c75da0154b86aba0b18922b2301999d13dd7 WatchSource:0}: Error finding container ff52492dca70c0637a46784a4a50c75da0154b86aba0b18922b2301999d13dd7: Status 404 returned error can't find the container with id ff52492dca70c0637a46784a4a50c75da0154b86aba0b18922b2301999d13dd7 Oct 02 11:15:25 crc kubenswrapper[4766]: I1002 11:15:25.328484 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:15:25 crc kubenswrapper[4766]: I1002 11:15:25.702082 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:15:25 crc kubenswrapper[4766]: I1002 11:15:25.835238 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55f8b9d7c-hfdcr"] Oct 02 11:15:25 crc kubenswrapper[4766]: I1002 11:15:25.928780 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19506e05-cbde-4b54-8875-4a6791011bae" path="/var/lib/kubelet/pods/19506e05-cbde-4b54-8875-4a6791011bae/volumes" Oct 02 11:15:25 crc kubenswrapper[4766]: I1002 11:15:25.930233 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32309add-4057-4300-b193-ec17033ace3a" path="/var/lib/kubelet/pods/32309add-4057-4300-b193-ec17033ace3a/volumes" Oct 02 11:15:25 crc kubenswrapper[4766]: I1002 11:15:25.931214 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de3d498f-4d4b-453a-80ae-bf1456505ba3" 
path="/var/lib/kubelet/pods/de3d498f-4d4b-453a-80ae-bf1456505ba3/volumes" Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.008729 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a","Type":"ContainerStarted","Data":"ff52492dca70c0637a46784a4a50c75da0154b86aba0b18922b2301999d13dd7"} Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.020876 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dabeeea6-b022-49f4-b3db-3a7d83f29e51","Type":"ContainerStarted","Data":"82f63c115caf2cc6b96824e1c54618114f650efa4925370a65150d9f5f8eb671"} Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.030382 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55f8b9d7c-hfdcr" event={"ID":"ec8cdac7-81c9-41e7-a956-41d13e5b91a6","Type":"ContainerStarted","Data":"f9cf68c62ab67bd2c2561b1c4ed33ee8e77490994b3ccdb82b73650c412cafd1"} Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.033438 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-qhzpn" event={"ID":"33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9","Type":"ContainerDied","Data":"c461271b00c2ca2d0634ee5946ae8700435f62bae47adef20bfdcc8420c05865"} Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.033468 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c461271b00c2ca2d0634ee5946ae8700435f62bae47adef20bfdcc8420c05865" Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.186542 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-qhzpn" Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.209327 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-85jtl" Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.364400 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nv6tn\" (UniqueName: \"kubernetes.io/projected/33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9-kube-api-access-nv6tn\") pod \"33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9\" (UID: \"33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9\") " Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.364616 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnscn\" (UniqueName: \"kubernetes.io/projected/1a0d201d-b9b8-49e0-b51a-9e187d4b4441-kube-api-access-mnscn\") pod \"1a0d201d-b9b8-49e0-b51a-9e187d4b4441\" (UID: \"1a0d201d-b9b8-49e0-b51a-9e187d4b4441\") " Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.369896 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a0d201d-b9b8-49e0-b51a-9e187d4b4441-kube-api-access-mnscn" (OuterVolumeSpecName: "kube-api-access-mnscn") pod "1a0d201d-b9b8-49e0-b51a-9e187d4b4441" (UID: "1a0d201d-b9b8-49e0-b51a-9e187d4b4441"). InnerVolumeSpecName "kube-api-access-mnscn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.372174 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9-kube-api-access-nv6tn" (OuterVolumeSpecName: "kube-api-access-nv6tn") pod "33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9" (UID: "33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9"). InnerVolumeSpecName "kube-api-access-nv6tn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.467582 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnscn\" (UniqueName: \"kubernetes.io/projected/1a0d201d-b9b8-49e0-b51a-9e187d4b4441-kube-api-access-mnscn\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.467610 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nv6tn\" (UniqueName: \"kubernetes.io/projected/33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9-kube-api-access-nv6tn\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.627206 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lvlsh" Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.771842 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzshb\" (UniqueName: \"kubernetes.io/projected/3061df7e-4dd6-4340-88af-67b6d9b3a6b7-kube-api-access-bzshb\") pod \"3061df7e-4dd6-4340-88af-67b6d9b3a6b7\" (UID: \"3061df7e-4dd6-4340-88af-67b6d9b3a6b7\") " Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.785734 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3061df7e-4dd6-4340-88af-67b6d9b3a6b7-kube-api-access-bzshb" (OuterVolumeSpecName: "kube-api-access-bzshb") pod "3061df7e-4dd6-4340-88af-67b6d9b3a6b7" (UID: "3061df7e-4dd6-4340-88af-67b6d9b3a6b7"). InnerVolumeSpecName "kube-api-access-bzshb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:26 crc kubenswrapper[4766]: I1002 11:15:26.876151 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzshb\" (UniqueName: \"kubernetes.io/projected/3061df7e-4dd6-4340-88af-67b6d9b3a6b7-kube-api-access-bzshb\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:27 crc kubenswrapper[4766]: I1002 11:15:27.002174 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:15:27 crc kubenswrapper[4766]: I1002 11:15:27.118152 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-85jtl" event={"ID":"1a0d201d-b9b8-49e0-b51a-9e187d4b4441","Type":"ContainerDied","Data":"a18ca8b1c0d65c00d248c511b28ef1be749cd7e45de0489f1709a10fbdce100f"} Oct 02 11:15:27 crc kubenswrapper[4766]: I1002 11:15:27.118190 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a18ca8b1c0d65c00d248c511b28ef1be749cd7e45de0489f1709a10fbdce100f" Oct 02 11:15:27 crc kubenswrapper[4766]: I1002 11:15:27.118244 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-85jtl" Oct 02 11:15:27 crc kubenswrapper[4766]: I1002 11:15:27.142323 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6","Type":"ContainerStarted","Data":"e5e102f20a63a8d722e965baa46260978733d5085519f5210483d94b2bb40281"} Oct 02 11:15:27 crc kubenswrapper[4766]: I1002 11:15:27.147902 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a","Type":"ContainerStarted","Data":"025f10aac32ce0fdc683a046fa9324ba50ecb19358a70588497d2de63e2536bf"} Oct 02 11:15:27 crc kubenswrapper[4766]: I1002 11:15:27.152446 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-lvlsh" event={"ID":"3061df7e-4dd6-4340-88af-67b6d9b3a6b7","Type":"ContainerDied","Data":"72cf3f2732d453e28515d8b980cc429fec4a5b67c1c83731e2ba1747f55de347"} Oct 02 11:15:27 crc kubenswrapper[4766]: I1002 11:15:27.152485 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72cf3f2732d453e28515d8b980cc429fec4a5b67c1c83731e2ba1747f55de347" Oct 02 11:15:27 crc kubenswrapper[4766]: I1002 11:15:27.152567 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-lvlsh" Oct 02 11:15:27 crc kubenswrapper[4766]: I1002 11:15:27.165895 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dabeeea6-b022-49f4-b3db-3a7d83f29e51","Type":"ContainerStarted","Data":"96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa"} Oct 02 11:15:27 crc kubenswrapper[4766]: I1002 11:15:27.170063 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-qhzpn" Oct 02 11:15:27 crc kubenswrapper[4766]: I1002 11:15:27.172416 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55f8b9d7c-hfdcr" event={"ID":"ec8cdac7-81c9-41e7-a956-41d13e5b91a6","Type":"ContainerStarted","Data":"f5124828eb5c9f7610af6224039bb25da1c36b7f880181f818d2c8a8eefcf481"} Oct 02 11:15:27 crc kubenswrapper[4766]: I1002 11:15:27.172453 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:27 crc kubenswrapper[4766]: I1002 11:15:27.172464 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55f8b9d7c-hfdcr" event={"ID":"ec8cdac7-81c9-41e7-a956-41d13e5b91a6","Type":"ContainerStarted","Data":"b88eb59f2ed5fce10e26153dd82767f0f8e62d65f3120c04ecdf968f4b718053"} Oct 02 11:15:27 crc kubenswrapper[4766]: I1002 11:15:27.221877 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-55f8b9d7c-hfdcr" podStartSLOduration=3.221857981 podStartE2EDuration="3.221857981s" podCreationTimestamp="2025-10-02 11:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:27.195447307 +0000 UTC m=+1442.138318261" watchObservedRunningTime="2025-10-02 11:15:27.221857981 +0000 UTC m=+1442.164728925" Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.007864 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.180513 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a","Type":"ContainerStarted","Data":"d4bd3925e3ecf25a00db896208d504ed75f864301cbcf99f17fb47c2a0535383"} Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.180664 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" containerName="glance-log" containerID="cri-o://025f10aac32ce0fdc683a046fa9324ba50ecb19358a70588497d2de63e2536bf" gracePeriod=30 Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.181258 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" containerName="glance-httpd" containerID="cri-o://d4bd3925e3ecf25a00db896208d504ed75f864301cbcf99f17fb47c2a0535383" gracePeriod=30 Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.187007 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dabeeea6-b022-49f4-b3db-3a7d83f29e51","Type":"ContainerStarted","Data":"689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4"} Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.187176 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="dabeeea6-b022-49f4-b3db-3a7d83f29e51" containerName="glance-log" containerID="cri-o://96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa" gracePeriod=30 Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.187285 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="dabeeea6-b022-49f4-b3db-3a7d83f29e51" containerName="glance-httpd" 
containerID="cri-o://689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4" gracePeriod=30 Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.190036 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6","Type":"ContainerStarted","Data":"4fa69755c8413dc8a6865886029bf8ec092fa254162c708d148185b7305c545c"} Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.190712 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df305f8b-2b53-4032-875e-531accfd848e" containerName="ceilometer-central-agent" containerID="cri-o://0aeb5be1e525e3f40e9ec10c464314f0ff3c45d280b7069dac2edf94d312453d" gracePeriod=30 Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.190805 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df305f8b-2b53-4032-875e-531accfd848e" containerName="proxy-httpd" containerID="cri-o://174d21f546fbd57c86e95a7c8a925bf9b67f2f3123647b46a85db7d928818a01" gracePeriod=30 Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.190836 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df305f8b-2b53-4032-875e-531accfd848e" containerName="sg-core" containerID="cri-o://e9a0593064a950c84f269116f2d77bb9eee0ad07be18d76c825c76a66b2f0d61" gracePeriod=30 Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.190864 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="df305f8b-2b53-4032-875e-531accfd848e" containerName="ceilometer-notification-agent" containerID="cri-o://689985d395b0444577ec22ae046e000e380b42e62bce27e8d7f73375918a7d5e" gracePeriod=30 Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.207827 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.207809542 podStartE2EDuration="4.207809542s" podCreationTimestamp="2025-10-02 11:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:28.205839418 +0000 UTC m=+1443.148710362" watchObservedRunningTime="2025-10-02 11:15:28.207809542 +0000 UTC m=+1443.150680486" Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.239592 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.248582 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.248560434 podStartE2EDuration="4.248560434s" podCreationTimestamp="2025-10-02 11:15:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:28.240254348 +0000 UTC m=+1443.183125292" watchObservedRunningTime="2025-10-02 11:15:28.248560434 +0000 UTC m=+1443.191431378" Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.265440 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.265421573 podStartE2EDuration="6.265421573s" podCreationTimestamp="2025-10-02 11:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 
11:15:28.261561009 +0000 UTC m=+1443.204431963" watchObservedRunningTime="2025-10-02 11:15:28.265421573 +0000 UTC m=+1443.208292517" Oct 02 11:15:28 crc kubenswrapper[4766]: I1002 11:15:28.972256 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.121778 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-scripts\") pod \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.121933 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g58gq\" (UniqueName: \"kubernetes.io/projected/dabeeea6-b022-49f4-b3db-3a7d83f29e51-kube-api-access-g58gq\") pod \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.122039 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-combined-ca-bundle\") pod \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.122192 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-config-data\") pod \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.122272 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dabeeea6-b022-49f4-b3db-3a7d83f29e51-logs\") pod \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.122436 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-public-tls-certs\") pod \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.122632 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.122697 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dabeeea6-b022-49f4-b3db-3a7d83f29e51-httpd-run\") pod \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\" (UID: \"dabeeea6-b022-49f4-b3db-3a7d83f29e51\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.124157 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dabeeea6-b022-49f4-b3db-3a7d83f29e51-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dabeeea6-b022-49f4-b3db-3a7d83f29e51" (UID: "dabeeea6-b022-49f4-b3db-3a7d83f29e51"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.126321 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dabeeea6-b022-49f4-b3db-3a7d83f29e51-logs" (OuterVolumeSpecName: "logs") pod "dabeeea6-b022-49f4-b3db-3a7d83f29e51" (UID: "dabeeea6-b022-49f4-b3db-3a7d83f29e51"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.131814 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-scripts" (OuterVolumeSpecName: "scripts") pod "dabeeea6-b022-49f4-b3db-3a7d83f29e51" (UID: "dabeeea6-b022-49f4-b3db-3a7d83f29e51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.157430 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dabeeea6-b022-49f4-b3db-3a7d83f29e51-kube-api-access-g58gq" (OuterVolumeSpecName: "kube-api-access-g58gq") pod "dabeeea6-b022-49f4-b3db-3a7d83f29e51" (UID: "dabeeea6-b022-49f4-b3db-3a7d83f29e51"). InnerVolumeSpecName "kube-api-access-g58gq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.158107 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "dabeeea6-b022-49f4-b3db-3a7d83f29e51" (UID: "dabeeea6-b022-49f4-b3db-3a7d83f29e51"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.182472 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dabeeea6-b022-49f4-b3db-3a7d83f29e51" (UID: "dabeeea6-b022-49f4-b3db-3a7d83f29e51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.221717 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dabeeea6-b022-49f4-b3db-3a7d83f29e51" (UID: "dabeeea6-b022-49f4-b3db-3a7d83f29e51"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.227630 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.227660 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dabeeea6-b022-49f4-b3db-3a7d83f29e51-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.227669 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.227689 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.227700 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dabeeea6-b022-49f4-b3db-3a7d83f29e51-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.227708 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.227716 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g58gq\" (UniqueName: \"kubernetes.io/projected/dabeeea6-b022-49f4-b3db-3a7d83f29e51-kube-api-access-g58gq\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.256744 4766 generic.go:334] "Generic (PLEG): container finished" podID="df305f8b-2b53-4032-875e-531accfd848e" containerID="174d21f546fbd57c86e95a7c8a925bf9b67f2f3123647b46a85db7d928818a01" exitCode=0 Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.256781 4766 generic.go:334] "Generic (PLEG): container finished" podID="df305f8b-2b53-4032-875e-531accfd848e" containerID="e9a0593064a950c84f269116f2d77bb9eee0ad07be18d76c825c76a66b2f0d61" exitCode=2 Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.256790 4766 generic.go:334] "Generic (PLEG): container finished" podID="df305f8b-2b53-4032-875e-531accfd848e" containerID="689985d395b0444577ec22ae046e000e380b42e62bce27e8d7f73375918a7d5e" exitCode=0 Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.256800 4766 generic.go:334] "Generic (PLEG): container finished" podID="df305f8b-2b53-4032-875e-531accfd848e" containerID="0aeb5be1e525e3f40e9ec10c464314f0ff3c45d280b7069dac2edf94d312453d" exitCode=0 Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.256859 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df305f8b-2b53-4032-875e-531accfd848e","Type":"ContainerDied","Data":"174d21f546fbd57c86e95a7c8a925bf9b67f2f3123647b46a85db7d928818a01"} Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.256885 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df305f8b-2b53-4032-875e-531accfd848e","Type":"ContainerDied","Data":"e9a0593064a950c84f269116f2d77bb9eee0ad07be18d76c825c76a66b2f0d61"} Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 
11:15:29.256896 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df305f8b-2b53-4032-875e-531accfd848e","Type":"ContainerDied","Data":"689985d395b0444577ec22ae046e000e380b42e62bce27e8d7f73375918a7d5e"} Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.256906 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df305f8b-2b53-4032-875e-531accfd848e","Type":"ContainerDied","Data":"0aeb5be1e525e3f40e9ec10c464314f0ff3c45d280b7069dac2edf94d312453d"} Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.268659 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-config-data" (OuterVolumeSpecName: "config-data") pod "dabeeea6-b022-49f4-b3db-3a7d83f29e51" (UID: "dabeeea6-b022-49f4-b3db-3a7d83f29e51"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.291039 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.296011 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.298000 4766 generic.go:334] "Generic (PLEG): container finished" podID="ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" containerID="d4bd3925e3ecf25a00db896208d504ed75f864301cbcf99f17fb47c2a0535383" exitCode=0 Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.298027 4766 generic.go:334] "Generic (PLEG): container finished" podID="ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" containerID="025f10aac32ce0fdc683a046fa9324ba50ecb19358a70588497d2de63e2536bf" exitCode=143 Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.298086 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a","Type":"ContainerDied","Data":"d4bd3925e3ecf25a00db896208d504ed75f864301cbcf99f17fb47c2a0535383"} Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.298110 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a","Type":"ContainerDied","Data":"025f10aac32ce0fdc683a046fa9324ba50ecb19358a70588497d2de63e2536bf"} Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.324547 4766 generic.go:334] "Generic (PLEG): container finished" podID="dabeeea6-b022-49f4-b3db-3a7d83f29e51" containerID="689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4" exitCode=0 Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.324577 4766 generic.go:334] "Generic (PLEG): container finished" podID="dabeeea6-b022-49f4-b3db-3a7d83f29e51" containerID="96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa" exitCode=143 Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.325598 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.326753 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dabeeea6-b022-49f4-b3db-3a7d83f29e51","Type":"ContainerDied","Data":"689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4"} Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.326829 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dabeeea6-b022-49f4-b3db-3a7d83f29e51","Type":"ContainerDied","Data":"96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa"} Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.326848 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dabeeea6-b022-49f4-b3db-3a7d83f29e51","Type":"ContainerDied","Data":"82f63c115caf2cc6b96824e1c54618114f650efa4925370a65150d9f5f8eb671"} Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.326875 4766 scope.go:117] "RemoveContainer" containerID="689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.328894 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dabeeea6-b022-49f4-b3db-3a7d83f29e51-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.328915 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.352566 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.423485 4766 scope.go:117] "RemoveContainer" containerID="96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.430417 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-sg-core-conf-yaml\") pod \"df305f8b-2b53-4032-875e-531accfd848e\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.430491 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-combined-ca-bundle\") pod \"df305f8b-2b53-4032-875e-531accfd848e\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.430552 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df305f8b-2b53-4032-875e-531accfd848e-log-httpd\") pod \"df305f8b-2b53-4032-875e-531accfd848e\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.430573 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-config-data\") pod \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.430588 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-internal-tls-certs\") pod \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.430609 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-scripts\") pod \"df305f8b-2b53-4032-875e-531accfd848e\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.430626 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvvbv\" (UniqueName: \"kubernetes.io/projected/df305f8b-2b53-4032-875e-531accfd848e-kube-api-access-tvvbv\") pod \"df305f8b-2b53-4032-875e-531accfd848e\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.430658 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5lfs\" (UniqueName: \"kubernetes.io/projected/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-kube-api-access-t5lfs\") pod \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.430685 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df305f8b-2b53-4032-875e-531accfd848e-run-httpd\") pod \"df305f8b-2b53-4032-875e-531accfd848e\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.430708 4766 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-combined-ca-bundle\") pod \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.430728 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-httpd-run\") pod \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.430754 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-scripts\") pod \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.430781 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.430805 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-logs\") pod \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\" (UID: \"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.430866 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-config-data\") pod \"df305f8b-2b53-4032-875e-531accfd848e\" (UID: \"df305f8b-2b53-4032-875e-531accfd848e\") " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.435564 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.437516 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df305f8b-2b53-4032-875e-531accfd848e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "df305f8b-2b53-4032-875e-531accfd848e" (UID: "df305f8b-2b53-4032-875e-531accfd848e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.445623 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.451382 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" (UID: "ae9f1734-5cfa-41f0-baaf-0b4344a2a14a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.453426 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df305f8b-2b53-4032-875e-531accfd848e-kube-api-access-tvvbv" (OuterVolumeSpecName: "kube-api-access-tvvbv") pod "df305f8b-2b53-4032-875e-531accfd848e" (UID: "df305f8b-2b53-4032-875e-531accfd848e"). InnerVolumeSpecName "kube-api-access-tvvbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.458818 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-kube-api-access-t5lfs" (OuterVolumeSpecName: "kube-api-access-t5lfs") pod "ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" (UID: "ae9f1734-5cfa-41f0-baaf-0b4344a2a14a"). InnerVolumeSpecName "kube-api-access-t5lfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.461275 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df305f8b-2b53-4032-875e-531accfd848e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "df305f8b-2b53-4032-875e-531accfd848e" (UID: "df305f8b-2b53-4032-875e-531accfd848e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.463426 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-scripts" (OuterVolumeSpecName: "scripts") pod "df305f8b-2b53-4032-875e-531accfd848e" (UID: "df305f8b-2b53-4032-875e-531accfd848e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.463545 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-logs" (OuterVolumeSpecName: "logs") pod "ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" (UID: "ae9f1734-5cfa-41f0-baaf-0b4344a2a14a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.481231 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-scripts" (OuterVolumeSpecName: "scripts") pod "ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" (UID: "ae9f1734-5cfa-41f0-baaf-0b4344a2a14a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.481325 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" (UID: "ae9f1734-5cfa-41f0-baaf-0b4344a2a14a"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.487154 4766 scope.go:117] "RemoveContainer" containerID="689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.488641 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:15:29 crc kubenswrapper[4766]: E1002 11:15:29.490144 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4\": container with ID starting with 689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4 not found: ID does not exist" containerID="689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.490191 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4"} err="failed to get container status \"689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4\": rpc error: code = NotFound desc = could not find container \"689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4\": container with ID starting with 689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4 not found: ID does not exist" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.490237 4766 scope.go:117] "RemoveContainer" containerID="96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa" Oct 02 11:15:29 crc kubenswrapper[4766]: E1002 11:15:29.494079 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa\": container with ID starting with 96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa not found: ID does not exist" containerID="96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.494124 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa"} err="failed to get container status \"96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa\": rpc error: code = NotFound desc = could not find container \"96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa\": container with ID starting with 96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa not found: ID does not exist" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.494150 4766 scope.go:117] "RemoveContainer" containerID="689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4" Oct 02 11:15:29 crc kubenswrapper[4766]: E1002 11:15:29.495605 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" containerName="glance-httpd" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.495726 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" containerName="glance-httpd" Oct 02 11:15:29 crc kubenswrapper[4766]: E1002 11:15:29.495818 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabeeea6-b022-49f4-b3db-3a7d83f29e51" containerName="glance-log" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.495899 4766 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="dabeeea6-b022-49f4-b3db-3a7d83f29e51" containerName="glance-log" Oct 02 11:15:29 crc kubenswrapper[4766]: E1002 11:15:29.497204 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df305f8b-2b53-4032-875e-531accfd848e" containerName="ceilometer-notification-agent" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.497593 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="df305f8b-2b53-4032-875e-531accfd848e" containerName="ceilometer-notification-agent" Oct 02 11:15:29 crc kubenswrapper[4766]: E1002 11:15:29.497854 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df305f8b-2b53-4032-875e-531accfd848e" containerName="ceilometer-central-agent" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.498359 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="df305f8b-2b53-4032-875e-531accfd848e" containerName="ceilometer-central-agent" Oct 02 11:15:29 crc kubenswrapper[4766]: E1002 11:15:29.498726 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3061df7e-4dd6-4340-88af-67b6d9b3a6b7" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.498955 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3061df7e-4dd6-4340-88af-67b6d9b3a6b7" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4766]: E1002 11:15:29.499210 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a0d201d-b9b8-49e0-b51a-9e187d4b4441" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.499454 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a0d201d-b9b8-49e0-b51a-9e187d4b4441" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4766]: E1002 11:15:29.499709 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" containerName="glance-log" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.500027 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" containerName="glance-log" Oct 02 11:15:29 crc kubenswrapper[4766]: E1002 11:15:29.500201 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dabeeea6-b022-49f4-b3db-3a7d83f29e51" containerName="glance-httpd" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.500482 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dabeeea6-b022-49f4-b3db-3a7d83f29e51" containerName="glance-httpd" Oct 02 11:15:29 crc kubenswrapper[4766]: E1002 11:15:29.500785 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.501515 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4766]: E1002 11:15:29.501630 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df305f8b-2b53-4032-875e-531accfd848e" containerName="sg-core" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.501888 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="df305f8b-2b53-4032-875e-531accfd848e" containerName="sg-core" Oct 02 11:15:29 crc kubenswrapper[4766]: E1002 11:15:29.501979 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df305f8b-2b53-4032-875e-531accfd848e" containerName="proxy-httpd" Oct 02 
11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.502046 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="df305f8b-2b53-4032-875e-531accfd848e" containerName="proxy-httpd" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.502963 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.504797 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="df305f8b-2b53-4032-875e-531accfd848e" containerName="ceilometer-central-agent" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.505849 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="df305f8b-2b53-4032-875e-531accfd848e" containerName="sg-core" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.497378 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4"} err="failed to get container status \"689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4\": rpc error: code = NotFound desc = could not find container \"689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4\": container with ID starting with 689a0c562513aec20c969952086a8b0253177fc7d0a221e0a7662af095b3dbb4 not found: ID does not exist" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.506121 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="df305f8b-2b53-4032-875e-531accfd848e" containerName="proxy-httpd" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.509855 4766 scope.go:117] "RemoveContainer" containerID="96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.510471 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" containerName="glance-log" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.512373 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" containerName="glance-httpd" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.512420 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabeeea6-b022-49f4-b3db-3a7d83f29e51" containerName="glance-httpd" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.512455 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a0d201d-b9b8-49e0-b51a-9e187d4b4441" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.512464 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="df305f8b-2b53-4032-875e-531accfd848e" containerName="ceilometer-notification-agent" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.512474 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3061df7e-4dd6-4340-88af-67b6d9b3a6b7" containerName="mariadb-database-create" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.512488 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="dabeeea6-b022-49f4-b3db-3a7d83f29e51" containerName="glance-log" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.513740 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa"} err="failed to get container status \"96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa\": rpc error: 
code = NotFound desc = could not find container \"96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa\": container with ID starting with 96f3040afee524891bfa90b61dff5d5a83553fac31b913a57d2319bb852a56aa not found: ID does not exist" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.514297 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.514589 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.517966 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.518194 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.533114 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5lfs\" (UniqueName: \"kubernetes.io/projected/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-kube-api-access-t5lfs\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.533147 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df305f8b-2b53-4032-875e-531accfd848e-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.533156 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.533164 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.533229 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.533239 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.533247 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/df305f8b-2b53-4032-875e-531accfd848e-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.533255 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.533264 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvvbv\" (UniqueName: \"kubernetes.io/projected/df305f8b-2b53-4032-875e-531accfd848e-kube-api-access-tvvbv\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.555580 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") 
pod "ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" (UID: "ae9f1734-5cfa-41f0-baaf-0b4344a2a14a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.563696 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.615191 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" (UID: "ae9f1734-5cfa-41f0-baaf-0b4344a2a14a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.629667 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "df305f8b-2b53-4032-875e-531accfd848e" (UID: "df305f8b-2b53-4032-875e-531accfd848e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.635218 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-config-data\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.635441 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-scripts\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.635578 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flv52\" (UniqueName: \"kubernetes.io/projected/c3384223-2ad0-4593-976c-54c2d3cce52e-kube-api-access-flv52\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.635835 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3384223-2ad0-4593-976c-54c2d3cce52e-logs\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.635965 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3384223-2ad0-4593-976c-54c2d3cce52e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.636193 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.636676 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.636809 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.637011 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.637101 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.637261 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.637360 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.640386 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df305f8b-2b53-4032-875e-531accfd848e" (UID: "df305f8b-2b53-4032-875e-531accfd848e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.656766 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-config-data" (OuterVolumeSpecName: "config-data") pod "ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" (UID: "ae9f1734-5cfa-41f0-baaf-0b4344a2a14a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.668097 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-config-data" (OuterVolumeSpecName: "config-data") pod "df305f8b-2b53-4032-875e-531accfd848e" (UID: "df305f8b-2b53-4032-875e-531accfd848e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.739167 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.739251 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.739287 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-config-data\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.739311 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-scripts\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.739333 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flv52\" (UniqueName: \"kubernetes.io/projected/c3384223-2ad0-4593-976c-54c2d3cce52e-kube-api-access-flv52\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.739378 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3384223-2ad0-4593-976c-54c2d3cce52e-logs\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.739412 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3384223-2ad0-4593-976c-54c2d3cce52e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.739487 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.739562 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.739574 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/df305f8b-2b53-4032-875e-531accfd848e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.739585 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.740071 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.740306 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3384223-2ad0-4593-976c-54c2d3cce52e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.740092 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3384223-2ad0-4593-976c-54c2d3cce52e-logs\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.746017 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-scripts\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.749033 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.749139 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-config-data\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.749417 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.762299 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flv52\" (UniqueName: \"kubernetes.io/projected/c3384223-2ad0-4593-976c-54c2d3cce52e-kube-api-access-flv52\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.774305 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-0\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " pod="openstack/glance-default-external-api-0" Oct 02 11:15:29 crc kubenswrapper[4766]: I1002 11:15:29.892444 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dabeeea6-b022-49f4-b3db-3a7d83f29e51" path="/var/lib/kubelet/pods/dabeeea6-b022-49f4-b3db-3a7d83f29e51/volumes" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.023363 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.324751 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.335810 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"df305f8b-2b53-4032-875e-531accfd848e","Type":"ContainerDied","Data":"4086ea7d66729929e852dcecc41a64ec844392c1174bf398a4d072da1eac1b92"} Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.335871 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.335888 4766 scope.go:117] "RemoveContainer" containerID="174d21f546fbd57c86e95a7c8a925bf9b67f2f3123647b46a85db7d928818a01" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.339467 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ae9f1734-5cfa-41f0-baaf-0b4344a2a14a","Type":"ContainerDied","Data":"ff52492dca70c0637a46784a4a50c75da0154b86aba0b18922b2301999d13dd7"} Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.339561 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.392697 4766 scope.go:117] "RemoveContainer" containerID="e9a0593064a950c84f269116f2d77bb9eee0ad07be18d76c825c76a66b2f0d61" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.422860 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.453521 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.473293 4766 scope.go:117] "RemoveContainer" containerID="689985d395b0444577ec22ae046e000e380b42e62bce27e8d7f73375918a7d5e" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.479568 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.492561 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.524572 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.527450 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.535632 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7979dc8455-gmpjj"] Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.535937 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" podUID="e17e6f13-91a3-4632-9477-81fa3ca78af0" containerName="dnsmasq-dns" containerID="cri-o://d54c10723b7b6845713cb0b9bb5a326178cc22672b05226363388e5c0c74e0f4" gracePeriod=10 Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.546688 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.546971 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.547184 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.553026 4766 scope.go:117] "RemoveContainer" containerID="0aeb5be1e525e3f40e9ec10c464314f0ff3c45d280b7069dac2edf94d312453d" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.570231 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.578357 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.582014 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.582198 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.592895 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.603623 4766 scope.go:117] "RemoveContainer" containerID="d4bd3925e3ecf25a00db896208d504ed75f864301cbcf99f17fb47c2a0535383" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.652904 4766 scope.go:117] "RemoveContainer" containerID="025f10aac32ce0fdc683a046fa9324ba50ecb19358a70588497d2de63e2536bf" Oct 02 11:15:30 crc kubenswrapper[4766]: W1002 11:15:30.654078 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3384223_2ad0_4593_976c_54c2d3cce52e.slice/crio-2678f39c8d212c3ab839bf56d4dd3f9aa1f4b268f5e1cbe92ca7cb046f6d3082 WatchSource:0}: Error finding container 2678f39c8d212c3ab839bf56d4dd3f9aa1f4b268f5e1cbe92ca7cb046f6d3082: Status 404 returned error can't find the container with id 2678f39c8d212c3ab839bf56d4dd3f9aa1f4b268f5e1cbe92ca7cb046f6d3082 Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.661904 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.661990 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-lrh4w\" (UniqueName: \"kubernetes.io/projected/01ebdded-afd8-4298-bda8-cabdcad97e6a-kube-api-access-lrh4w\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.662014 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.662038 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-scripts\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.662056 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.662081 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01ebdded-afd8-4298-bda8-cabdcad97e6a-run-httpd\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.662114 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01ebdded-afd8-4298-bda8-cabdcad97e6a-log-httpd\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.662133 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.662153 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.662176 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.662210 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.662244 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-config-data\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.662280 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-logs\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.662305 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7x5z\" (UniqueName: \"kubernetes.io/projected/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-kube-api-access-t7x5z\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.662330 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.676131 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.763490 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrh4w\" (UniqueName: \"kubernetes.io/projected/01ebdded-afd8-4298-bda8-cabdcad97e6a-kube-api-access-lrh4w\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.763751 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.763779 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-scripts\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.763809 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.763825 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01ebdded-afd8-4298-bda8-cabdcad97e6a-run-httpd\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.763850 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01ebdded-afd8-4298-bda8-cabdcad97e6a-log-httpd\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.763876 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.763893 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.763913 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.763930 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.763953 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-config-data\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.763985 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-logs\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.764005 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7x5z\" (UniqueName: \"kubernetes.io/projected/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-kube-api-access-t7x5z\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.764032 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " 
pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.764099 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.764977 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.765343 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-logs\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.765846 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.768076 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01ebdded-afd8-4298-bda8-cabdcad97e6a-run-httpd\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.774529 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.777872 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01ebdded-afd8-4298-bda8-cabdcad97e6a-log-httpd\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.779770 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-config-data\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.788015 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.789146 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-scripts\") 
pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.791321 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.792834 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.795199 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.801242 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-scripts\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.801289 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrh4w\" (UniqueName: \"kubernetes.io/projected/01ebdded-afd8-4298-bda8-cabdcad97e6a-kube-api-access-lrh4w\") pod \"ceilometer-0\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") " pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.808843 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7x5z\" (UniqueName: \"kubernetes.io/projected/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-kube-api-access-t7x5z\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.819018 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " pod="openstack/glance-default-internal-api-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.877000 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:15:30 crc kubenswrapper[4766]: I1002 11:15:30.903923 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.179634 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.276564 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffd9s\" (UniqueName: \"kubernetes.io/projected/e17e6f13-91a3-4632-9477-81fa3ca78af0-kube-api-access-ffd9s\") pod \"e17e6f13-91a3-4632-9477-81fa3ca78af0\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.276658 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-config\") pod \"e17e6f13-91a3-4632-9477-81fa3ca78af0\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.276884 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-ovsdbserver-nb\") pod \"e17e6f13-91a3-4632-9477-81fa3ca78af0\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.276969 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-ovsdbserver-sb\") pod \"e17e6f13-91a3-4632-9477-81fa3ca78af0\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.276992 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-dns-swift-storage-0\") pod \"e17e6f13-91a3-4632-9477-81fa3ca78af0\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.277065 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-dns-svc\") pod \"e17e6f13-91a3-4632-9477-81fa3ca78af0\" (UID: \"e17e6f13-91a3-4632-9477-81fa3ca78af0\") " Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.296737 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17e6f13-91a3-4632-9477-81fa3ca78af0-kube-api-access-ffd9s" (OuterVolumeSpecName: "kube-api-access-ffd9s") pod "e17e6f13-91a3-4632-9477-81fa3ca78af0" (UID: "e17e6f13-91a3-4632-9477-81fa3ca78af0"). InnerVolumeSpecName "kube-api-access-ffd9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.333186 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e17e6f13-91a3-4632-9477-81fa3ca78af0" (UID: "e17e6f13-91a3-4632-9477-81fa3ca78af0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.359546 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e17e6f13-91a3-4632-9477-81fa3ca78af0" (UID: "e17e6f13-91a3-4632-9477-81fa3ca78af0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.363496 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-config" (OuterVolumeSpecName: "config") pod "e17e6f13-91a3-4632-9477-81fa3ca78af0" (UID: "e17e6f13-91a3-4632-9477-81fa3ca78af0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.369156 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e17e6f13-91a3-4632-9477-81fa3ca78af0" (UID: "e17e6f13-91a3-4632-9477-81fa3ca78af0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.379062 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffd9s\" (UniqueName: \"kubernetes.io/projected/e17e6f13-91a3-4632-9477-81fa3ca78af0-kube-api-access-ffd9s\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.379093 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.379103 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.379112 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.379120 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.386617 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e17e6f13-91a3-4632-9477-81fa3ca78af0" (UID: "e17e6f13-91a3-4632-9477-81fa3ca78af0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.390758 4766 generic.go:334] "Generic (PLEG): container finished" podID="e17e6f13-91a3-4632-9477-81fa3ca78af0" containerID="d54c10723b7b6845713cb0b9bb5a326178cc22672b05226363388e5c0c74e0f4" exitCode=0 Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.390817 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" event={"ID":"e17e6f13-91a3-4632-9477-81fa3ca78af0","Type":"ContainerDied","Data":"d54c10723b7b6845713cb0b9bb5a326178cc22672b05226363388e5c0c74e0f4"} Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.390843 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" event={"ID":"e17e6f13-91a3-4632-9477-81fa3ca78af0","Type":"ContainerDied","Data":"293ffaa790a41905a17e62b26a1b02bd2f2010f13d8dbe62f5d7cb6837e7995e"} Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.390860 4766 scope.go:117] "RemoveContainer" containerID="d54c10723b7b6845713cb0b9bb5a326178cc22672b05226363388e5c0c74e0f4" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.390969 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7979dc8455-gmpjj" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.396042 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3384223-2ad0-4593-976c-54c2d3cce52e","Type":"ContainerStarted","Data":"2678f39c8d212c3ab839bf56d4dd3f9aa1f4b268f5e1cbe92ca7cb046f6d3082"} Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.419440 4766 scope.go:117] "RemoveContainer" containerID="d496429221b83e5bdb0bd9206db1b115a9ef1902cef3fbeb0245b43237182a7f" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.461441 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-bccc-account-create-qv4hh"] Oct 02 11:15:31 crc kubenswrapper[4766]: E1002 11:15:31.461975 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17e6f13-91a3-4632-9477-81fa3ca78af0" containerName="init" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.461991 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17e6f13-91a3-4632-9477-81fa3ca78af0" containerName="init" Oct 02 11:15:31 crc kubenswrapper[4766]: E1002 11:15:31.462010 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17e6f13-91a3-4632-9477-81fa3ca78af0" containerName="dnsmasq-dns" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.462016 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17e6f13-91a3-4632-9477-81fa3ca78af0" containerName="dnsmasq-dns" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.462213 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e17e6f13-91a3-4632-9477-81fa3ca78af0" containerName="dnsmasq-dns" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.462908 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-bccc-account-create-qv4hh" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.469173 4766 scope.go:117] "RemoveContainer" containerID="d54c10723b7b6845713cb0b9bb5a326178cc22672b05226363388e5c0c74e0f4" Oct 02 11:15:31 crc kubenswrapper[4766]: E1002 11:15:31.469616 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d54c10723b7b6845713cb0b9bb5a326178cc22672b05226363388e5c0c74e0f4\": container with ID starting with d54c10723b7b6845713cb0b9bb5a326178cc22672b05226363388e5c0c74e0f4 not found: ID does not exist" containerID="d54c10723b7b6845713cb0b9bb5a326178cc22672b05226363388e5c0c74e0f4" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.469659 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d54c10723b7b6845713cb0b9bb5a326178cc22672b05226363388e5c0c74e0f4"} err="failed to get container status \"d54c10723b7b6845713cb0b9bb5a326178cc22672b05226363388e5c0c74e0f4\": rpc error: code = NotFound desc = could not find container \"d54c10723b7b6845713cb0b9bb5a326178cc22672b05226363388e5c0c74e0f4\": container with ID starting with d54c10723b7b6845713cb0b9bb5a326178cc22672b05226363388e5c0c74e0f4 not found: ID does not exist" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.469687 4766 scope.go:117] "RemoveContainer" containerID="d496429221b83e5bdb0bd9206db1b115a9ef1902cef3fbeb0245b43237182a7f" Oct 02 11:15:31 crc kubenswrapper[4766]: E1002 11:15:31.469882 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d496429221b83e5bdb0bd9206db1b115a9ef1902cef3fbeb0245b43237182a7f\": container with ID starting with d496429221b83e5bdb0bd9206db1b115a9ef1902cef3fbeb0245b43237182a7f not found: ID does not exist" containerID="d496429221b83e5bdb0bd9206db1b115a9ef1902cef3fbeb0245b43237182a7f" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.469897 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d496429221b83e5bdb0bd9206db1b115a9ef1902cef3fbeb0245b43237182a7f"} err="failed to get container status \"d496429221b83e5bdb0bd9206db1b115a9ef1902cef3fbeb0245b43237182a7f\": rpc error: code = NotFound desc = could not find container \"d496429221b83e5bdb0bd9206db1b115a9ef1902cef3fbeb0245b43237182a7f\": container with ID starting with d496429221b83e5bdb0bd9206db1b115a9ef1902cef3fbeb0245b43237182a7f not found: ID does not exist" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.470436 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-bccc-account-create-qv4hh"] Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.482548 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.485298 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7979dc8455-gmpjj"] Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.493254 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7979dc8455-gmpjj"] Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.493346 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e17e6f13-91a3-4632-9477-81fa3ca78af0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.527253 4766 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.582166 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:31 crc kubenswrapper[4766]: W1002 11:15:31.590941 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01ebdded_afd8_4298_bda8_cabdcad97e6a.slice/crio-b3a9abf5d0028a94b6888d044b989dd4d601e151b620244e9a6a343b7b6fbc92 WatchSource:0}: Error finding container b3a9abf5d0028a94b6888d044b989dd4d601e151b620244e9a6a343b7b6fbc92: Status 404 returned error can't find the container with id b3a9abf5d0028a94b6888d044b989dd4d601e151b620244e9a6a343b7b6fbc92 Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.591960 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.594782 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5wnf\" (UniqueName: \"kubernetes.io/projected/bc2e60b1-7e85-49d4-8efb-836891b9acba-kube-api-access-n5wnf\") pod \"nova-api-bccc-account-create-qv4hh\" (UID: \"bc2e60b1-7e85-49d4-8efb-836891b9acba\") " pod="openstack/nova-api-bccc-account-create-qv4hh" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.600462 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:15:31 crc kubenswrapper[4766]: W1002 11:15:31.603353 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d3cd6a1_f457_4e7c_93ce_fce8c4a1d598.slice/crio-999fb30da080219cb77954c8dd4c088483abadab4f8b242483e557bbf7b94ab3 WatchSource:0}: Error finding container 999fb30da080219cb77954c8dd4c088483abadab4f8b242483e557bbf7b94ab3: Status 404 returned error can't find the container with id 999fb30da080219cb77954c8dd4c088483abadab4f8b242483e557bbf7b94ab3 Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.653604 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9d41-account-create-gdwhr"] Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.654954 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9d41-account-create-gdwhr" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.658003 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.675352 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9d41-account-create-gdwhr"] Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.695866 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5wnf\" (UniqueName: \"kubernetes.io/projected/bc2e60b1-7e85-49d4-8efb-836891b9acba-kube-api-access-n5wnf\") pod \"nova-api-bccc-account-create-qv4hh\" (UID: \"bc2e60b1-7e85-49d4-8efb-836891b9acba\") " pod="openstack/nova-api-bccc-account-create-qv4hh" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.713920 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5wnf\" (UniqueName: \"kubernetes.io/projected/bc2e60b1-7e85-49d4-8efb-836891b9acba-kube-api-access-n5wnf\") pod \"nova-api-bccc-account-create-qv4hh\" (UID: \"bc2e60b1-7e85-49d4-8efb-836891b9acba\") " pod="openstack/nova-api-bccc-account-create-qv4hh" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.797497 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvfc9\" (UniqueName: \"kubernetes.io/projected/e870bd08-c21d-4537-b2bd-f19eff7e3877-kube-api-access-pvfc9\") pod \"nova-cell0-9d41-account-create-gdwhr\" (UID: \"e870bd08-c21d-4537-b2bd-f19eff7e3877\") " pod="openstack/nova-cell0-9d41-account-create-gdwhr" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.845952 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-6336-account-create-vcx9z"] Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.847497 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-6336-account-create-vcx9z" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.849488 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.859981 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6336-account-create-vcx9z"] Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.877845 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-bccc-account-create-qv4hh" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.896726 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae9f1734-5cfa-41f0-baaf-0b4344a2a14a" path="/var/lib/kubelet/pods/ae9f1734-5cfa-41f0-baaf-0b4344a2a14a/volumes" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.897456 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df305f8b-2b53-4032-875e-531accfd848e" path="/var/lib/kubelet/pods/df305f8b-2b53-4032-875e-531accfd848e/volumes" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.898696 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e17e6f13-91a3-4632-9477-81fa3ca78af0" path="/var/lib/kubelet/pods/e17e6f13-91a3-4632-9477-81fa3ca78af0/volumes" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.899185 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvfc9\" (UniqueName: \"kubernetes.io/projected/e870bd08-c21d-4537-b2bd-f19eff7e3877-kube-api-access-pvfc9\") pod \"nova-cell0-9d41-account-create-gdwhr\" (UID: \"e870bd08-c21d-4537-b2bd-f19eff7e3877\") " pod="openstack/nova-cell0-9d41-account-create-gdwhr" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.899358 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ddq8\" (UniqueName: \"kubernetes.io/projected/3089b483-f9df-4165-a28e-181e6134f8dc-kube-api-access-4ddq8\") pod \"nova-cell1-6336-account-create-vcx9z\" (UID: \"3089b483-f9df-4165-a28e-181e6134f8dc\") " pod="openstack/nova-cell1-6336-account-create-vcx9z" Oct 02 11:15:31 crc kubenswrapper[4766]: I1002 11:15:31.920199 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvfc9\" (UniqueName: \"kubernetes.io/projected/e870bd08-c21d-4537-b2bd-f19eff7e3877-kube-api-access-pvfc9\") pod \"nova-cell0-9d41-account-create-gdwhr\" (UID: \"e870bd08-c21d-4537-b2bd-f19eff7e3877\") " pod="openstack/nova-cell0-9d41-account-create-gdwhr" Oct 02 11:15:32 crc kubenswrapper[4766]: I1002 11:15:32.000472 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ddq8\" (UniqueName: \"kubernetes.io/projected/3089b483-f9df-4165-a28e-181e6134f8dc-kube-api-access-4ddq8\") pod \"nova-cell1-6336-account-create-vcx9z\" (UID: \"3089b483-f9df-4165-a28e-181e6134f8dc\") " pod="openstack/nova-cell1-6336-account-create-vcx9z" Oct 02 11:15:32 crc kubenswrapper[4766]: I1002 11:15:32.024000 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ddq8\" (UniqueName: \"kubernetes.io/projected/3089b483-f9df-4165-a28e-181e6134f8dc-kube-api-access-4ddq8\") pod \"nova-cell1-6336-account-create-vcx9z\" (UID: \"3089b483-f9df-4165-a28e-181e6134f8dc\") " pod="openstack/nova-cell1-6336-account-create-vcx9z" Oct 02 11:15:32 crc kubenswrapper[4766]: I1002 11:15:32.080068 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9d41-account-create-gdwhr" Oct 02 11:15:32 crc kubenswrapper[4766]: I1002 11:15:32.179093 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6336-account-create-vcx9z" Oct 02 11:15:32 crc kubenswrapper[4766]: I1002 11:15:32.388177 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-bccc-account-create-qv4hh"] Oct 02 11:15:32 crc kubenswrapper[4766]: W1002 11:15:32.396026 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc2e60b1_7e85_49d4_8efb_836891b9acba.slice/crio-8da83756144c52983e871064619f309e90d4695e2362d4be242ea90b7c182468 WatchSource:0}: Error finding container 8da83756144c52983e871064619f309e90d4695e2362d4be242ea90b7c182468: Status 404 returned error can't find the container with id 8da83756144c52983e871064619f309e90d4695e2362d4be242ea90b7c182468 Oct 02 11:15:32 crc kubenswrapper[4766]: I1002 11:15:32.426445 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01ebdded-afd8-4298-bda8-cabdcad97e6a","Type":"ContainerStarted","Data":"b3a9abf5d0028a94b6888d044b989dd4d601e151b620244e9a6a343b7b6fbc92"} Oct 02 11:15:32 crc kubenswrapper[4766]: I1002 11:15:32.431629 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598","Type":"ContainerStarted","Data":"999fb30da080219cb77954c8dd4c088483abadab4f8b242483e557bbf7b94ab3"} Oct 02 11:15:32 crc kubenswrapper[4766]: I1002 11:15:32.443575 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3384223-2ad0-4593-976c-54c2d3cce52e","Type":"ContainerStarted","Data":"f56abd7e1794b77ea62792a0bd79f484e63160d0da2a2b9689b20ab708f80f3a"} Oct 02 11:15:32 crc kubenswrapper[4766]: I1002 11:15:32.630771 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9d41-account-create-gdwhr"] Oct 02 11:15:32 crc kubenswrapper[4766]: I1002 11:15:32.868670 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-6336-account-create-vcx9z"] Oct 02 11:15:32 crc kubenswrapper[4766]: W1002 11:15:32.875950 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3089b483_f9df_4165_a28e_181e6134f8dc.slice/crio-b03fa77e6c5da1ed19c10b903ad7d864097ebd381667c233c1aed1a7b8b2bc64 WatchSource:0}: Error finding container b03fa77e6c5da1ed19c10b903ad7d864097ebd381667c233c1aed1a7b8b2bc64: Status 404 returned error can't find the container with id b03fa77e6c5da1ed19c10b903ad7d864097ebd381667c233c1aed1a7b8b2bc64 Oct 02 11:15:33 crc kubenswrapper[4766]: I1002 11:15:33.453346 4766 generic.go:334] "Generic (PLEG): container finished" podID="e870bd08-c21d-4537-b2bd-f19eff7e3877" containerID="3165e7e8bf3173b47ec39b7859d2841e0ef5c79cdc3391e216907ef6acd85df3" exitCode=0 Oct 02 11:15:33 crc kubenswrapper[4766]: I1002 11:15:33.453449 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9d41-account-create-gdwhr" event={"ID":"e870bd08-c21d-4537-b2bd-f19eff7e3877","Type":"ContainerDied","Data":"3165e7e8bf3173b47ec39b7859d2841e0ef5c79cdc3391e216907ef6acd85df3"} Oct 02 11:15:33 crc kubenswrapper[4766]: I1002 11:15:33.454034 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9d41-account-create-gdwhr" event={"ID":"e870bd08-c21d-4537-b2bd-f19eff7e3877","Type":"ContainerStarted","Data":"f7f9c2130a1b39451722570dee574de9c14ea5e082843a8ae8296d435edb0d6d"} Oct 02 11:15:33 crc kubenswrapper[4766]: I1002 
11:15:33.456095 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3384223-2ad0-4593-976c-54c2d3cce52e","Type":"ContainerStarted","Data":"fb1c0454ba668b962552208833c616de4db07019cb885d1cc5bdc0cc294a91b5"} Oct 02 11:15:33 crc kubenswrapper[4766]: I1002 11:15:33.457524 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01ebdded-afd8-4298-bda8-cabdcad97e6a","Type":"ContainerStarted","Data":"9cc03015b720453337fe827785770a92bf5d0bec822dc8d32cf2488360220049"} Oct 02 11:15:33 crc kubenswrapper[4766]: I1002 11:15:33.459162 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598","Type":"ContainerStarted","Data":"1e54d0211edb5bd37091707d979fb377b49c8ac9d64e30887c84f1c90a8d9682"} Oct 02 11:15:33 crc kubenswrapper[4766]: I1002 11:15:33.459209 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598","Type":"ContainerStarted","Data":"4a4f512030ce39ec49c91ab00c8b625423cbf01429bd4a06c2d5b12c2bf0ca55"} Oct 02 11:15:33 crc kubenswrapper[4766]: I1002 11:15:33.461247 4766 generic.go:334] "Generic (PLEG): container finished" podID="bc2e60b1-7e85-49d4-8efb-836891b9acba" containerID="0e158a295e0a51624444cffd5cf073f04a1ecabdd07a8ff50f1a37d13288e8a2" exitCode=0 Oct 02 11:15:33 crc kubenswrapper[4766]: I1002 11:15:33.461293 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bccc-account-create-qv4hh" event={"ID":"bc2e60b1-7e85-49d4-8efb-836891b9acba","Type":"ContainerDied","Data":"0e158a295e0a51624444cffd5cf073f04a1ecabdd07a8ff50f1a37d13288e8a2"} Oct 02 11:15:33 crc kubenswrapper[4766]: I1002 11:15:33.461312 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bccc-account-create-qv4hh" event={"ID":"bc2e60b1-7e85-49d4-8efb-836891b9acba","Type":"ContainerStarted","Data":"8da83756144c52983e871064619f309e90d4695e2362d4be242ea90b7c182468"} Oct 02 11:15:33 crc kubenswrapper[4766]: I1002 11:15:33.462925 4766 generic.go:334] "Generic (PLEG): container finished" podID="3089b483-f9df-4165-a28e-181e6134f8dc" containerID="7c63b673954ab2d5062dc5c8c71c3cfb5b1fdbffd48e5e8ae4259cb2050e9614" exitCode=0 Oct 02 11:15:33 crc kubenswrapper[4766]: I1002 11:15:33.462979 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6336-account-create-vcx9z" event={"ID":"3089b483-f9df-4165-a28e-181e6134f8dc","Type":"ContainerDied","Data":"7c63b673954ab2d5062dc5c8c71c3cfb5b1fdbffd48e5e8ae4259cb2050e9614"} Oct 02 11:15:33 crc kubenswrapper[4766]: I1002 11:15:33.463010 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6336-account-create-vcx9z" event={"ID":"3089b483-f9df-4165-a28e-181e6134f8dc","Type":"ContainerStarted","Data":"b03fa77e6c5da1ed19c10b903ad7d864097ebd381667c233c1aed1a7b8b2bc64"} Oct 02 11:15:33 crc kubenswrapper[4766]: I1002 11:15:33.511712 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.511692639 podStartE2EDuration="4.511692639s" podCreationTimestamp="2025-10-02 11:15:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:33.500902745 +0000 UTC m=+1448.443773689" watchObservedRunningTime="2025-10-02 11:15:33.511692639 +0000 UTC 
m=+1448.454563583" Oct 02 11:15:33 crc kubenswrapper[4766]: I1002 11:15:33.611480 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.611459907 podStartE2EDuration="3.611459907s" podCreationTimestamp="2025-10-02 11:15:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:33.557002227 +0000 UTC m=+1448.499873171" watchObservedRunningTime="2025-10-02 11:15:33.611459907 +0000 UTC m=+1448.554330851" Oct 02 11:15:33 crc kubenswrapper[4766]: I1002 11:15:33.903408 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 02 11:15:34 crc kubenswrapper[4766]: I1002 11:15:34.483810 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01ebdded-afd8-4298-bda8-cabdcad97e6a","Type":"ContainerStarted","Data":"37cf881a0c6b2efceae3bbbbead8e6f6a82fd84789fa2ae98fa263fdf594c0b2"} Oct 02 11:15:34 crc kubenswrapper[4766]: I1002 11:15:34.484220 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01ebdded-afd8-4298-bda8-cabdcad97e6a","Type":"ContainerStarted","Data":"43fc61b65709eff146fcb43653654afc9f4e595c3fa3189996f824ee082546a4"} Oct 02 11:15:34 crc kubenswrapper[4766]: I1002 11:15:34.995832 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9d41-account-create-gdwhr" Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.060810 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvfc9\" (UniqueName: \"kubernetes.io/projected/e870bd08-c21d-4537-b2bd-f19eff7e3877-kube-api-access-pvfc9\") pod \"e870bd08-c21d-4537-b2bd-f19eff7e3877\" (UID: \"e870bd08-c21d-4537-b2bd-f19eff7e3877\") " Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.069532 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e870bd08-c21d-4537-b2bd-f19eff7e3877-kube-api-access-pvfc9" (OuterVolumeSpecName: "kube-api-access-pvfc9") pod "e870bd08-c21d-4537-b2bd-f19eff7e3877" (UID: "e870bd08-c21d-4537-b2bd-f19eff7e3877"). InnerVolumeSpecName "kube-api-access-pvfc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.146893 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bccc-account-create-qv4hh" Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.151817 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6336-account-create-vcx9z" Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.162571 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvfc9\" (UniqueName: \"kubernetes.io/projected/e870bd08-c21d-4537-b2bd-f19eff7e3877-kube-api-access-pvfc9\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.264059 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5wnf\" (UniqueName: \"kubernetes.io/projected/bc2e60b1-7e85-49d4-8efb-836891b9acba-kube-api-access-n5wnf\") pod \"bc2e60b1-7e85-49d4-8efb-836891b9acba\" (UID: \"bc2e60b1-7e85-49d4-8efb-836891b9acba\") " Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.264251 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ddq8\" (UniqueName: \"kubernetes.io/projected/3089b483-f9df-4165-a28e-181e6134f8dc-kube-api-access-4ddq8\") pod \"3089b483-f9df-4165-a28e-181e6134f8dc\" (UID: \"3089b483-f9df-4165-a28e-181e6134f8dc\") " Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.267311 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc2e60b1-7e85-49d4-8efb-836891b9acba-kube-api-access-n5wnf" (OuterVolumeSpecName: "kube-api-access-n5wnf") pod "bc2e60b1-7e85-49d4-8efb-836891b9acba" (UID: "bc2e60b1-7e85-49d4-8efb-836891b9acba"). InnerVolumeSpecName "kube-api-access-n5wnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.268152 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3089b483-f9df-4165-a28e-181e6134f8dc-kube-api-access-4ddq8" (OuterVolumeSpecName: "kube-api-access-4ddq8") pod "3089b483-f9df-4165-a28e-181e6134f8dc" (UID: "3089b483-f9df-4165-a28e-181e6134f8dc"). InnerVolumeSpecName "kube-api-access-4ddq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.366777 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ddq8\" (UniqueName: \"kubernetes.io/projected/3089b483-f9df-4165-a28e-181e6134f8dc-kube-api-access-4ddq8\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.366817 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5wnf\" (UniqueName: \"kubernetes.io/projected/bc2e60b1-7e85-49d4-8efb-836891b9acba-kube-api-access-n5wnf\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.499244 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9d41-account-create-gdwhr" Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.499254 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9d41-account-create-gdwhr" event={"ID":"e870bd08-c21d-4537-b2bd-f19eff7e3877","Type":"ContainerDied","Data":"f7f9c2130a1b39451722570dee574de9c14ea5e082843a8ae8296d435edb0d6d"} Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.499390 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7f9c2130a1b39451722570dee574de9c14ea5e082843a8ae8296d435edb0d6d" Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.501474 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-bccc-account-create-qv4hh" event={"ID":"bc2e60b1-7e85-49d4-8efb-836891b9acba","Type":"ContainerDied","Data":"8da83756144c52983e871064619f309e90d4695e2362d4be242ea90b7c182468"} Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.501535 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8da83756144c52983e871064619f309e90d4695e2362d4be242ea90b7c182468" Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.501602 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-bccc-account-create-qv4hh" Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.503747 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-6336-account-create-vcx9z" event={"ID":"3089b483-f9df-4165-a28e-181e6134f8dc","Type":"ContainerDied","Data":"b03fa77e6c5da1ed19c10b903ad7d864097ebd381667c233c1aed1a7b8b2bc64"} Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.503786 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b03fa77e6c5da1ed19c10b903ad7d864097ebd381667c233c1aed1a7b8b2bc64" Oct 02 11:15:35 crc kubenswrapper[4766]: I1002 11:15:35.503803 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-6336-account-create-vcx9z" Oct 02 11:15:36 crc kubenswrapper[4766]: I1002 11:15:36.515327 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01ebdded-afd8-4298-bda8-cabdcad97e6a","Type":"ContainerStarted","Data":"ab458a23d67d2f5ed976e731085c7e2807e5bf2c8018086e53a405a16e823478"} Oct 02 11:15:36 crc kubenswrapper[4766]: I1002 11:15:36.515814 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:15:36 crc kubenswrapper[4766]: I1002 11:15:36.515537 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerName="sg-core" containerID="cri-o://37cf881a0c6b2efceae3bbbbead8e6f6a82fd84789fa2ae98fa263fdf594c0b2" gracePeriod=30 Oct 02 11:15:36 crc kubenswrapper[4766]: I1002 11:15:36.515469 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerName="ceilometer-central-agent" containerID="cri-o://9cc03015b720453337fe827785770a92bf5d0bec822dc8d32cf2488360220049" gracePeriod=30 Oct 02 11:15:36 crc kubenswrapper[4766]: I1002 11:15:36.515825 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerName="ceilometer-notification-agent" containerID="cri-o://43fc61b65709eff146fcb43653654afc9f4e595c3fa3189996f824ee082546a4" gracePeriod=30 Oct 02 11:15:36 crc kubenswrapper[4766]: I1002 11:15:36.515556 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerName="proxy-httpd" containerID="cri-o://ab458a23d67d2f5ed976e731085c7e2807e5bf2c8018086e53a405a16e823478" gracePeriod=30 Oct 02 11:15:36 crc kubenswrapper[4766]: I1002 11:15:36.573577 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.524049055 podStartE2EDuration="6.573556054s" podCreationTimestamp="2025-10-02 11:15:30 +0000 UTC" firstStartedPulling="2025-10-02 11:15:31.600249821 +0000 UTC m=+1446.543120765" lastFinishedPulling="2025-10-02 11:15:35.64975682 +0000 UTC m=+1450.592627764" observedRunningTime="2025-10-02 11:15:36.565765905 +0000 UTC m=+1451.508636849" watchObservedRunningTime="2025-10-02 11:15:36.573556054 +0000 UTC m=+1451.516426988" Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.026210 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4gmwl"] Oct 02 11:15:37 crc kubenswrapper[4766]: E1002 11:15:37.026961 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc2e60b1-7e85-49d4-8efb-836891b9acba" containerName="mariadb-account-create" Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.026982 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc2e60b1-7e85-49d4-8efb-836891b9acba" containerName="mariadb-account-create" Oct 02 11:15:37 crc kubenswrapper[4766]: E1002 11:15:37.027039 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e870bd08-c21d-4537-b2bd-f19eff7e3877" containerName="mariadb-account-create" Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.027049 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e870bd08-c21d-4537-b2bd-f19eff7e3877" containerName="mariadb-account-create" Oct 02 11:15:37 crc 
Oct 02 11:15:37 crc kubenswrapper[4766]: E1002 11:15:37.027107 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3089b483-f9df-4165-a28e-181e6134f8dc" containerName="mariadb-account-create"
Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.027117 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3089b483-f9df-4165-a28e-181e6134f8dc" containerName="mariadb-account-create"
Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.027423 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e870bd08-c21d-4537-b2bd-f19eff7e3877" containerName="mariadb-account-create"
Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.027451 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3089b483-f9df-4165-a28e-181e6134f8dc" containerName="mariadb-account-create"
Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.027490 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc2e60b1-7e85-49d4-8efb-836891b9acba" containerName="mariadb-account-create"
Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.028508 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4gmwl"
Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.030515 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-44h9r"
Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.030807 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.030948 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.038671 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4gmwl"]
Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.107740 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jtz9\" (UniqueName: \"kubernetes.io/projected/c803d467-a739-40aa-9dc9-4f04e6e14923-kube-api-access-7jtz9\") pod \"nova-cell0-conductor-db-sync-4gmwl\" (UID: \"c803d467-a739-40aa-9dc9-4f04e6e14923\") " pod="openstack/nova-cell0-conductor-db-sync-4gmwl"
Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.107813 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-config-data\") pod \"nova-cell0-conductor-db-sync-4gmwl\" (UID: \"c803d467-a739-40aa-9dc9-4f04e6e14923\") " pod="openstack/nova-cell0-conductor-db-sync-4gmwl"
Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.107836 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4gmwl\" (UID: \"c803d467-a739-40aa-9dc9-4f04e6e14923\") " pod="openstack/nova-cell0-conductor-db-sync-4gmwl"
pod="openstack/nova-cell0-conductor-db-sync-4gmwl" Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.210565 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jtz9\" (UniqueName: \"kubernetes.io/projected/c803d467-a739-40aa-9dc9-4f04e6e14923-kube-api-access-7jtz9\") pod \"nova-cell0-conductor-db-sync-4gmwl\" (UID: \"c803d467-a739-40aa-9dc9-4f04e6e14923\") " pod="openstack/nova-cell0-conductor-db-sync-4gmwl" Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.210645 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-config-data\") pod \"nova-cell0-conductor-db-sync-4gmwl\" (UID: \"c803d467-a739-40aa-9dc9-4f04e6e14923\") " pod="openstack/nova-cell0-conductor-db-sync-4gmwl" Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.210672 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4gmwl\" (UID: \"c803d467-a739-40aa-9dc9-4f04e6e14923\") " pod="openstack/nova-cell0-conductor-db-sync-4gmwl" Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.210751 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-scripts\") pod \"nova-cell0-conductor-db-sync-4gmwl\" (UID: \"c803d467-a739-40aa-9dc9-4f04e6e14923\") " pod="openstack/nova-cell0-conductor-db-sync-4gmwl" Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.215740 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-scripts\") pod \"nova-cell0-conductor-db-sync-4gmwl\" (UID: \"c803d467-a739-40aa-9dc9-4f04e6e14923\") " pod="openstack/nova-cell0-conductor-db-sync-4gmwl" Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.216218 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-config-data\") pod \"nova-cell0-conductor-db-sync-4gmwl\" (UID: \"c803d467-a739-40aa-9dc9-4f04e6e14923\") " pod="openstack/nova-cell0-conductor-db-sync-4gmwl" Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.219015 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-4gmwl\" (UID: \"c803d467-a739-40aa-9dc9-4f04e6e14923\") " pod="openstack/nova-cell0-conductor-db-sync-4gmwl" Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.229928 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jtz9\" (UniqueName: \"kubernetes.io/projected/c803d467-a739-40aa-9dc9-4f04e6e14923-kube-api-access-7jtz9\") pod \"nova-cell0-conductor-db-sync-4gmwl\" (UID: \"c803d467-a739-40aa-9dc9-4f04e6e14923\") " pod="openstack/nova-cell0-conductor-db-sync-4gmwl" Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.413089 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4gmwl" Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.528651 4766 generic.go:334] "Generic (PLEG): container finished" podID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerID="ab458a23d67d2f5ed976e731085c7e2807e5bf2c8018086e53a405a16e823478" exitCode=0 Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.528896 4766 generic.go:334] "Generic (PLEG): container finished" podID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerID="37cf881a0c6b2efceae3bbbbead8e6f6a82fd84789fa2ae98fa263fdf594c0b2" exitCode=2 Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.528906 4766 generic.go:334] "Generic (PLEG): container finished" podID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerID="43fc61b65709eff146fcb43653654afc9f4e595c3fa3189996f824ee082546a4" exitCode=0 Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.528926 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01ebdded-afd8-4298-bda8-cabdcad97e6a","Type":"ContainerDied","Data":"ab458a23d67d2f5ed976e731085c7e2807e5bf2c8018086e53a405a16e823478"} Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.528949 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01ebdded-afd8-4298-bda8-cabdcad97e6a","Type":"ContainerDied","Data":"37cf881a0c6b2efceae3bbbbead8e6f6a82fd84789fa2ae98fa263fdf594c0b2"} Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.528960 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01ebdded-afd8-4298-bda8-cabdcad97e6a","Type":"ContainerDied","Data":"43fc61b65709eff146fcb43653654afc9f4e595c3fa3189996f824ee082546a4"} Oct 02 11:15:37 crc kubenswrapper[4766]: I1002 11:15:37.897256 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4gmwl"] Oct 02 11:15:37 crc kubenswrapper[4766]: W1002 11:15:37.917672 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc803d467_a739_40aa_9dc9_4f04e6e14923.slice/crio-fa9052e62690f95e2191cf268e897bb0a1e860a19cd2ec9edf0765924cc3e565 WatchSource:0}: Error finding container fa9052e62690f95e2191cf268e897bb0a1e860a19cd2ec9edf0765924cc3e565: Status 404 returned error can't find the container with id fa9052e62690f95e2191cf268e897bb0a1e860a19cd2ec9edf0765924cc3e565 Oct 02 11:15:38 crc kubenswrapper[4766]: I1002 11:15:38.541132 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4gmwl" event={"ID":"c803d467-a739-40aa-9dc9-4f04e6e14923","Type":"ContainerStarted","Data":"fa9052e62690f95e2191cf268e897bb0a1e860a19cd2ec9edf0765924cc3e565"} Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.023654 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.024018 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.059372 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.073242 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.125714 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.167112 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01ebdded-afd8-4298-bda8-cabdcad97e6a-run-httpd\") pod \"01ebdded-afd8-4298-bda8-cabdcad97e6a\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") "
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.167230 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-scripts\") pod \"01ebdded-afd8-4298-bda8-cabdcad97e6a\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") "
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.167251 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-config-data\") pod \"01ebdded-afd8-4298-bda8-cabdcad97e6a\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") "
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.167276 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-combined-ca-bundle\") pod \"01ebdded-afd8-4298-bda8-cabdcad97e6a\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") "
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.167320 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-sg-core-conf-yaml\") pod \"01ebdded-afd8-4298-bda8-cabdcad97e6a\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") "
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.167347 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrh4w\" (UniqueName: \"kubernetes.io/projected/01ebdded-afd8-4298-bda8-cabdcad97e6a-kube-api-access-lrh4w\") pod \"01ebdded-afd8-4298-bda8-cabdcad97e6a\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") "
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.167464 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01ebdded-afd8-4298-bda8-cabdcad97e6a-log-httpd\") pod \"01ebdded-afd8-4298-bda8-cabdcad97e6a\" (UID: \"01ebdded-afd8-4298-bda8-cabdcad97e6a\") "
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.168652 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01ebdded-afd8-4298-bda8-cabdcad97e6a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "01ebdded-afd8-4298-bda8-cabdcad97e6a" (UID: "01ebdded-afd8-4298-bda8-cabdcad97e6a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.176410 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-scripts" (OuterVolumeSpecName: "scripts") pod "01ebdded-afd8-4298-bda8-cabdcad97e6a" (UID: "01ebdded-afd8-4298-bda8-cabdcad97e6a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.178061 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ebdded-afd8-4298-bda8-cabdcad97e6a-kube-api-access-lrh4w" (OuterVolumeSpecName: "kube-api-access-lrh4w") pod "01ebdded-afd8-4298-bda8-cabdcad97e6a" (UID: "01ebdded-afd8-4298-bda8-cabdcad97e6a"). InnerVolumeSpecName "kube-api-access-lrh4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.269318 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01ebdded-afd8-4298-bda8-cabdcad97e6a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.269347 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01ebdded-afd8-4298-bda8-cabdcad97e6a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.269356 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.269365 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrh4w\" (UniqueName: \"kubernetes.io/projected/01ebdded-afd8-4298-bda8-cabdcad97e6a-kube-api-access-lrh4w\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.272652 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "01ebdded-afd8-4298-bda8-cabdcad97e6a" (UID: "01ebdded-afd8-4298-bda8-cabdcad97e6a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.341750 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-config-data" (OuterVolumeSpecName: "config-data") pod "01ebdded-afd8-4298-bda8-cabdcad97e6a" (UID: "01ebdded-afd8-4298-bda8-cabdcad97e6a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.370884 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.370932 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.390841 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01ebdded-afd8-4298-bda8-cabdcad97e6a" (UID: "01ebdded-afd8-4298-bda8-cabdcad97e6a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.472340 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01ebdded-afd8-4298-bda8-cabdcad97e6a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.571731 4766 generic.go:334] "Generic (PLEG): container finished" podID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerID="9cc03015b720453337fe827785770a92bf5d0bec822dc8d32cf2488360220049" exitCode=0 Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.571832 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01ebdded-afd8-4298-bda8-cabdcad97e6a","Type":"ContainerDied","Data":"9cc03015b720453337fe827785770a92bf5d0bec822dc8d32cf2488360220049"} Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.571902 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01ebdded-afd8-4298-bda8-cabdcad97e6a","Type":"ContainerDied","Data":"b3a9abf5d0028a94b6888d044b989dd4d601e151b620244e9a6a343b7b6fbc92"} Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.571925 4766 scope.go:117] "RemoveContainer" containerID="ab458a23d67d2f5ed976e731085c7e2807e5bf2c8018086e53a405a16e823478" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.571842 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.576558 4766 generic.go:334] "Generic (PLEG): container finished" podID="69a16b06-e649-4c66-94e9-7cda4fb8c135" containerID="72397d872ce507fa92e345865588eec799e5986148feeabfabfecfa3853620de" exitCode=137 Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.576624 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"69a16b06-e649-4c66-94e9-7cda4fb8c135","Type":"ContainerDied","Data":"72397d872ce507fa92e345865588eec799e5986148feeabfabfecfa3853620de"} Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.577230 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.577447 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.602807 4766 scope.go:117] "RemoveContainer" containerID="37cf881a0c6b2efceae3bbbbead8e6f6a82fd84789fa2ae98fa263fdf594c0b2" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.609833 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.621380 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.639341 4766 scope.go:117] "RemoveContainer" containerID="43fc61b65709eff146fcb43653654afc9f4e595c3fa3189996f824ee082546a4" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.643563 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:40 crc kubenswrapper[4766]: E1002 11:15:40.643922 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerName="proxy-httpd" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.643934 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerName="proxy-httpd" Oct 02 11:15:40 crc kubenswrapper[4766]: E1002 11:15:40.643949 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerName="sg-core" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.643955 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerName="sg-core" Oct 02 11:15:40 crc kubenswrapper[4766]: E1002 11:15:40.643963 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerName="ceilometer-notification-agent" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.643970 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerName="ceilometer-notification-agent" Oct 02 11:15:40 crc kubenswrapper[4766]: E1002 11:15:40.643998 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerName="ceilometer-central-agent" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.644004 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerName="ceilometer-central-agent" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.644277 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" 
containerName="proxy-httpd" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.644297 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerName="ceilometer-central-agent" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.644311 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerName="sg-core" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.644320 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" containerName="ceilometer-notification-agent" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.647046 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.653559 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.653754 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.670226 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.746674 4766 scope.go:117] "RemoveContainer" containerID="9cc03015b720453337fe827785770a92bf5d0bec822dc8d32cf2488360220049" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.791083 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8344bd0-6491-40b6-8b75-dd9731647d98-log-httpd\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.791579 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.791644 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-config-data\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.791738 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-scripts\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.791906 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8344bd0-6491-40b6-8b75-dd9731647d98-run-httpd\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.792025 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.792075 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjg2p\" (UniqueName: \"kubernetes.io/projected/c8344bd0-6491-40b6-8b75-dd9731647d98-kube-api-access-pjg2p\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.884143 4766 scope.go:117] "RemoveContainer" containerID="ab458a23d67d2f5ed976e731085c7e2807e5bf2c8018086e53a405a16e823478" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.885230 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:15:40 crc kubenswrapper[4766]: E1002 11:15:40.885609 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab458a23d67d2f5ed976e731085c7e2807e5bf2c8018086e53a405a16e823478\": container with ID starting with ab458a23d67d2f5ed976e731085c7e2807e5bf2c8018086e53a405a16e823478 not found: ID does not exist" containerID="ab458a23d67d2f5ed976e731085c7e2807e5bf2c8018086e53a405a16e823478" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.885639 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab458a23d67d2f5ed976e731085c7e2807e5bf2c8018086e53a405a16e823478"} err="failed to get container status \"ab458a23d67d2f5ed976e731085c7e2807e5bf2c8018086e53a405a16e823478\": rpc error: code = NotFound desc = could not find container \"ab458a23d67d2f5ed976e731085c7e2807e5bf2c8018086e53a405a16e823478\": container with ID starting with ab458a23d67d2f5ed976e731085c7e2807e5bf2c8018086e53a405a16e823478 not found: ID does not exist" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.885665 4766 scope.go:117] "RemoveContainer" containerID="37cf881a0c6b2efceae3bbbbead8e6f6a82fd84789fa2ae98fa263fdf594c0b2" Oct 02 11:15:40 crc kubenswrapper[4766]: E1002 11:15:40.886836 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37cf881a0c6b2efceae3bbbbead8e6f6a82fd84789fa2ae98fa263fdf594c0b2\": container with ID starting with 37cf881a0c6b2efceae3bbbbead8e6f6a82fd84789fa2ae98fa263fdf594c0b2 not found: ID does not exist" containerID="37cf881a0c6b2efceae3bbbbead8e6f6a82fd84789fa2ae98fa263fdf594c0b2" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.886898 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37cf881a0c6b2efceae3bbbbead8e6f6a82fd84789fa2ae98fa263fdf594c0b2"} err="failed to get container status \"37cf881a0c6b2efceae3bbbbead8e6f6a82fd84789fa2ae98fa263fdf594c0b2\": rpc error: code = NotFound desc = could not find container \"37cf881a0c6b2efceae3bbbbead8e6f6a82fd84789fa2ae98fa263fdf594c0b2\": container with ID starting with 37cf881a0c6b2efceae3bbbbead8e6f6a82fd84789fa2ae98fa263fdf594c0b2 not found: ID does not exist" Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.886919 4766 scope.go:117] "RemoveContainer" containerID="43fc61b65709eff146fcb43653654afc9f4e595c3fa3189996f824ee082546a4" Oct 02 11:15:40 crc kubenswrapper[4766]: E1002 11:15:40.888797 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
Oct 02 11:15:40 crc kubenswrapper[4766]: E1002 11:15:40.888797 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43fc61b65709eff146fcb43653654afc9f4e595c3fa3189996f824ee082546a4\": container with ID starting with 43fc61b65709eff146fcb43653654afc9f4e595c3fa3189996f824ee082546a4 not found: ID does not exist" containerID="43fc61b65709eff146fcb43653654afc9f4e595c3fa3189996f824ee082546a4"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.888838 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43fc61b65709eff146fcb43653654afc9f4e595c3fa3189996f824ee082546a4"} err="failed to get container status \"43fc61b65709eff146fcb43653654afc9f4e595c3fa3189996f824ee082546a4\": rpc error: code = NotFound desc = could not find container \"43fc61b65709eff146fcb43653654afc9f4e595c3fa3189996f824ee082546a4\": container with ID starting with 43fc61b65709eff146fcb43653654afc9f4e595c3fa3189996f824ee082546a4 not found: ID does not exist"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.888862 4766 scope.go:117] "RemoveContainer" containerID="9cc03015b720453337fe827785770a92bf5d0bec822dc8d32cf2488360220049"
Oct 02 11:15:40 crc kubenswrapper[4766]: E1002 11:15:40.889133 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cc03015b720453337fe827785770a92bf5d0bec822dc8d32cf2488360220049\": container with ID starting with 9cc03015b720453337fe827785770a92bf5d0bec822dc8d32cf2488360220049 not found: ID does not exist" containerID="9cc03015b720453337fe827785770a92bf5d0bec822dc8d32cf2488360220049"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.889163 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc03015b720453337fe827785770a92bf5d0bec822dc8d32cf2488360220049"} err="failed to get container status \"9cc03015b720453337fe827785770a92bf5d0bec822dc8d32cf2488360220049\": rpc error: code = NotFound desc = could not find container \"9cc03015b720453337fe827785770a92bf5d0bec822dc8d32cf2488360220049\": container with ID starting with 9cc03015b720453337fe827785770a92bf5d0bec822dc8d32cf2488360220049 not found: ID does not exist"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.895069 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8344bd0-6491-40b6-8b75-dd9731647d98-log-httpd\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.895163 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.895228 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-config-data\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.895312 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-scripts\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.895413 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8344bd0-6491-40b6-8b75-dd9731647d98-run-httpd\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.895489 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.895556 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjg2p\" (UniqueName: \"kubernetes.io/projected/c8344bd0-6491-40b6-8b75-dd9731647d98-kube-api-access-pjg2p\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.896443 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8344bd0-6491-40b6-8b75-dd9731647d98-log-httpd\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.897575 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8344bd0-6491-40b6-8b75-dd9731647d98-run-httpd\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.904584 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-config-data\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.906175 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.907582 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.907631 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.914141 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.925398 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-scripts\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.950181 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjg2p\" (UniqueName: \"kubernetes.io/projected/c8344bd0-6491-40b6-8b75-dd9731647d98-kube-api-access-pjg2p\") pod \"ceilometer-0\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " pod="openstack/ceilometer-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.964901 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.981912 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.999687 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-config-data-custom\") pod \"69a16b06-e649-4c66-94e9-7cda4fb8c135\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") "
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.999722 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/69a16b06-e649-4c66-94e9-7cda4fb8c135-etc-machine-id\") pod \"69a16b06-e649-4c66-94e9-7cda4fb8c135\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") "
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.999785 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69a16b06-e649-4c66-94e9-7cda4fb8c135-logs\") pod \"69a16b06-e649-4c66-94e9-7cda4fb8c135\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") "
Oct 02 11:15:40 crc kubenswrapper[4766]: I1002 11:15:40.999805 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-scripts\") pod \"69a16b06-e649-4c66-94e9-7cda4fb8c135\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") "
Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:40.999913 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-combined-ca-bundle\") pod \"69a16b06-e649-4c66-94e9-7cda4fb8c135\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") "
Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:40.999965 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp9kl\" (UniqueName: \"kubernetes.io/projected/69a16b06-e649-4c66-94e9-7cda4fb8c135-kube-api-access-sp9kl\") pod \"69a16b06-e649-4c66-94e9-7cda4fb8c135\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") "
Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.000000 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-config-data\") pod \"69a16b06-e649-4c66-94e9-7cda4fb8c135\" (UID: \"69a16b06-e649-4c66-94e9-7cda4fb8c135\") "
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.000100 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69a16b06-e649-4c66-94e9-7cda4fb8c135-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "69a16b06-e649-4c66-94e9-7cda4fb8c135" (UID: "69a16b06-e649-4c66-94e9-7cda4fb8c135"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.000938 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a16b06-e649-4c66-94e9-7cda4fb8c135-logs" (OuterVolumeSpecName: "logs") pod "69a16b06-e649-4c66-94e9-7cda4fb8c135" (UID: "69a16b06-e649-4c66-94e9-7cda4fb8c135"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.004586 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-scripts" (OuterVolumeSpecName: "scripts") pod "69a16b06-e649-4c66-94e9-7cda4fb8c135" (UID: "69a16b06-e649-4c66-94e9-7cda4fb8c135"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.015050 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a16b06-e649-4c66-94e9-7cda4fb8c135-kube-api-access-sp9kl" (OuterVolumeSpecName: "kube-api-access-sp9kl") pod "69a16b06-e649-4c66-94e9-7cda4fb8c135" (UID: "69a16b06-e649-4c66-94e9-7cda4fb8c135"). InnerVolumeSpecName "kube-api-access-sp9kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.015150 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "69a16b06-e649-4c66-94e9-7cda4fb8c135" (UID: "69a16b06-e649-4c66-94e9-7cda4fb8c135"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.044912 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69a16b06-e649-4c66-94e9-7cda4fb8c135" (UID: "69a16b06-e649-4c66-94e9-7cda4fb8c135"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.085474 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-config-data" (OuterVolumeSpecName: "config-data") pod "69a16b06-e649-4c66-94e9-7cda4fb8c135" (UID: "69a16b06-e649-4c66-94e9-7cda4fb8c135"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.102388 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69a16b06-e649-4c66-94e9-7cda4fb8c135-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.102422 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.102432 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.102444 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp9kl\" (UniqueName: \"kubernetes.io/projected/69a16b06-e649-4c66-94e9-7cda4fb8c135-kube-api-access-sp9kl\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.102452 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.102461 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/69a16b06-e649-4c66-94e9-7cda4fb8c135-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.102469 4766 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/69a16b06-e649-4c66-94e9-7cda4fb8c135-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.620951 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.621600 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"69a16b06-e649-4c66-94e9-7cda4fb8c135","Type":"ContainerDied","Data":"cb42efd32d1dadebc5d5a2e6e75f09e166e0f167a399a64addec663fce33b19b"} Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.621682 4766 scope.go:117] "RemoveContainer" containerID="72397d872ce507fa92e345865588eec799e5986148feeabfabfecfa3853620de" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.622523 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.622912 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.628964 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.658345 4766 scope.go:117] "RemoveContainer" containerID="4c673cf07b5afbfa3104c07fa205845d5ee675d264763c597c7f6688aad6e342" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.666185 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.699250 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.706273 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:15:41 crc kubenswrapper[4766]: E1002 11:15:41.706782 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a16b06-e649-4c66-94e9-7cda4fb8c135" containerName="cinder-api" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.706799 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a16b06-e649-4c66-94e9-7cda4fb8c135" containerName="cinder-api" Oct 02 11:15:41 crc kubenswrapper[4766]: E1002 11:15:41.706823 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a16b06-e649-4c66-94e9-7cda4fb8c135" containerName="cinder-api-log" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.706831 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a16b06-e649-4c66-94e9-7cda4fb8c135" containerName="cinder-api-log" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.707057 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="69a16b06-e649-4c66-94e9-7cda4fb8c135" containerName="cinder-api" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.707079 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="69a16b06-e649-4c66-94e9-7cda4fb8c135" containerName="cinder-api-log" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.708257 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.715529 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.715851 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.716198 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.721726 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.815077 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.815121 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.815267 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.815425 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-logs\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.815474 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-scripts\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.815512 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-config-data\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.815567 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.815656 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-config-data-custom\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.815676 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bglnx\" (UniqueName: \"kubernetes.io/projected/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-kube-api-access-bglnx\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.896156 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ebdded-afd8-4298-bda8-cabdcad97e6a" path="/var/lib/kubelet/pods/01ebdded-afd8-4298-bda8-cabdcad97e6a/volumes" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.897122 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69a16b06-e649-4c66-94e9-7cda4fb8c135" path="/var/lib/kubelet/pods/69a16b06-e649-4c66-94e9-7cda4fb8c135/volumes" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.917499 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-scripts\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.917594 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-config-data\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.918538 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.918653 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-config-data-custom\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.918688 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bglnx\" (UniqueName: \"kubernetes.io/projected/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-kube-api-access-bglnx\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.918794 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.918819 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " 
pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.918855 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.918971 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-logs\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.919422 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-logs\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.919474 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.924033 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-config-data\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.924779 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.925715 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-config-data-custom\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.925929 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.930759 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-scripts\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.941037 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:41 crc kubenswrapper[4766]: I1002 11:15:41.945085 
4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bglnx\" (UniqueName: \"kubernetes.io/projected/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-kube-api-access-bglnx\") pod \"cinder-api-0\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " pod="openstack/cinder-api-0" Oct 02 11:15:42 crc kubenswrapper[4766]: I1002 11:15:42.046019 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:15:42 crc kubenswrapper[4766]: I1002 11:15:42.634820 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8344bd0-6491-40b6-8b75-dd9731647d98","Type":"ContainerStarted","Data":"5ced3759fae334d4331d6567928228faf6a043cff38faabfc0da650ef47dffe3"} Oct 02 11:15:42 crc kubenswrapper[4766]: I1002 11:15:42.934368 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 11:15:42 crc kubenswrapper[4766]: I1002 11:15:42.934833 4766 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:15:42 crc kubenswrapper[4766]: I1002 11:15:42.951843 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 11:15:43 crc kubenswrapper[4766]: I1002 11:15:43.901456 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 11:15:43 crc kubenswrapper[4766]: I1002 11:15:43.901575 4766 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 11:15:44 crc kubenswrapper[4766]: I1002 11:15:44.079976 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 11:15:47 crc kubenswrapper[4766]: I1002 11:15:47.688985 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:48 crc kubenswrapper[4766]: I1002 11:15:48.688875 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:15:48 crc kubenswrapper[4766]: W1002 11:15:48.691601 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4d0079a_03e3_4e5f_81a2_81f5bceb795c.slice/crio-60cef1d4fbe94825d19e68a8010ef54485ba97004cddf6cefcf03ef0eea9192b WatchSource:0}: Error finding container 60cef1d4fbe94825d19e68a8010ef54485ba97004cddf6cefcf03ef0eea9192b: Status 404 returned error can't find the container with id 60cef1d4fbe94825d19e68a8010ef54485ba97004cddf6cefcf03ef0eea9192b Oct 02 11:15:48 crc kubenswrapper[4766]: I1002 11:15:48.706836 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4gmwl" event={"ID":"c803d467-a739-40aa-9dc9-4f04e6e14923","Type":"ContainerStarted","Data":"224becfa27d031ef4fcc783c953605675730a85f887b4f3671b869a18bf84129"} Oct 02 11:15:48 crc kubenswrapper[4766]: I1002 11:15:48.708710 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8344bd0-6491-40b6-8b75-dd9731647d98","Type":"ContainerStarted","Data":"ade51fe7416751fd00ac047ff636d1f77c716623c6812f57bd54243cf0b6db08"} Oct 02 11:15:48 crc kubenswrapper[4766]: I1002 11:15:48.709831 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4d0079a-03e3-4e5f-81a2-81f5bceb795c","Type":"ContainerStarted","Data":"60cef1d4fbe94825d19e68a8010ef54485ba97004cddf6cefcf03ef0eea9192b"} Oct 02 11:15:49 
crc kubenswrapper[4766]: I1002 11:15:49.740946 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4d0079a-03e3-4e5f-81a2-81f5bceb795c","Type":"ContainerStarted","Data":"11c17e3e9a13cd8d34cb7338544e7b564639e6d7cc5158ff63319715efa5d8b5"} Oct 02 11:15:49 crc kubenswrapper[4766]: I1002 11:15:49.746336 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8344bd0-6491-40b6-8b75-dd9731647d98","Type":"ContainerStarted","Data":"bd8737ca3d6afc672d595d3d7bd6b9944524b8bb819b487dacfbede1cda7050c"} Oct 02 11:15:50 crc kubenswrapper[4766]: I1002 11:15:50.384828 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5d5d689cbb-b8wmb" Oct 02 11:15:50 crc kubenswrapper[4766]: I1002 11:15:50.409266 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-4gmwl" podStartSLOduration=3.118205848 podStartE2EDuration="13.409240961s" podCreationTimestamp="2025-10-02 11:15:37 +0000 UTC" firstStartedPulling="2025-10-02 11:15:37.92078639 +0000 UTC m=+1452.863657334" lastFinishedPulling="2025-10-02 11:15:48.211821483 +0000 UTC m=+1463.154692447" observedRunningTime="2025-10-02 11:15:48.728875709 +0000 UTC m=+1463.671746663" watchObservedRunningTime="2025-10-02 11:15:50.409240961 +0000 UTC m=+1465.352111895" Oct 02 11:15:50 crc kubenswrapper[4766]: I1002 11:15:50.755898 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8344bd0-6491-40b6-8b75-dd9731647d98","Type":"ContainerStarted","Data":"85684773ce1eaaecf59536989e4aaa96fb485a92fce32be8ff3b6fb2a151d041"} Oct 02 11:15:50 crc kubenswrapper[4766]: I1002 11:15:50.757286 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4d0079a-03e3-4e5f-81a2-81f5bceb795c","Type":"ContainerStarted","Data":"a12103d025e03ad958180a996161105d2c14a35f5ae1e7d1aabe48aab8387391"} Oct 02 11:15:50 crc kubenswrapper[4766]: I1002 11:15:50.757634 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 02 11:15:50 crc kubenswrapper[4766]: I1002 11:15:50.776910 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=9.776893951 podStartE2EDuration="9.776893951s" podCreationTimestamp="2025-10-02 11:15:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:15:50.773054098 +0000 UTC m=+1465.715925042" watchObservedRunningTime="2025-10-02 11:15:50.776893951 +0000 UTC m=+1465.719764885" Oct 02 11:15:51 crc kubenswrapper[4766]: I1002 11:15:51.771919 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8344bd0-6491-40b6-8b75-dd9731647d98","Type":"ContainerStarted","Data":"026d46202cc3a06d2243bfc6d5f34fbd4b484131defb7d82162a30dc6aad2d2f"} Oct 02 11:15:51 crc kubenswrapper[4766]: I1002 11:15:51.772345 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:15:51 crc kubenswrapper[4766]: I1002 11:15:51.772038 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerName="ceilometer-central-agent" containerID="cri-o://ade51fe7416751fd00ac047ff636d1f77c716623c6812f57bd54243cf0b6db08" gracePeriod=30 Oct 02 11:15:51 crc 
kubenswrapper[4766]: I1002 11:15:51.772183 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerName="ceilometer-notification-agent" containerID="cri-o://bd8737ca3d6afc672d595d3d7bd6b9944524b8bb819b487dacfbede1cda7050c" gracePeriod=30 Oct 02 11:15:51 crc kubenswrapper[4766]: I1002 11:15:51.772103 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerName="proxy-httpd" containerID="cri-o://026d46202cc3a06d2243bfc6d5f34fbd4b484131defb7d82162a30dc6aad2d2f" gracePeriod=30 Oct 02 11:15:51 crc kubenswrapper[4766]: I1002 11:15:51.772199 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerName="sg-core" containerID="cri-o://85684773ce1eaaecf59536989e4aaa96fb485a92fce32be8ff3b6fb2a151d041" gracePeriod=30 Oct 02 11:15:51 crc kubenswrapper[4766]: I1002 11:15:51.806087 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.147234524 podStartE2EDuration="11.806068213s" podCreationTimestamp="2025-10-02 11:15:40 +0000 UTC" firstStartedPulling="2025-10-02 11:15:41.658266017 +0000 UTC m=+1456.601136961" lastFinishedPulling="2025-10-02 11:15:51.317099716 +0000 UTC m=+1466.259970650" observedRunningTime="2025-10-02 11:15:51.801573159 +0000 UTC m=+1466.744444123" watchObservedRunningTime="2025-10-02 11:15:51.806068213 +0000 UTC m=+1466.748939157" Oct 02 11:15:52 crc kubenswrapper[4766]: I1002 11:15:52.781555 4766 generic.go:334] "Generic (PLEG): container finished" podID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerID="026d46202cc3a06d2243bfc6d5f34fbd4b484131defb7d82162a30dc6aad2d2f" exitCode=0 Oct 02 11:15:52 crc kubenswrapper[4766]: I1002 11:15:52.781596 4766 generic.go:334] "Generic (PLEG): container finished" podID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerID="85684773ce1eaaecf59536989e4aaa96fb485a92fce32be8ff3b6fb2a151d041" exitCode=2 Oct 02 11:15:52 crc kubenswrapper[4766]: I1002 11:15:52.781607 4766 generic.go:334] "Generic (PLEG): container finished" podID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerID="bd8737ca3d6afc672d595d3d7bd6b9944524b8bb819b487dacfbede1cda7050c" exitCode=0 Oct 02 11:15:52 crc kubenswrapper[4766]: I1002 11:15:52.781627 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8344bd0-6491-40b6-8b75-dd9731647d98","Type":"ContainerDied","Data":"026d46202cc3a06d2243bfc6d5f34fbd4b484131defb7d82162a30dc6aad2d2f"} Oct 02 11:15:52 crc kubenswrapper[4766]: I1002 11:15:52.781651 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8344bd0-6491-40b6-8b75-dd9731647d98","Type":"ContainerDied","Data":"85684773ce1eaaecf59536989e4aaa96fb485a92fce32be8ff3b6fb2a151d041"} Oct 02 11:15:52 crc kubenswrapper[4766]: I1002 11:15:52.781660 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8344bd0-6491-40b6-8b75-dd9731647d98","Type":"ContainerDied","Data":"bd8737ca3d6afc672d595d3d7bd6b9944524b8bb819b487dacfbede1cda7050c"} Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.432579 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.432910 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.746139 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.801530 4766 generic.go:334] "Generic (PLEG): container finished" podID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerID="ade51fe7416751fd00ac047ff636d1f77c716623c6812f57bd54243cf0b6db08" exitCode=0 Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.801589 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8344bd0-6491-40b6-8b75-dd9731647d98","Type":"ContainerDied","Data":"ade51fe7416751fd00ac047ff636d1f77c716623c6812f57bd54243cf0b6db08"} Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.801622 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.801635 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c8344bd0-6491-40b6-8b75-dd9731647d98","Type":"ContainerDied","Data":"5ced3759fae334d4331d6567928228faf6a043cff38faabfc0da650ef47dffe3"} Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.801664 4766 scope.go:117] "RemoveContainer" containerID="026d46202cc3a06d2243bfc6d5f34fbd4b484131defb7d82162a30dc6aad2d2f" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.820588 4766 scope.go:117] "RemoveContainer" containerID="85684773ce1eaaecf59536989e4aaa96fb485a92fce32be8ff3b6fb2a151d041" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.839729 4766 scope.go:117] "RemoveContainer" containerID="bd8737ca3d6afc672d595d3d7bd6b9944524b8bb819b487dacfbede1cda7050c" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.864054 4766 scope.go:117] "RemoveContainer" containerID="ade51fe7416751fd00ac047ff636d1f77c716623c6812f57bd54243cf0b6db08" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.879161 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjg2p\" (UniqueName: \"kubernetes.io/projected/c8344bd0-6491-40b6-8b75-dd9731647d98-kube-api-access-pjg2p\") pod \"c8344bd0-6491-40b6-8b75-dd9731647d98\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.879242 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-scripts\") pod \"c8344bd0-6491-40b6-8b75-dd9731647d98\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.879267 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-combined-ca-bundle\") pod \"c8344bd0-6491-40b6-8b75-dd9731647d98\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.879289 4766 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-config-data\") pod \"c8344bd0-6491-40b6-8b75-dd9731647d98\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.879323 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8344bd0-6491-40b6-8b75-dd9731647d98-run-httpd\") pod \"c8344bd0-6491-40b6-8b75-dd9731647d98\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.879386 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-sg-core-conf-yaml\") pod \"c8344bd0-6491-40b6-8b75-dd9731647d98\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.879429 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8344bd0-6491-40b6-8b75-dd9731647d98-log-httpd\") pod \"c8344bd0-6491-40b6-8b75-dd9731647d98\" (UID: \"c8344bd0-6491-40b6-8b75-dd9731647d98\") " Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.879818 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8344bd0-6491-40b6-8b75-dd9731647d98-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c8344bd0-6491-40b6-8b75-dd9731647d98" (UID: "c8344bd0-6491-40b6-8b75-dd9731647d98"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.880040 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8344bd0-6491-40b6-8b75-dd9731647d98-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.880084 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8344bd0-6491-40b6-8b75-dd9731647d98-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c8344bd0-6491-40b6-8b75-dd9731647d98" (UID: "c8344bd0-6491-40b6-8b75-dd9731647d98"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.884166 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-scripts" (OuterVolumeSpecName: "scripts") pod "c8344bd0-6491-40b6-8b75-dd9731647d98" (UID: "c8344bd0-6491-40b6-8b75-dd9731647d98"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.885863 4766 scope.go:117] "RemoveContainer" containerID="026d46202cc3a06d2243bfc6d5f34fbd4b484131defb7d82162a30dc6aad2d2f" Oct 02 11:15:54 crc kubenswrapper[4766]: E1002 11:15:54.886297 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"026d46202cc3a06d2243bfc6d5f34fbd4b484131defb7d82162a30dc6aad2d2f\": container with ID starting with 026d46202cc3a06d2243bfc6d5f34fbd4b484131defb7d82162a30dc6aad2d2f not found: ID does not exist" containerID="026d46202cc3a06d2243bfc6d5f34fbd4b484131defb7d82162a30dc6aad2d2f" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.886336 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"026d46202cc3a06d2243bfc6d5f34fbd4b484131defb7d82162a30dc6aad2d2f"} err="failed to get container status \"026d46202cc3a06d2243bfc6d5f34fbd4b484131defb7d82162a30dc6aad2d2f\": rpc error: code = NotFound desc = could not find container \"026d46202cc3a06d2243bfc6d5f34fbd4b484131defb7d82162a30dc6aad2d2f\": container with ID starting with 026d46202cc3a06d2243bfc6d5f34fbd4b484131defb7d82162a30dc6aad2d2f not found: ID does not exist" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.886364 4766 scope.go:117] "RemoveContainer" containerID="85684773ce1eaaecf59536989e4aaa96fb485a92fce32be8ff3b6fb2a151d041" Oct 02 11:15:54 crc kubenswrapper[4766]: E1002 11:15:54.887086 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85684773ce1eaaecf59536989e4aaa96fb485a92fce32be8ff3b6fb2a151d041\": container with ID starting with 85684773ce1eaaecf59536989e4aaa96fb485a92fce32be8ff3b6fb2a151d041 not found: ID does not exist" containerID="85684773ce1eaaecf59536989e4aaa96fb485a92fce32be8ff3b6fb2a151d041" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.887128 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85684773ce1eaaecf59536989e4aaa96fb485a92fce32be8ff3b6fb2a151d041"} err="failed to get container status \"85684773ce1eaaecf59536989e4aaa96fb485a92fce32be8ff3b6fb2a151d041\": rpc error: code = NotFound desc = could not find container \"85684773ce1eaaecf59536989e4aaa96fb485a92fce32be8ff3b6fb2a151d041\": container with ID starting with 85684773ce1eaaecf59536989e4aaa96fb485a92fce32be8ff3b6fb2a151d041 not found: ID does not exist" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.887164 4766 scope.go:117] "RemoveContainer" containerID="bd8737ca3d6afc672d595d3d7bd6b9944524b8bb819b487dacfbede1cda7050c" Oct 02 11:15:54 crc kubenswrapper[4766]: E1002 11:15:54.887471 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8737ca3d6afc672d595d3d7bd6b9944524b8bb819b487dacfbede1cda7050c\": container with ID starting with bd8737ca3d6afc672d595d3d7bd6b9944524b8bb819b487dacfbede1cda7050c not found: ID does not exist" containerID="bd8737ca3d6afc672d595d3d7bd6b9944524b8bb819b487dacfbede1cda7050c" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.887513 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8737ca3d6afc672d595d3d7bd6b9944524b8bb819b487dacfbede1cda7050c"} err="failed to get container status \"bd8737ca3d6afc672d595d3d7bd6b9944524b8bb819b487dacfbede1cda7050c\": rpc error: code = NotFound desc = could not 
find container \"bd8737ca3d6afc672d595d3d7bd6b9944524b8bb819b487dacfbede1cda7050c\": container with ID starting with bd8737ca3d6afc672d595d3d7bd6b9944524b8bb819b487dacfbede1cda7050c not found: ID does not exist" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.887528 4766 scope.go:117] "RemoveContainer" containerID="ade51fe7416751fd00ac047ff636d1f77c716623c6812f57bd54243cf0b6db08" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.888466 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8344bd0-6491-40b6-8b75-dd9731647d98-kube-api-access-pjg2p" (OuterVolumeSpecName: "kube-api-access-pjg2p") pod "c8344bd0-6491-40b6-8b75-dd9731647d98" (UID: "c8344bd0-6491-40b6-8b75-dd9731647d98"). InnerVolumeSpecName "kube-api-access-pjg2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:15:54 crc kubenswrapper[4766]: E1002 11:15:54.888621 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ade51fe7416751fd00ac047ff636d1f77c716623c6812f57bd54243cf0b6db08\": container with ID starting with ade51fe7416751fd00ac047ff636d1f77c716623c6812f57bd54243cf0b6db08 not found: ID does not exist" containerID="ade51fe7416751fd00ac047ff636d1f77c716623c6812f57bd54243cf0b6db08" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.888651 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ade51fe7416751fd00ac047ff636d1f77c716623c6812f57bd54243cf0b6db08"} err="failed to get container status \"ade51fe7416751fd00ac047ff636d1f77c716623c6812f57bd54243cf0b6db08\": rpc error: code = NotFound desc = could not find container \"ade51fe7416751fd00ac047ff636d1f77c716623c6812f57bd54243cf0b6db08\": container with ID starting with ade51fe7416751fd00ac047ff636d1f77c716623c6812f57bd54243cf0b6db08 not found: ID does not exist" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.906737 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c8344bd0-6491-40b6-8b75-dd9731647d98" (UID: "c8344bd0-6491-40b6-8b75-dd9731647d98"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.950005 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8344bd0-6491-40b6-8b75-dd9731647d98" (UID: "c8344bd0-6491-40b6-8b75-dd9731647d98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.972530 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-config-data" (OuterVolumeSpecName: "config-data") pod "c8344bd0-6491-40b6-8b75-dd9731647d98" (UID: "c8344bd0-6491-40b6-8b75-dd9731647d98"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.981770 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjg2p\" (UniqueName: \"kubernetes.io/projected/c8344bd0-6491-40b6-8b75-dd9731647d98-kube-api-access-pjg2p\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.981801 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.981811 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.981819 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.981828 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c8344bd0-6491-40b6-8b75-dd9731647d98-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:54 crc kubenswrapper[4766]: I1002 11:15:54.981839 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c8344bd0-6491-40b6-8b75-dd9731647d98-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.050327 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.102228 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d5d689cbb-b8wmb"] Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.102460 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d5d689cbb-b8wmb" podUID="5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf" containerName="neutron-api" containerID="cri-o://a5d6b8038eeebfa5add3d86a4032fc2e6b96dcf152395518f5b6721b67d760e9" gracePeriod=30 Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.102584 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5d5d689cbb-b8wmb" podUID="5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf" containerName="neutron-httpd" containerID="cri-o://ddf1db2050ce314b5c6aa724ee6afdaca35e577bdbc4f9b2b93cd43a6a850308" gracePeriod=30 Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.147659 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.156009 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.185034 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:55 crc kubenswrapper[4766]: E1002 11:15:55.189162 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerName="ceilometer-notification-agent" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.189467 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" 
containerName="ceilometer-notification-agent" Oct 02 11:15:55 crc kubenswrapper[4766]: E1002 11:15:55.189555 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerName="proxy-httpd" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.189624 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerName="proxy-httpd" Oct 02 11:15:55 crc kubenswrapper[4766]: E1002 11:15:55.189682 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerName="ceilometer-central-agent" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.189736 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerName="ceilometer-central-agent" Oct 02 11:15:55 crc kubenswrapper[4766]: E1002 11:15:55.189798 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerName="sg-core" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.189853 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerName="sg-core" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.190092 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerName="ceilometer-central-agent" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.190168 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerName="sg-core" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.190233 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerName="ceilometer-notification-agent" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.190296 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" containerName="proxy-httpd" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.192060 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.195459 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.197946 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.198175 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.292599 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43489c17-898c-487c-b134-a6aa6c441299-log-httpd\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.293010 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-scripts\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.293066 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.293096 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkn4d\" (UniqueName: \"kubernetes.io/projected/43489c17-898c-487c-b134-a6aa6c441299-kube-api-access-pkn4d\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.293134 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43489c17-898c-487c-b134-a6aa6c441299-run-httpd\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.293207 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-config-data\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.293283 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.394611 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43489c17-898c-487c-b134-a6aa6c441299-log-httpd\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.394679 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-scripts\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.394737 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.394767 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkn4d\" (UniqueName: \"kubernetes.io/projected/43489c17-898c-487c-b134-a6aa6c441299-kube-api-access-pkn4d\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.394802 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43489c17-898c-487c-b134-a6aa6c441299-run-httpd\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.394863 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-config-data\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.394913 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.395070 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43489c17-898c-487c-b134-a6aa6c441299-log-httpd\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.395297 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43489c17-898c-487c-b134-a6aa6c441299-run-httpd\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.400783 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.401427 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-scripts\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.401594 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-config-data\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.411962 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.419209 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkn4d\" (UniqueName: \"kubernetes.io/projected/43489c17-898c-487c-b134-a6aa6c441299-kube-api-access-pkn4d\") pod \"ceilometer-0\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") " pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.470731 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fqng4"] Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.472647 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqng4" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.478226 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fqng4"] Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.530010 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.601468 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe991bdf-f974-4959-bb0b-e10001c1c380-catalog-content\") pod \"redhat-operators-fqng4\" (UID: \"fe991bdf-f974-4959-bb0b-e10001c1c380\") " pod="openshift-marketplace/redhat-operators-fqng4" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.601530 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn2s8\" (UniqueName: \"kubernetes.io/projected/fe991bdf-f974-4959-bb0b-e10001c1c380-kube-api-access-qn2s8\") pod \"redhat-operators-fqng4\" (UID: \"fe991bdf-f974-4959-bb0b-e10001c1c380\") " pod="openshift-marketplace/redhat-operators-fqng4" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.601554 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe991bdf-f974-4959-bb0b-e10001c1c380-utilities\") pod \"redhat-operators-fqng4\" (UID: \"fe991bdf-f974-4959-bb0b-e10001c1c380\") " pod="openshift-marketplace/redhat-operators-fqng4" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.702986 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe991bdf-f974-4959-bb0b-e10001c1c380-catalog-content\") pod \"redhat-operators-fqng4\" (UID: \"fe991bdf-f974-4959-bb0b-e10001c1c380\") " pod="openshift-marketplace/redhat-operators-fqng4" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.703223 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn2s8\" (UniqueName: \"kubernetes.io/projected/fe991bdf-f974-4959-bb0b-e10001c1c380-kube-api-access-qn2s8\") pod \"redhat-operators-fqng4\" (UID: 
\"fe991bdf-f974-4959-bb0b-e10001c1c380\") " pod="openshift-marketplace/redhat-operators-fqng4" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.703249 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe991bdf-f974-4959-bb0b-e10001c1c380-utilities\") pod \"redhat-operators-fqng4\" (UID: \"fe991bdf-f974-4959-bb0b-e10001c1c380\") " pod="openshift-marketplace/redhat-operators-fqng4" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.703784 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe991bdf-f974-4959-bb0b-e10001c1c380-utilities\") pod \"redhat-operators-fqng4\" (UID: \"fe991bdf-f974-4959-bb0b-e10001c1c380\") " pod="openshift-marketplace/redhat-operators-fqng4" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.703994 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe991bdf-f974-4959-bb0b-e10001c1c380-catalog-content\") pod \"redhat-operators-fqng4\" (UID: \"fe991bdf-f974-4959-bb0b-e10001c1c380\") " pod="openshift-marketplace/redhat-operators-fqng4" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.726585 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn2s8\" (UniqueName: \"kubernetes.io/projected/fe991bdf-f974-4959-bb0b-e10001c1c380-kube-api-access-qn2s8\") pod \"redhat-operators-fqng4\" (UID: \"fe991bdf-f974-4959-bb0b-e10001c1c380\") " pod="openshift-marketplace/redhat-operators-fqng4" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.817659 4766 generic.go:334] "Generic (PLEG): container finished" podID="5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf" containerID="ddf1db2050ce314b5c6aa724ee6afdaca35e577bdbc4f9b2b93cd43a6a850308" exitCode=0 Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.817706 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d5d689cbb-b8wmb" event={"ID":"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf","Type":"ContainerDied","Data":"ddf1db2050ce314b5c6aa724ee6afdaca35e577bdbc4f9b2b93cd43a6a850308"} Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.854678 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fqng4" Oct 02 11:15:55 crc kubenswrapper[4766]: I1002 11:15:55.896077 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8344bd0-6491-40b6-8b75-dd9731647d98" path="/var/lib/kubelet/pods/c8344bd0-6491-40b6-8b75-dd9731647d98/volumes" Oct 02 11:15:56 crc kubenswrapper[4766]: I1002 11:15:56.034823 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:15:56 crc kubenswrapper[4766]: I1002 11:15:56.331596 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fqng4"] Oct 02 11:15:56 crc kubenswrapper[4766]: I1002 11:15:56.827010 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43489c17-898c-487c-b134-a6aa6c441299","Type":"ContainerStarted","Data":"8ae9682cfc74325ae47309c018f54eb483a1d77f520ae5c2855f25d5b2ca4e78"} Oct 02 11:15:56 crc kubenswrapper[4766]: I1002 11:15:56.828472 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqng4" event={"ID":"fe991bdf-f974-4959-bb0b-e10001c1c380","Type":"ContainerStarted","Data":"398fd8a860f21b451c92daf46b1c80ca5fc15cec28904cf2021f7c2098353beb"} Oct 02 11:15:57 crc kubenswrapper[4766]: I1002 11:15:57.840463 4766 generic.go:334] "Generic (PLEG): container finished" podID="fe991bdf-f974-4959-bb0b-e10001c1c380" containerID="a1563d2c6cc67fbfc0c650c00907947ca89e6f47afa6a831431516b6cbf09748" exitCode=0 Oct 02 11:15:57 crc kubenswrapper[4766]: I1002 11:15:57.840802 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqng4" event={"ID":"fe991bdf-f974-4959-bb0b-e10001c1c380","Type":"ContainerDied","Data":"a1563d2c6cc67fbfc0c650c00907947ca89e6f47afa6a831431516b6cbf09748"} Oct 02 11:15:58 crc kubenswrapper[4766]: I1002 11:15:58.878077 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43489c17-898c-487c-b134-a6aa6c441299","Type":"ContainerStarted","Data":"2f193d89bd1c1213ea9f2fdb4398b39637ce5d08394de4e011e48dfc01f7cb72"} Oct 02 11:15:59 crc kubenswrapper[4766]: I1002 11:15:59.002963 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 02 11:16:00 crc kubenswrapper[4766]: I1002 11:16:00.906902 4766 generic.go:334] "Generic (PLEG): container finished" podID="5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf" containerID="a5d6b8038eeebfa5add3d86a4032fc2e6b96dcf152395518f5b6721b67d760e9" exitCode=0 Oct 02 11:16:00 crc kubenswrapper[4766]: I1002 11:16:00.907022 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d5d689cbb-b8wmb" event={"ID":"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf","Type":"ContainerDied","Data":"a5d6b8038eeebfa5add3d86a4032fc2e6b96dcf152395518f5b6721b67d760e9"} Oct 02 11:16:01 crc kubenswrapper[4766]: I1002 11:16:01.983063 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5d5d689cbb-b8wmb"
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.020451 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-combined-ca-bundle\") pod \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") "
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.020485 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-ovndb-tls-certs\") pod \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") "
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.020552 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47p5g\" (UniqueName: \"kubernetes.io/projected/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-kube-api-access-47p5g\") pod \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") "
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.020649 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-httpd-config\") pod \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") "
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.020669 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-config\") pod \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\" (UID: \"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf\") "
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.027180 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf" (UID: "5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.027569 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-kube-api-access-47p5g" (OuterVolumeSpecName: "kube-api-access-47p5g") pod "5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf" (UID: "5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf"). InnerVolumeSpecName "kube-api-access-47p5g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.079746 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-config" (OuterVolumeSpecName: "config") pod "5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf" (UID: "5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.080981 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf" (UID: "5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.102199 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf" (UID: "5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.122707 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.122752 4766 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.122785 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47p5g\" (UniqueName: \"kubernetes.io/projected/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-kube-api-access-47p5g\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.122801 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-httpd-config\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.122816 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf-config\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.932199 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5d5d689cbb-b8wmb" event={"ID":"5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf","Type":"ContainerDied","Data":"2b059259836e9be9e71ca1179c6a25d181ebf547605caab27caadf6723718f11"}
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.932228 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5d5d689cbb-b8wmb"
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.932282 4766 scope.go:117] "RemoveContainer" containerID="ddf1db2050ce314b5c6aa724ee6afdaca35e577bdbc4f9b2b93cd43a6a850308"
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.971130 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5d5d689cbb-b8wmb"]
Oct 02 11:16:02 crc kubenswrapper[4766]: I1002 11:16:02.978130 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5d5d689cbb-b8wmb"]
Oct 02 11:16:03 crc kubenswrapper[4766]: I1002 11:16:03.892200 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf" path="/var/lib/kubelet/pods/5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf/volumes"
Oct 02 11:16:04 crc kubenswrapper[4766]: I1002 11:16:04.538135 4766 scope.go:117] "RemoveContainer" containerID="a5d6b8038eeebfa5add3d86a4032fc2e6b96dcf152395518f5b6721b67d760e9"
Oct 02 11:16:07 crc kubenswrapper[4766]: I1002 11:16:07.052766 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="d4d0079a-03e3-4e5f-81a2-81f5bceb795c" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.177:8776/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 02 11:16:08 crc kubenswrapper[4766]: I1002 11:16:08.988631 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43489c17-898c-487c-b134-a6aa6c441299","Type":"ContainerStarted","Data":"d8860d058d4acc0515a92572caa65aaaa3305c0653b58e0e58649e153729e19d"}
Oct 02 11:16:08 crc kubenswrapper[4766]: I1002 11:16:08.991616 4766 generic.go:334] "Generic (PLEG): container finished" podID="fe991bdf-f974-4959-bb0b-e10001c1c380" containerID="72843f220341ecac2f876b6c9522e5107012e11c3baa2c75917d081cea48e947" exitCode=0
Oct 02 11:16:08 crc kubenswrapper[4766]: I1002 11:16:08.991659 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqng4" event={"ID":"fe991bdf-f974-4959-bb0b-e10001c1c380","Type":"ContainerDied","Data":"72843f220341ecac2f876b6c9522e5107012e11c3baa2c75917d081cea48e947"}
Oct 02 11:16:09 crc kubenswrapper[4766]: I1002 11:16:09.873316 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-shj49"]
Oct 02 11:16:09 crc kubenswrapper[4766]: E1002 11:16:09.874603 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf" containerName="neutron-httpd"
Oct 02 11:16:09 crc kubenswrapper[4766]: I1002 11:16:09.874621 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf" containerName="neutron-httpd"
Oct 02 11:16:09 crc kubenswrapper[4766]: E1002 11:16:09.874638 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf" containerName="neutron-api"
Oct 02 11:16:09 crc kubenswrapper[4766]: I1002 11:16:09.874645 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf" containerName="neutron-api"
Oct 02 11:16:09 crc kubenswrapper[4766]: I1002 11:16:09.874834 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf" containerName="neutron-api"
Oct 02 11:16:09 crc kubenswrapper[4766]: I1002 11:16:09.874863 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ab8b4f9-0502-4896-aee1-f5f96bd8d1cf" containerName="neutron-httpd"
Oct 02 11:16:09 crc kubenswrapper[4766]: I1002 11:16:09.876364 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:09 crc kubenswrapper[4766]: I1002 11:16:09.910496 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-shj49"]
Oct 02 11:16:09 crc kubenswrapper[4766]: I1002 11:16:09.977123 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-utilities\") pod \"redhat-marketplace-shj49\" (UID: \"005c0030-75ad-49fa-9f5c-60ebe0fbedb0\") " pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:09 crc kubenswrapper[4766]: I1002 11:16:09.977167 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzzgm\" (UniqueName: \"kubernetes.io/projected/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-kube-api-access-vzzgm\") pod \"redhat-marketplace-shj49\" (UID: \"005c0030-75ad-49fa-9f5c-60ebe0fbedb0\") " pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:09 crc kubenswrapper[4766]: I1002 11:16:09.977242 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-catalog-content\") pod \"redhat-marketplace-shj49\" (UID: \"005c0030-75ad-49fa-9f5c-60ebe0fbedb0\") " pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:10 crc kubenswrapper[4766]: I1002 11:16:10.002041 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43489c17-898c-487c-b134-a6aa6c441299","Type":"ContainerStarted","Data":"2eecbaec9a47b5b971b34a006e92793fa386e41fcaf08c4b10f863cd047efa92"}
Oct 02 11:16:10 crc kubenswrapper[4766]: I1002 11:16:10.078634 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-utilities\") pod \"redhat-marketplace-shj49\" (UID: \"005c0030-75ad-49fa-9f5c-60ebe0fbedb0\") " pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:10 crc kubenswrapper[4766]: I1002 11:16:10.078687 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzzgm\" (UniqueName: \"kubernetes.io/projected/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-kube-api-access-vzzgm\") pod \"redhat-marketplace-shj49\" (UID: \"005c0030-75ad-49fa-9f5c-60ebe0fbedb0\") " pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:10 crc kubenswrapper[4766]: I1002 11:16:10.078731 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-catalog-content\") pod \"redhat-marketplace-shj49\" (UID: \"005c0030-75ad-49fa-9f5c-60ebe0fbedb0\") " pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:10 crc kubenswrapper[4766]: I1002 11:16:10.079203 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-utilities\") pod \"redhat-marketplace-shj49\" (UID: \"005c0030-75ad-49fa-9f5c-60ebe0fbedb0\") " pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:10 crc kubenswrapper[4766]: I1002 11:16:10.079568 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-catalog-content\") pod \"redhat-marketplace-shj49\" (UID: \"005c0030-75ad-49fa-9f5c-60ebe0fbedb0\") " pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:10 crc kubenswrapper[4766]: I1002 11:16:10.095820 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzzgm\" (UniqueName: \"kubernetes.io/projected/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-kube-api-access-vzzgm\") pod \"redhat-marketplace-shj49\" (UID: \"005c0030-75ad-49fa-9f5c-60ebe0fbedb0\") " pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:10 crc kubenswrapper[4766]: I1002 11:16:10.202641 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:10 crc kubenswrapper[4766]: I1002 11:16:10.761952 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-shj49"]
Oct 02 11:16:11 crc kubenswrapper[4766]: I1002 11:16:11.011325 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shj49" event={"ID":"005c0030-75ad-49fa-9f5c-60ebe0fbedb0","Type":"ContainerStarted","Data":"62a56633d6c10091654a59989af81cbd3ded3e854233b66968d058d91c706d4a"}
Oct 02 11:16:15 crc kubenswrapper[4766]: I1002 11:16:15.050703 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqng4" event={"ID":"fe991bdf-f974-4959-bb0b-e10001c1c380","Type":"ContainerStarted","Data":"f9ae8d1bd91af50c0e00f98e594ee4555f7951d9a6144bf49b2eb1154160076b"}
Oct 02 11:16:15 crc kubenswrapper[4766]: I1002 11:16:15.052361 4766 generic.go:334] "Generic (PLEG): container finished" podID="005c0030-75ad-49fa-9f5c-60ebe0fbedb0" containerID="8be986fa0983b8d276c352ce07dd8e44159451272e4a4b27e56b8800c495b4c6" exitCode=0
Oct 02 11:16:15 crc kubenswrapper[4766]: I1002 11:16:15.052410 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shj49" event={"ID":"005c0030-75ad-49fa-9f5c-60ebe0fbedb0","Type":"ContainerDied","Data":"8be986fa0983b8d276c352ce07dd8e44159451272e4a4b27e56b8800c495b4c6"}
Oct 02 11:16:15 crc kubenswrapper[4766]: I1002 11:16:15.055718 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43489c17-898c-487c-b134-a6aa6c441299","Type":"ContainerStarted","Data":"6822bd7f7084b431b4a8b5176ad9646a3ddb5dbd2ec7441b1475cd57cdb05e43"}
Oct 02 11:16:15 crc kubenswrapper[4766]: I1002 11:16:15.055861 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 02 11:16:15 crc kubenswrapper[4766]: I1002 11:16:15.083192 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fqng4" podStartSLOduration=4.227141245 podStartE2EDuration="20.08317029s" podCreationTimestamp="2025-10-02 11:15:55 +0000 UTC" firstStartedPulling="2025-10-02 11:15:58.044635843 +0000 UTC m=+1472.987506787" lastFinishedPulling="2025-10-02 11:16:13.900664898 +0000 UTC m=+1488.843535832" observedRunningTime="2025-10-02 11:16:15.073325595 +0000 UTC m=+1490.016196539" watchObservedRunningTime="2025-10-02 11:16:15.08317029 +0000 UTC m=+1490.026041244"
Oct 02 11:16:15 crc kubenswrapper[4766]: I1002 11:16:15.107916 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.921676933 podStartE2EDuration="20.10789521s" podCreationTimestamp="2025-10-02 11:15:55 +0000 UTC" firstStartedPulling="2025-10-02 11:15:56.053094254 +0000 UTC m=+1470.995965198" lastFinishedPulling="2025-10-02 11:16:14.239312531 +0000 UTC m=+1489.182183475" observedRunningTime="2025-10-02 11:16:15.102948283 +0000 UTC m=+1490.045819257" watchObservedRunningTime="2025-10-02 11:16:15.10789521 +0000 UTC m=+1490.050766164"
Oct 02 11:16:15 crc kubenswrapper[4766]: I1002 11:16:15.855791 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fqng4"
Oct 02 11:16:15 crc kubenswrapper[4766]: I1002 11:16:15.856115 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fqng4"
Oct 02 11:16:16 crc kubenswrapper[4766]: I1002 11:16:16.900840 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fqng4" podUID="fe991bdf-f974-4959-bb0b-e10001c1c380" containerName="registry-server" probeResult="failure" output=<
Oct 02 11:16:16 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s
Oct 02 11:16:16 crc kubenswrapper[4766]: >
Oct 02 11:16:19 crc kubenswrapper[4766]: I1002 11:16:19.091706 4766 generic.go:334] "Generic (PLEG): container finished" podID="005c0030-75ad-49fa-9f5c-60ebe0fbedb0" containerID="f5e5ec10695ede5c30e68d72187df8e8adfdef32a82757167024affaa886e474" exitCode=0
Oct 02 11:16:19 crc kubenswrapper[4766]: I1002 11:16:19.092161 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shj49" event={"ID":"005c0030-75ad-49fa-9f5c-60ebe0fbedb0","Type":"ContainerDied","Data":"f5e5ec10695ede5c30e68d72187df8e8adfdef32a82757167024affaa886e474"}
Oct 02 11:16:20 crc kubenswrapper[4766]: I1002 11:16:20.104174 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shj49" event={"ID":"005c0030-75ad-49fa-9f5c-60ebe0fbedb0","Type":"ContainerStarted","Data":"1f3b8c3333dc0379b7a0426a6b14572fef9cb6145a584e2efe916428d57d733e"}
Oct 02 11:16:20 crc kubenswrapper[4766]: I1002 11:16:20.129114 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-shj49" podStartSLOduration=6.438647864 podStartE2EDuration="11.129097845s" podCreationTimestamp="2025-10-02 11:16:09 +0000 UTC" firstStartedPulling="2025-10-02 11:16:15.05438598 +0000 UTC m=+1489.997256954" lastFinishedPulling="2025-10-02 11:16:19.744835971 +0000 UTC m=+1494.687706935" observedRunningTime="2025-10-02 11:16:20.12174083 +0000 UTC m=+1495.064611774" watchObservedRunningTime="2025-10-02 11:16:20.129097845 +0000 UTC m=+1495.071968789"
Oct 02 11:16:20 crc kubenswrapper[4766]: I1002 11:16:20.203498 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:20 crc kubenswrapper[4766]: I1002 11:16:20.203669 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:21 crc kubenswrapper[4766]: I1002 11:16:21.249890 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-shj49" podUID="005c0030-75ad-49fa-9f5c-60ebe0fbedb0" containerName="registry-server" probeResult="failure" output=<
Oct 02 11:16:21 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s
Oct 02 11:16:21 crc kubenswrapper[4766]: >
Oct 02 11:16:24 crc kubenswrapper[4766]: I1002 11:16:24.432472 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:16:24 crc kubenswrapper[4766]: I1002 11:16:24.433255 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 11:16:25 crc kubenswrapper[4766]: I1002 11:16:25.537730 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 02 11:16:26 crc kubenswrapper[4766]: I1002 11:16:26.907361 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fqng4" podUID="fe991bdf-f974-4959-bb0b-e10001c1c380" containerName="registry-server" probeResult="failure" output=<
Oct 02 11:16:26 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s
Oct 02 11:16:26 crc kubenswrapper[4766]: >
Oct 02 11:16:29 crc kubenswrapper[4766]: I1002 11:16:29.533135 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 02 11:16:29 crc kubenswrapper[4766]: I1002 11:16:29.533920 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="e914485f-05fc-4f85-b902-2e43bcfc0bb5" containerName="kube-state-metrics" containerID="cri-o://91eabd7220e84be6dc59a656dd4a2e76c92950e906521ca1ddc8c6e2db98afa7" gracePeriod=30
Oct 02 11:16:29 crc kubenswrapper[4766]: I1002 11:16:29.971419 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.085538 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dqbd\" (UniqueName: \"kubernetes.io/projected/e914485f-05fc-4f85-b902-2e43bcfc0bb5-kube-api-access-2dqbd\") pod \"e914485f-05fc-4f85-b902-2e43bcfc0bb5\" (UID: \"e914485f-05fc-4f85-b902-2e43bcfc0bb5\") "
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.091065 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e914485f-05fc-4f85-b902-2e43bcfc0bb5-kube-api-access-2dqbd" (OuterVolumeSpecName: "kube-api-access-2dqbd") pod "e914485f-05fc-4f85-b902-2e43bcfc0bb5" (UID: "e914485f-05fc-4f85-b902-2e43bcfc0bb5"). InnerVolumeSpecName "kube-api-access-2dqbd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.188343 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dqbd\" (UniqueName: \"kubernetes.io/projected/e914485f-05fc-4f85-b902-2e43bcfc0bb5-kube-api-access-2dqbd\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.205121 4766 generic.go:334] "Generic (PLEG): container finished" podID="c803d467-a739-40aa-9dc9-4f04e6e14923" containerID="224becfa27d031ef4fcc783c953605675730a85f887b4f3671b869a18bf84129" exitCode=0
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.205197 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4gmwl" event={"ID":"c803d467-a739-40aa-9dc9-4f04e6e14923","Type":"ContainerDied","Data":"224becfa27d031ef4fcc783c953605675730a85f887b4f3671b869a18bf84129"}
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.207287 4766 generic.go:334] "Generic (PLEG): container finished" podID="e914485f-05fc-4f85-b902-2e43bcfc0bb5" containerID="91eabd7220e84be6dc59a656dd4a2e76c92950e906521ca1ddc8c6e2db98afa7" exitCode=2
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.207329 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e914485f-05fc-4f85-b902-2e43bcfc0bb5","Type":"ContainerDied","Data":"91eabd7220e84be6dc59a656dd4a2e76c92950e906521ca1ddc8c6e2db98afa7"}
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.207372 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e914485f-05fc-4f85-b902-2e43bcfc0bb5","Type":"ContainerDied","Data":"b5eccfb23c6ba881ba3874dc0e3a306b2ab69db403f398a47513cd2685f392a5"}
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.207338 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.207423 4766 scope.go:117] "RemoveContainer" containerID="91eabd7220e84be6dc59a656dd4a2e76c92950e906521ca1ddc8c6e2db98afa7"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.236647 4766 scope.go:117] "RemoveContainer" containerID="91eabd7220e84be6dc59a656dd4a2e76c92950e906521ca1ddc8c6e2db98afa7"
Oct 02 11:16:30 crc kubenswrapper[4766]: E1002 11:16:30.237225 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91eabd7220e84be6dc59a656dd4a2e76c92950e906521ca1ddc8c6e2db98afa7\": container with ID starting with 91eabd7220e84be6dc59a656dd4a2e76c92950e906521ca1ddc8c6e2db98afa7 not found: ID does not exist" containerID="91eabd7220e84be6dc59a656dd4a2e76c92950e906521ca1ddc8c6e2db98afa7"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.237264 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91eabd7220e84be6dc59a656dd4a2e76c92950e906521ca1ddc8c6e2db98afa7"} err="failed to get container status \"91eabd7220e84be6dc59a656dd4a2e76c92950e906521ca1ddc8c6e2db98afa7\": rpc error: code = NotFound desc = could not find container \"91eabd7220e84be6dc59a656dd4a2e76c92950e906521ca1ddc8c6e2db98afa7\": container with ID starting with 91eabd7220e84be6dc59a656dd4a2e76c92950e906521ca1ddc8c6e2db98afa7 not found: ID does not exist"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.252334 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.262866 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.268086 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.282470 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 02 11:16:30 crc kubenswrapper[4766]: E1002 11:16:30.283191 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e914485f-05fc-4f85-b902-2e43bcfc0bb5" containerName="kube-state-metrics"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.283285 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e914485f-05fc-4f85-b902-2e43bcfc0bb5" containerName="kube-state-metrics"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.283625 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e914485f-05fc-4f85-b902-2e43bcfc0bb5" containerName="kube-state-metrics"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.284373 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.290701 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.291232 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.291437 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.325294 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.391832 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5a8ba140-6dc8-4023-9789-7f288b85159b\") " pod="openstack/kube-state-metrics-0"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.391960 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5a8ba140-6dc8-4023-9789-7f288b85159b\") " pod="openstack/kube-state-metrics-0"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.392109 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxq8r\" (UniqueName: \"kubernetes.io/projected/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-api-access-qxq8r\") pod \"kube-state-metrics-0\" (UID: \"5a8ba140-6dc8-4023-9789-7f288b85159b\") " pod="openstack/kube-state-metrics-0"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.392173 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5a8ba140-6dc8-4023-9789-7f288b85159b\") " pod="openstack/kube-state-metrics-0"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.492970 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-shj49"]
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.493990 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxq8r\" (UniqueName: \"kubernetes.io/projected/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-api-access-qxq8r\") pod \"kube-state-metrics-0\" (UID: \"5a8ba140-6dc8-4023-9789-7f288b85159b\") " pod="openstack/kube-state-metrics-0"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.494058 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5a8ba140-6dc8-4023-9789-7f288b85159b\") " pod="openstack/kube-state-metrics-0"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.494119 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5a8ba140-6dc8-4023-9789-7f288b85159b\") " pod="openstack/kube-state-metrics-0"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.494179 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5a8ba140-6dc8-4023-9789-7f288b85159b\") " pod="openstack/kube-state-metrics-0"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.499094 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5a8ba140-6dc8-4023-9789-7f288b85159b\") " pod="openstack/kube-state-metrics-0"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.504231 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5a8ba140-6dc8-4023-9789-7f288b85159b\") " pod="openstack/kube-state-metrics-0"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.505916 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5a8ba140-6dc8-4023-9789-7f288b85159b\") " pod="openstack/kube-state-metrics-0"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.513280 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxq8r\" (UniqueName: \"kubernetes.io/projected/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-api-access-qxq8r\") pod \"kube-state-metrics-0\" (UID: \"5a8ba140-6dc8-4023-9789-7f288b85159b\") " pod="openstack/kube-state-metrics-0"
Oct 02 11:16:30 crc kubenswrapper[4766]: I1002 11:16:30.611029 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.105463 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Oct 02 11:16:31 crc kubenswrapper[4766]: W1002 11:16:31.107849 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a8ba140_6dc8_4023_9789_7f288b85159b.slice/crio-bae717950829c29120d95be44d719c6349367f0c2fd854a2a1a7e62843f94457 WatchSource:0}: Error finding container bae717950829c29120d95be44d719c6349367f0c2fd854a2a1a7e62843f94457: Status 404 returned error can't find the container with id bae717950829c29120d95be44d719c6349367f0c2fd854a2a1a7e62843f94457
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.217871 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5a8ba140-6dc8-4023-9789-7f288b85159b","Type":"ContainerStarted","Data":"bae717950829c29120d95be44d719c6349367f0c2fd854a2a1a7e62843f94457"}
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.223971 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.224255 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43489c17-898c-487c-b134-a6aa6c441299" containerName="ceilometer-central-agent" containerID="cri-o://2f193d89bd1c1213ea9f2fdb4398b39637ce5d08394de4e011e48dfc01f7cb72" gracePeriod=30
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.224634 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43489c17-898c-487c-b134-a6aa6c441299" containerName="proxy-httpd" containerID="cri-o://6822bd7f7084b431b4a8b5176ad9646a3ddb5dbd2ec7441b1475cd57cdb05e43" gracePeriod=30
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.224701 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43489c17-898c-487c-b134-a6aa6c441299" containerName="ceilometer-notification-agent" containerID="cri-o://d8860d058d4acc0515a92572caa65aaaa3305c0653b58e0e58649e153729e19d" gracePeriod=30
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.224717 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43489c17-898c-487c-b134-a6aa6c441299" containerName="sg-core" containerID="cri-o://2eecbaec9a47b5b971b34a006e92793fa386e41fcaf08c4b10f863cd047efa92" gracePeriod=30
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.599962 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4gmwl"
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.719375 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-scripts\") pod \"c803d467-a739-40aa-9dc9-4f04e6e14923\" (UID: \"c803d467-a739-40aa-9dc9-4f04e6e14923\") "
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.719671 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-combined-ca-bundle\") pod \"c803d467-a739-40aa-9dc9-4f04e6e14923\" (UID: \"c803d467-a739-40aa-9dc9-4f04e6e14923\") "
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.719747 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jtz9\" (UniqueName: \"kubernetes.io/projected/c803d467-a739-40aa-9dc9-4f04e6e14923-kube-api-access-7jtz9\") pod \"c803d467-a739-40aa-9dc9-4f04e6e14923\" (UID: \"c803d467-a739-40aa-9dc9-4f04e6e14923\") "
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.719796 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-config-data\") pod \"c803d467-a739-40aa-9dc9-4f04e6e14923\" (UID: \"c803d467-a739-40aa-9dc9-4f04e6e14923\") "
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.727405 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-scripts" (OuterVolumeSpecName: "scripts") pod "c803d467-a739-40aa-9dc9-4f04e6e14923" (UID: "c803d467-a739-40aa-9dc9-4f04e6e14923"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.728755 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c803d467-a739-40aa-9dc9-4f04e6e14923-kube-api-access-7jtz9" (OuterVolumeSpecName: "kube-api-access-7jtz9") pod "c803d467-a739-40aa-9dc9-4f04e6e14923" (UID: "c803d467-a739-40aa-9dc9-4f04e6e14923"). InnerVolumeSpecName "kube-api-access-7jtz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.753218 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-config-data" (OuterVolumeSpecName: "config-data") pod "c803d467-a739-40aa-9dc9-4f04e6e14923" (UID: "c803d467-a739-40aa-9dc9-4f04e6e14923"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.757563 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c803d467-a739-40aa-9dc9-4f04e6e14923" (UID: "c803d467-a739-40aa-9dc9-4f04e6e14923"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.822015 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.822047 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jtz9\" (UniqueName: \"kubernetes.io/projected/c803d467-a739-40aa-9dc9-4f04e6e14923-kube-api-access-7jtz9\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.822056 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.822066 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c803d467-a739-40aa-9dc9-4f04e6e14923-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:31 crc kubenswrapper[4766]: I1002 11:16:31.891736 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e914485f-05fc-4f85-b902-2e43bcfc0bb5" path="/var/lib/kubelet/pods/e914485f-05fc-4f85-b902-2e43bcfc0bb5/volumes"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.231165 4766 generic.go:334] "Generic (PLEG): container finished" podID="43489c17-898c-487c-b134-a6aa6c441299" containerID="6822bd7f7084b431b4a8b5176ad9646a3ddb5dbd2ec7441b1475cd57cdb05e43" exitCode=0
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.231205 4766 generic.go:334] "Generic (PLEG): container finished" podID="43489c17-898c-487c-b134-a6aa6c441299" containerID="2eecbaec9a47b5b971b34a006e92793fa386e41fcaf08c4b10f863cd047efa92" exitCode=2
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.231216 4766 generic.go:334] "Generic (PLEG): container finished" podID="43489c17-898c-487c-b134-a6aa6c441299" containerID="2f193d89bd1c1213ea9f2fdb4398b39637ce5d08394de4e011e48dfc01f7cb72" exitCode=0
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.231240 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43489c17-898c-487c-b134-a6aa6c441299","Type":"ContainerDied","Data":"6822bd7f7084b431b4a8b5176ad9646a3ddb5dbd2ec7441b1475cd57cdb05e43"}
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.231310 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43489c17-898c-487c-b134-a6aa6c441299","Type":"ContainerDied","Data":"2eecbaec9a47b5b971b34a006e92793fa386e41fcaf08c4b10f863cd047efa92"}
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.231363 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43489c17-898c-487c-b134-a6aa6c441299","Type":"ContainerDied","Data":"2f193d89bd1c1213ea9f2fdb4398b39637ce5d08394de4e011e48dfc01f7cb72"}
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.232863 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-4gmwl" event={"ID":"c803d467-a739-40aa-9dc9-4f04e6e14923","Type":"ContainerDied","Data":"fa9052e62690f95e2191cf268e897bb0a1e860a19cd2ec9edf0765924cc3e565"}
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.232898 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa9052e62690f95e2191cf268e897bb0a1e860a19cd2ec9edf0765924cc3e565"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.232959 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-4gmwl"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.236385 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5a8ba140-6dc8-4023-9789-7f288b85159b","Type":"ContainerStarted","Data":"24dcbf81a6048e4223093aaf313d135dc7e342e1aad2567595ccd81590fd91ce"}
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.236544 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-shj49" podUID="005c0030-75ad-49fa-9f5c-60ebe0fbedb0" containerName="registry-server" containerID="cri-o://1f3b8c3333dc0379b7a0426a6b14572fef9cb6145a584e2efe916428d57d733e" gracePeriod=2
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.279378 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.9057025140000001 podStartE2EDuration="2.27936305s" podCreationTimestamp="2025-10-02 11:16:30 +0000 UTC" firstStartedPulling="2025-10-02 11:16:31.109550303 +0000 UTC m=+1506.052421247" lastFinishedPulling="2025-10-02 11:16:31.483210839 +0000 UTC m=+1506.426081783" observedRunningTime="2025-10-02 11:16:32.273617326 +0000 UTC m=+1507.216488280" watchObservedRunningTime="2025-10-02 11:16:32.27936305 +0000 UTC m=+1507.222233994"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.321871 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 02 11:16:32 crc kubenswrapper[4766]: E1002 11:16:32.322638 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c803d467-a739-40aa-9dc9-4f04e6e14923" containerName="nova-cell0-conductor-db-sync"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.322655 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c803d467-a739-40aa-9dc9-4f04e6e14923" containerName="nova-cell0-conductor-db-sync"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.322892 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c803d467-a739-40aa-9dc9-4f04e6e14923" containerName="nova-cell0-conductor-db-sync"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.323456 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.325824 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-44h9r"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.326359 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.332359 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.435272 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x46b\" (UniqueName: \"kubernetes.io/projected/db4400a2-c286-467e-b62d-a5cb3042aa88-kube-api-access-8x46b\") pod \"nova-cell0-conductor-0\" (UID: \"db4400a2-c286-467e-b62d-a5cb3042aa88\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.435354 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4400a2-c286-467e-b62d-a5cb3042aa88-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"db4400a2-c286-467e-b62d-a5cb3042aa88\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.435425 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db4400a2-c286-467e-b62d-a5cb3042aa88-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"db4400a2-c286-467e-b62d-a5cb3042aa88\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.537422 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db4400a2-c286-467e-b62d-a5cb3042aa88-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"db4400a2-c286-467e-b62d-a5cb3042aa88\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.537711 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x46b\" (UniqueName: \"kubernetes.io/projected/db4400a2-c286-467e-b62d-a5cb3042aa88-kube-api-access-8x46b\") pod \"nova-cell0-conductor-0\" (UID: \"db4400a2-c286-467e-b62d-a5cb3042aa88\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.537805 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4400a2-c286-467e-b62d-a5cb3042aa88-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"db4400a2-c286-467e-b62d-a5cb3042aa88\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.543552 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db4400a2-c286-467e-b62d-a5cb3042aa88-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"db4400a2-c286-467e-b62d-a5cb3042aa88\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.543724 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4400a2-c286-467e-b62d-a5cb3042aa88-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"db4400a2-c286-467e-b62d-a5cb3042aa88\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.556219 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x46b\" (UniqueName: \"kubernetes.io/projected/db4400a2-c286-467e-b62d-a5cb3042aa88-kube-api-access-8x46b\") pod \"nova-cell0-conductor-0\" (UID: \"db4400a2-c286-467e-b62d-a5cb3042aa88\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.567467 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.639885 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-utilities\") pod \"005c0030-75ad-49fa-9f5c-60ebe0fbedb0\" (UID: \"005c0030-75ad-49fa-9f5c-60ebe0fbedb0\") "
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.639989 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-catalog-content\") pod \"005c0030-75ad-49fa-9f5c-60ebe0fbedb0\" (UID: \"005c0030-75ad-49fa-9f5c-60ebe0fbedb0\") "
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.640019 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzzgm\" (UniqueName: \"kubernetes.io/projected/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-kube-api-access-vzzgm\") pod \"005c0030-75ad-49fa-9f5c-60ebe0fbedb0\" (UID: \"005c0030-75ad-49fa-9f5c-60ebe0fbedb0\") "
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.641597 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-utilities" (OuterVolumeSpecName: "utilities") pod "005c0030-75ad-49fa-9f5c-60ebe0fbedb0" (UID: "005c0030-75ad-49fa-9f5c-60ebe0fbedb0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.643962 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-kube-api-access-vzzgm" (OuterVolumeSpecName: "kube-api-access-vzzgm") pod "005c0030-75ad-49fa-9f5c-60ebe0fbedb0" (UID: "005c0030-75ad-49fa-9f5c-60ebe0fbedb0"). InnerVolumeSpecName "kube-api-access-vzzgm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.655654 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "005c0030-75ad-49fa-9f5c-60ebe0fbedb0" (UID: "005c0030-75ad-49fa-9f5c-60ebe0fbedb0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.693309 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.742150 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.742183 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzzgm\" (UniqueName: \"kubernetes.io/projected/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-kube-api-access-vzzgm\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:32 crc kubenswrapper[4766]: I1002 11:16:32.742192 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/005c0030-75ad-49fa-9f5c-60ebe0fbedb0-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.130220 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.247742 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"db4400a2-c286-467e-b62d-a5cb3042aa88","Type":"ContainerStarted","Data":"c1ccaeb4d1397ace6d1b78c16ae286d96f01561a75118cf137042a01f30abc40"}
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.250704 4766 generic.go:334] "Generic (PLEG): container finished" podID="005c0030-75ad-49fa-9f5c-60ebe0fbedb0" containerID="1f3b8c3333dc0379b7a0426a6b14572fef9cb6145a584e2efe916428d57d733e" exitCode=0
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.250844 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-shj49"
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.251530 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shj49" event={"ID":"005c0030-75ad-49fa-9f5c-60ebe0fbedb0","Type":"ContainerDied","Data":"1f3b8c3333dc0379b7a0426a6b14572fef9cb6145a584e2efe916428d57d733e"}
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.251554 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-shj49" event={"ID":"005c0030-75ad-49fa-9f5c-60ebe0fbedb0","Type":"ContainerDied","Data":"62a56633d6c10091654a59989af81cbd3ded3e854233b66968d058d91c706d4a"}
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.251572 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.251592 4766 scope.go:117] "RemoveContainer" containerID="1f3b8c3333dc0379b7a0426a6b14572fef9cb6145a584e2efe916428d57d733e"
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.277890 4766 scope.go:117] "RemoveContainer" containerID="f5e5ec10695ede5c30e68d72187df8e8adfdef32a82757167024affaa886e474"
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.290634 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-shj49"]
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.298222 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-shj49"]
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.319900 4766 scope.go:117] "RemoveContainer" containerID="8be986fa0983b8d276c352ce07dd8e44159451272e4a4b27e56b8800c495b4c6"
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.354611 4766 scope.go:117] "RemoveContainer" containerID="1f3b8c3333dc0379b7a0426a6b14572fef9cb6145a584e2efe916428d57d733e"
Oct 02 11:16:33 crc kubenswrapper[4766]: E1002 11:16:33.355040 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f3b8c3333dc0379b7a0426a6b14572fef9cb6145a584e2efe916428d57d733e\": container with ID starting with 1f3b8c3333dc0379b7a0426a6b14572fef9cb6145a584e2efe916428d57d733e not found: ID does not exist" containerID="1f3b8c3333dc0379b7a0426a6b14572fef9cb6145a584e2efe916428d57d733e"
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.355070 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f3b8c3333dc0379b7a0426a6b14572fef9cb6145a584e2efe916428d57d733e"} err="failed to get container status \"1f3b8c3333dc0379b7a0426a6b14572fef9cb6145a584e2efe916428d57d733e\": rpc error: code = NotFound desc = could not find container \"1f3b8c3333dc0379b7a0426a6b14572fef9cb6145a584e2efe916428d57d733e\": container with ID starting with 1f3b8c3333dc0379b7a0426a6b14572fef9cb6145a584e2efe916428d57d733e not found: ID does not exist"
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.355092 4766 scope.go:117] "RemoveContainer" containerID="f5e5ec10695ede5c30e68d72187df8e8adfdef32a82757167024affaa886e474"
Oct 02 11:16:33 crc kubenswrapper[4766]: E1002 11:16:33.355286 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5e5ec10695ede5c30e68d72187df8e8adfdef32a82757167024affaa886e474\": container with ID starting with f5e5ec10695ede5c30e68d72187df8e8adfdef32a82757167024affaa886e474 not found: ID does not exist" containerID="f5e5ec10695ede5c30e68d72187df8e8adfdef32a82757167024affaa886e474"
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.355307 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5e5ec10695ede5c30e68d72187df8e8adfdef32a82757167024affaa886e474"} err="failed to get container status \"f5e5ec10695ede5c30e68d72187df8e8adfdef32a82757167024affaa886e474\": rpc error: code = NotFound desc = could not find container \"f5e5ec10695ede5c30e68d72187df8e8adfdef32a82757167024affaa886e474\": container with ID starting with f5e5ec10695ede5c30e68d72187df8e8adfdef32a82757167024affaa886e474 not found: ID does not exist"
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.355321 4766 scope.go:117] "RemoveContainer" containerID="8be986fa0983b8d276c352ce07dd8e44159451272e4a4b27e56b8800c495b4c6"
Oct 02 11:16:33 crc kubenswrapper[4766]: E1002 11:16:33.355889 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be986fa0983b8d276c352ce07dd8e44159451272e4a4b27e56b8800c495b4c6\": container with ID starting with 8be986fa0983b8d276c352ce07dd8e44159451272e4a4b27e56b8800c495b4c6 not found: ID does not exist" containerID="8be986fa0983b8d276c352ce07dd8e44159451272e4a4b27e56b8800c495b4c6"
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.355914 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be986fa0983b8d276c352ce07dd8e44159451272e4a4b27e56b8800c495b4c6"} err="failed to get container status \"8be986fa0983b8d276c352ce07dd8e44159451272e4a4b27e56b8800c495b4c6\": rpc error: code = NotFound desc = could not find container \"8be986fa0983b8d276c352ce07dd8e44159451272e4a4b27e56b8800c495b4c6\": container with ID starting with 8be986fa0983b8d276c352ce07dd8e44159451272e4a4b27e56b8800c495b4c6 not found: ID does not exist"
Oct 02 11:16:33 crc kubenswrapper[4766]: I1002 11:16:33.891827 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="005c0030-75ad-49fa-9f5c-60ebe0fbedb0" path="/var/lib/kubelet/pods/005c0030-75ad-49fa-9f5c-60ebe0fbedb0/volumes"
Oct 02 11:16:34 crc kubenswrapper[4766]: I1002 11:16:34.263299 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"db4400a2-c286-467e-b62d-a5cb3042aa88","Type":"ContainerStarted","Data":"43a906e7afa204c5a711bfc8ccb6e78006f4ffebb41d2ad46a51e21a16f179ce"}
Oct 02 11:16:34 crc kubenswrapper[4766]: I1002 11:16:34.263806 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Oct 02 11:16:34 crc kubenswrapper[4766]: I1002 11:16:34.282799 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.282782098 podStartE2EDuration="2.282782098s" podCreationTimestamp="2025-10-02 11:16:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:16:34.27629449 +0000 UTC m=+1509.219165434" watchObservedRunningTime="2025-10-02 11:16:34.282782098 +0000 UTC m=+1509.225653042"
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.291604 4766 generic.go:334] "Generic (PLEG): container finished" podID="43489c17-898c-487c-b134-a6aa6c441299" containerID="d8860d058d4acc0515a92572caa65aaaa3305c0653b58e0e58649e153729e19d" exitCode=0
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.291694 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43489c17-898c-487c-b134-a6aa6c441299","Type":"ContainerDied","Data":"d8860d058d4acc0515a92572caa65aaaa3305c0653b58e0e58649e153729e19d"}
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.517585 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.626964 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43489c17-898c-487c-b134-a6aa6c441299-log-httpd\") pod \"43489c17-898c-487c-b134-a6aa6c441299\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") "
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.627066 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-config-data\") pod \"43489c17-898c-487c-b134-a6aa6c441299\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") "
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.627153 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkn4d\" (UniqueName: \"kubernetes.io/projected/43489c17-898c-487c-b134-a6aa6c441299-kube-api-access-pkn4d\") pod \"43489c17-898c-487c-b134-a6aa6c441299\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") "
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.627208 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-combined-ca-bundle\") pod \"43489c17-898c-487c-b134-a6aa6c441299\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") "
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.627233 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-scripts\") pod \"43489c17-898c-487c-b134-a6aa6c441299\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") "
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.627282 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43489c17-898c-487c-b134-a6aa6c441299-run-httpd\") pod \"43489c17-898c-487c-b134-a6aa6c441299\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") "
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.627315 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-sg-core-conf-yaml\") pod \"43489c17-898c-487c-b134-a6aa6c441299\" (UID: \"43489c17-898c-487c-b134-a6aa6c441299\") "
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.627638 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43489c17-898c-487c-b134-a6aa6c441299-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "43489c17-898c-487c-b134-a6aa6c441299" (UID: "43489c17-898c-487c-b134-a6aa6c441299"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.627823 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43489c17-898c-487c-b134-a6aa6c441299-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.628698 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43489c17-898c-487c-b134-a6aa6c441299-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "43489c17-898c-487c-b134-a6aa6c441299" (UID: "43489c17-898c-487c-b134-a6aa6c441299"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.644569 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43489c17-898c-487c-b134-a6aa6c441299-kube-api-access-pkn4d" (OuterVolumeSpecName: "kube-api-access-pkn4d") pod "43489c17-898c-487c-b134-a6aa6c441299" (UID: "43489c17-898c-487c-b134-a6aa6c441299"). InnerVolumeSpecName "kube-api-access-pkn4d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.653989 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-scripts" (OuterVolumeSpecName: "scripts") pod "43489c17-898c-487c-b134-a6aa6c441299" (UID: "43489c17-898c-487c-b134-a6aa6c441299"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.660645 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "43489c17-898c-487c-b134-a6aa6c441299" (UID: "43489c17-898c-487c-b134-a6aa6c441299"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.716323 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43489c17-898c-487c-b134-a6aa6c441299" (UID: "43489c17-898c-487c-b134-a6aa6c441299"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.729736 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.729773 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkn4d\" (UniqueName: \"kubernetes.io/projected/43489c17-898c-487c-b134-a6aa6c441299-kube-api-access-pkn4d\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.729784 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.729791 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.729799 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43489c17-898c-487c-b134-a6aa6c441299-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.737416 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-config-data" (OuterVolumeSpecName: "config-data") pod "43489c17-898c-487c-b134-a6aa6c441299" (UID: "43489c17-898c-487c-b134-a6aa6c441299"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.831151 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43489c17-898c-487c-b134-a6aa6c441299-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 11:16:36 crc kubenswrapper[4766]: I1002 11:16:36.899809 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fqng4" podUID="fe991bdf-f974-4959-bb0b-e10001c1c380" containerName="registry-server" probeResult="failure" output=<
Oct 02 11:16:36 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s
Oct 02 11:16:36 crc kubenswrapper[4766]: >
Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.308032 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43489c17-898c-487c-b134-a6aa6c441299","Type":"ContainerDied","Data":"8ae9682cfc74325ae47309c018f54eb483a1d77f520ae5c2855f25d5b2ca4e78"}
Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.308600 4766 scope.go:117] "RemoveContainer" containerID="6822bd7f7084b431b4a8b5176ad9646a3ddb5dbd2ec7441b1475cd57cdb05e43"
Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.308100 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.335789 4766 scope.go:117] "RemoveContainer" containerID="2eecbaec9a47b5b971b34a006e92793fa386e41fcaf08c4b10f863cd047efa92"
Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.364155 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.369865 4766 scope.go:117] "RemoveContainer" containerID="d8860d058d4acc0515a92572caa65aaaa3305c0653b58e0e58649e153729e19d"
Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.410823 4766 scope.go:117] "RemoveContainer" containerID="2f193d89bd1c1213ea9f2fdb4398b39637ce5d08394de4e011e48dfc01f7cb72"
Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.422065 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.441494 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 02 11:16:37 crc kubenswrapper[4766]: E1002 11:16:37.441990 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43489c17-898c-487c-b134-a6aa6c441299" containerName="sg-core"
Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.442008 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="43489c17-898c-487c-b134-a6aa6c441299" containerName="sg-core"
Oct 02 11:16:37 crc kubenswrapper[4766]: E1002 11:16:37.442034 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43489c17-898c-487c-b134-a6aa6c441299" containerName="ceilometer-notification-agent"
Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.442041 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="43489c17-898c-487c-b134-a6aa6c441299" containerName="ceilometer-notification-agent"
Oct 02 11:16:37 crc kubenswrapper[4766]: E1002 11:16:37.442054 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="005c0030-75ad-49fa-9f5c-60ebe0fbedb0" containerName="registry-server"
Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.442062 4766 state_mem.go:107] "Deleted CPUSet assignment"
podUID="005c0030-75ad-49fa-9f5c-60ebe0fbedb0" containerName="registry-server" Oct 02 11:16:37 crc kubenswrapper[4766]: E1002 11:16:37.442078 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43489c17-898c-487c-b134-a6aa6c441299" containerName="ceilometer-central-agent" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.442085 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="43489c17-898c-487c-b134-a6aa6c441299" containerName="ceilometer-central-agent" Oct 02 11:16:37 crc kubenswrapper[4766]: E1002 11:16:37.442099 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43489c17-898c-487c-b134-a6aa6c441299" containerName="proxy-httpd" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.442105 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="43489c17-898c-487c-b134-a6aa6c441299" containerName="proxy-httpd" Oct 02 11:16:37 crc kubenswrapper[4766]: E1002 11:16:37.442118 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="005c0030-75ad-49fa-9f5c-60ebe0fbedb0" containerName="extract-content" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.442124 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="005c0030-75ad-49fa-9f5c-60ebe0fbedb0" containerName="extract-content" Oct 02 11:16:37 crc kubenswrapper[4766]: E1002 11:16:37.442132 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="005c0030-75ad-49fa-9f5c-60ebe0fbedb0" containerName="extract-utilities" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.442139 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="005c0030-75ad-49fa-9f5c-60ebe0fbedb0" containerName="extract-utilities" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.442318 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="43489c17-898c-487c-b134-a6aa6c441299" containerName="proxy-httpd" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.442329 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="43489c17-898c-487c-b134-a6aa6c441299" containerName="sg-core" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.442341 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="43489c17-898c-487c-b134-a6aa6c441299" containerName="ceilometer-central-agent" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.442359 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="43489c17-898c-487c-b134-a6aa6c441299" containerName="ceilometer-notification-agent" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.442374 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="005c0030-75ad-49fa-9f5c-60ebe0fbedb0" containerName="registry-server" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.444464 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.446660 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.446914 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.450521 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.455263 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.543663 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.543747 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.543799 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.543884 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgfqj\" (UniqueName: \"kubernetes.io/projected/2f382271-dcd5-4199-a98a-490ddb92e1b4-kube-api-access-fgfqj\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.544268 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-scripts\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.544310 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f382271-dcd5-4199-a98a-490ddb92e1b4-run-httpd\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.544337 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f382271-dcd5-4199-a98a-490ddb92e1b4-log-httpd\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.544468 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-config-data\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.645806 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgfqj\" (UniqueName: \"kubernetes.io/projected/2f382271-dcd5-4199-a98a-490ddb92e1b4-kube-api-access-fgfqj\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.645876 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-scripts\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.645897 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f382271-dcd5-4199-a98a-490ddb92e1b4-run-httpd\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.645912 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f382271-dcd5-4199-a98a-490ddb92e1b4-log-httpd\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.645947 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-config-data\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.645994 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.646015 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.646039 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.646561 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f382271-dcd5-4199-a98a-490ddb92e1b4-run-httpd\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.646822 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2f382271-dcd5-4199-a98a-490ddb92e1b4-log-httpd\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.652066 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.652156 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-config-data\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.652424 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.653689 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.653710 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-scripts\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.674991 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgfqj\" (UniqueName: \"kubernetes.io/projected/2f382271-dcd5-4199-a98a-490ddb92e1b4-kube-api-access-fgfqj\") pod \"ceilometer-0\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.762183 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:16:37 crc kubenswrapper[4766]: I1002 11:16:37.910630 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43489c17-898c-487c-b134-a6aa6c441299" path="/var/lib/kubelet/pods/43489c17-898c-487c-b134-a6aa6c441299/volumes" Oct 02 11:16:38 crc kubenswrapper[4766]: I1002 11:16:38.244039 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:16:38 crc kubenswrapper[4766]: I1002 11:16:38.320381 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f382271-dcd5-4199-a98a-490ddb92e1b4","Type":"ContainerStarted","Data":"ca87442b80b9703deb8c67562e8306f2317043e5ff8d5ab361ca3d1d2637110e"} Oct 02 11:16:40 crc kubenswrapper[4766]: I1002 11:16:40.623015 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 02 11:16:42 crc kubenswrapper[4766]: I1002 11:16:42.720896 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.225660 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-2jv7w"] Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.227592 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2jv7w" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.229776 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.230048 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.276299 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-2jv7w"] Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.362526 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-config-data\") pod \"nova-cell0-cell-mapping-2jv7w\" (UID: \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\") " pod="openstack/nova-cell0-cell-mapping-2jv7w" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.362666 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2jv7w\" (UID: \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\") " pod="openstack/nova-cell0-cell-mapping-2jv7w" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.362697 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt68p\" (UniqueName: \"kubernetes.io/projected/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-kube-api-access-kt68p\") pod \"nova-cell0-cell-mapping-2jv7w\" (UID: \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\") " pod="openstack/nova-cell0-cell-mapping-2jv7w" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.362727 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-scripts\") pod \"nova-cell0-cell-mapping-2jv7w\" (UID: \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\") " 
pod="openstack/nova-cell0-cell-mapping-2jv7w" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.372771 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f382271-dcd5-4199-a98a-490ddb92e1b4","Type":"ContainerStarted","Data":"46c2ed514e2bdf9ef031ae67c47aca7ce83582591d3d5ceb160b1a1f8a5d12fc"} Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.415791 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.420381 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.426910 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.441141 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.464225 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-config-data\") pod \"nova-cell0-cell-mapping-2jv7w\" (UID: \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\") " pod="openstack/nova-cell0-cell-mapping-2jv7w" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.464369 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2jv7w\" (UID: \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\") " pod="openstack/nova-cell0-cell-mapping-2jv7w" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.464403 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt68p\" (UniqueName: \"kubernetes.io/projected/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-kube-api-access-kt68p\") pod \"nova-cell0-cell-mapping-2jv7w\" (UID: \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\") " pod="openstack/nova-cell0-cell-mapping-2jv7w" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.464428 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-scripts\") pod \"nova-cell0-cell-mapping-2jv7w\" (UID: \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\") " pod="openstack/nova-cell0-cell-mapping-2jv7w" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.483774 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-2jv7w\" (UID: \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\") " pod="openstack/nova-cell0-cell-mapping-2jv7w" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.484481 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-scripts\") pod \"nova-cell0-cell-mapping-2jv7w\" (UID: \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\") " pod="openstack/nova-cell0-cell-mapping-2jv7w" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.493822 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-config-data\") pod \"nova-cell0-cell-mapping-2jv7w\" 
(UID: \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\") " pod="openstack/nova-cell0-cell-mapping-2jv7w" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.501570 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt68p\" (UniqueName: \"kubernetes.io/projected/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-kube-api-access-kt68p\") pod \"nova-cell0-cell-mapping-2jv7w\" (UID: \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\") " pod="openstack/nova-cell0-cell-mapping-2jv7w" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.507126 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.508272 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.531806 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.546965 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.569530 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c916a84f-283b-4dcf-b4d2-4324b24305c2-config-data\") pod \"nova-api-0\" (UID: \"c916a84f-283b-4dcf-b4d2-4324b24305c2\") " pod="openstack/nova-api-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.569575 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db100297-f73c-4f83-b6dc-2f4d9661123f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"db100297-f73c-4f83-b6dc-2f4d9661123f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.569616 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c916a84f-283b-4dcf-b4d2-4324b24305c2-logs\") pod \"nova-api-0\" (UID: \"c916a84f-283b-4dcf-b4d2-4324b24305c2\") " pod="openstack/nova-api-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.569637 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db100297-f73c-4f83-b6dc-2f4d9661123f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"db100297-f73c-4f83-b6dc-2f4d9661123f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.569690 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g424s\" (UniqueName: \"kubernetes.io/projected/db100297-f73c-4f83-b6dc-2f4d9661123f-kube-api-access-g424s\") pod \"nova-cell1-novncproxy-0\" (UID: \"db100297-f73c-4f83-b6dc-2f4d9661123f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.569712 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpp72\" (UniqueName: \"kubernetes.io/projected/c916a84f-283b-4dcf-b4d2-4324b24305c2-kube-api-access-jpp72\") pod \"nova-api-0\" (UID: \"c916a84f-283b-4dcf-b4d2-4324b24305c2\") " pod="openstack/nova-api-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.569726 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c916a84f-283b-4dcf-b4d2-4324b24305c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c916a84f-283b-4dcf-b4d2-4324b24305c2\") " pod="openstack/nova-api-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.608828 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2jv7w" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.637024 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.638829 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.658140 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.671118 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c916a84f-283b-4dcf-b4d2-4324b24305c2-config-data\") pod \"nova-api-0\" (UID: \"c916a84f-283b-4dcf-b4d2-4324b24305c2\") " pod="openstack/nova-api-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.671169 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db100297-f73c-4f83-b6dc-2f4d9661123f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"db100297-f73c-4f83-b6dc-2f4d9661123f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.671230 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c916a84f-283b-4dcf-b4d2-4324b24305c2-logs\") pod \"nova-api-0\" (UID: \"c916a84f-283b-4dcf-b4d2-4324b24305c2\") " pod="openstack/nova-api-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.671252 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db100297-f73c-4f83-b6dc-2f4d9661123f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"db100297-f73c-4f83-b6dc-2f4d9661123f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.671334 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g424s\" (UniqueName: \"kubernetes.io/projected/db100297-f73c-4f83-b6dc-2f4d9661123f-kube-api-access-g424s\") pod \"nova-cell1-novncproxy-0\" (UID: \"db100297-f73c-4f83-b6dc-2f4d9661123f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.671360 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpp72\" (UniqueName: \"kubernetes.io/projected/c916a84f-283b-4dcf-b4d2-4324b24305c2-kube-api-access-jpp72\") pod \"nova-api-0\" (UID: \"c916a84f-283b-4dcf-b4d2-4324b24305c2\") " pod="openstack/nova-api-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.671385 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c916a84f-283b-4dcf-b4d2-4324b24305c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c916a84f-283b-4dcf-b4d2-4324b24305c2\") " pod="openstack/nova-api-0" Oct 02 11:16:43 crc 
kubenswrapper[4766]: I1002 11:16:43.672459 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c916a84f-283b-4dcf-b4d2-4324b24305c2-logs\") pod \"nova-api-0\" (UID: \"c916a84f-283b-4dcf-b4d2-4324b24305c2\") " pod="openstack/nova-api-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.683410 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c916a84f-283b-4dcf-b4d2-4324b24305c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c916a84f-283b-4dcf-b4d2-4324b24305c2\") " pod="openstack/nova-api-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.684292 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db100297-f73c-4f83-b6dc-2f4d9661123f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"db100297-f73c-4f83-b6dc-2f4d9661123f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.690205 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db100297-f73c-4f83-b6dc-2f4d9661123f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"db100297-f73c-4f83-b6dc-2f4d9661123f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.690729 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c916a84f-283b-4dcf-b4d2-4324b24305c2-config-data\") pod \"nova-api-0\" (UID: \"c916a84f-283b-4dcf-b4d2-4324b24305c2\") " pod="openstack/nova-api-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.708879 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-szvhz"] Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.731023 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-szvhz" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.739340 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpp72\" (UniqueName: \"kubernetes.io/projected/c916a84f-283b-4dcf-b4d2-4324b24305c2-kube-api-access-jpp72\") pod \"nova-api-0\" (UID: \"c916a84f-283b-4dcf-b4d2-4324b24305c2\") " pod="openstack/nova-api-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.749932 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.759142 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g424s\" (UniqueName: \"kubernetes.io/projected/db100297-f73c-4f83-b6dc-2f4d9661123f-kube-api-access-g424s\") pod \"nova-cell1-novncproxy-0\" (UID: \"db100297-f73c-4f83-b6dc-2f4d9661123f\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.767202 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.773323 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-logs\") pod \"nova-metadata-0\" (UID: \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\") " pod="openstack/nova-metadata-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.773379 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpgsf\" (UniqueName: \"kubernetes.io/projected/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-kube-api-access-lpgsf\") pod \"nova-metadata-0\" (UID: \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\") " pod="openstack/nova-metadata-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.773433 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-config-data\") pod \"nova-metadata-0\" (UID: \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\") " pod="openstack/nova-metadata-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.773464 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf9qd\" (UniqueName: \"kubernetes.io/projected/51a14c59-9b2a-4470-af2a-918b717cd721-kube-api-access-cf9qd\") pod \"certified-operators-szvhz\" (UID: \"51a14c59-9b2a-4470-af2a-918b717cd721\") " pod="openshift-marketplace/certified-operators-szvhz" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.773522 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a14c59-9b2a-4470-af2a-918b717cd721-utilities\") pod \"certified-operators-szvhz\" (UID: \"51a14c59-9b2a-4470-af2a-918b717cd721\") " pod="openshift-marketplace/certified-operators-szvhz" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.773580 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\") " pod="openstack/nova-metadata-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.773597 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a14c59-9b2a-4470-af2a-918b717cd721-catalog-content\") pod \"certified-operators-szvhz\" (UID: \"51a14c59-9b2a-4470-af2a-918b717cd721\") " pod="openshift-marketplace/certified-operators-szvhz" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.816532 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-szvhz"] Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.890721 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-logs\") pod \"nova-metadata-0\" (UID: \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\") " pod="openstack/nova-metadata-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.890775 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpgsf\" (UniqueName: 
\"kubernetes.io/projected/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-kube-api-access-lpgsf\") pod \"nova-metadata-0\" (UID: \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\") " pod="openstack/nova-metadata-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.890801 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-config-data\") pod \"nova-metadata-0\" (UID: \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\") " pod="openstack/nova-metadata-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.890821 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf9qd\" (UniqueName: \"kubernetes.io/projected/51a14c59-9b2a-4470-af2a-918b717cd721-kube-api-access-cf9qd\") pod \"certified-operators-szvhz\" (UID: \"51a14c59-9b2a-4470-af2a-918b717cd721\") " pod="openshift-marketplace/certified-operators-szvhz" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.890854 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a14c59-9b2a-4470-af2a-918b717cd721-utilities\") pod \"certified-operators-szvhz\" (UID: \"51a14c59-9b2a-4470-af2a-918b717cd721\") " pod="openshift-marketplace/certified-operators-szvhz" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.890920 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\") " pod="openstack/nova-metadata-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.890945 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a14c59-9b2a-4470-af2a-918b717cd721-catalog-content\") pod \"certified-operators-szvhz\" (UID: \"51a14c59-9b2a-4470-af2a-918b717cd721\") " pod="openshift-marketplace/certified-operators-szvhz" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.891479 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a14c59-9b2a-4470-af2a-918b717cd721-catalog-content\") pod \"certified-operators-szvhz\" (UID: \"51a14c59-9b2a-4470-af2a-918b717cd721\") " pod="openshift-marketplace/certified-operators-szvhz" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.891994 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a14c59-9b2a-4470-af2a-918b717cd721-utilities\") pod \"certified-operators-szvhz\" (UID: \"51a14c59-9b2a-4470-af2a-918b717cd721\") " pod="openshift-marketplace/certified-operators-szvhz" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.892557 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-logs\") pod \"nova-metadata-0\" (UID: \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\") " pod="openstack/nova-metadata-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.929864 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\") " 
pod="openstack/nova-metadata-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.930834 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-config-data\") pod \"nova-metadata-0\" (UID: \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\") " pod="openstack/nova-metadata-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.949839 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.960323 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.961443 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.961697 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.967875 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.969797 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-fmk4f"] Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.971575 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:43 crc kubenswrapper[4766]: I1002 11:16:43.984210 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-fmk4f"] Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.013049 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpgsf\" (UniqueName: \"kubernetes.io/projected/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-kube-api-access-lpgsf\") pod \"nova-metadata-0\" (UID: \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\") " pod="openstack/nova-metadata-0" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.015715 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf9qd\" (UniqueName: \"kubernetes.io/projected/51a14c59-9b2a-4470-af2a-918b717cd721-kube-api-access-cf9qd\") pod \"certified-operators-szvhz\" (UID: \"51a14c59-9b2a-4470-af2a-918b717cd721\") " pod="openshift-marketplace/certified-operators-szvhz" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.070916 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-szvhz" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.108683 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n5vr\" (UniqueName: \"kubernetes.io/projected/d260f950-8815-4b40-b2f7-5a27fca9690d-kube-api-access-4n5vr\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.108728 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-config\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.108783 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.108873 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.108911 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5de1c0-341f-411b-b68a-c95f34f52362-config-data\") pod \"nova-scheduler-0\" (UID: \"6c5de1c0-341f-411b-b68a-c95f34f52362\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.108930 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5de1c0-341f-411b-b68a-c95f34f52362-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6c5de1c0-341f-411b-b68a-c95f34f52362\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.109015 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vdrz\" (UniqueName: \"kubernetes.io/projected/6c5de1c0-341f-411b-b68a-c95f34f52362-kube-api-access-8vdrz\") pod \"nova-scheduler-0\" (UID: \"6c5de1c0-341f-411b-b68a-c95f34f52362\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.109044 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.109067 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-ovsdbserver-nb\") pod 
\"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.211933 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n5vr\" (UniqueName: \"kubernetes.io/projected/d260f950-8815-4b40-b2f7-5a27fca9690d-kube-api-access-4n5vr\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.211988 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-config\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.212024 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.212082 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.212108 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5de1c0-341f-411b-b68a-c95f34f52362-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6c5de1c0-341f-411b-b68a-c95f34f52362\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.212129 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5de1c0-341f-411b-b68a-c95f34f52362-config-data\") pod \"nova-scheduler-0\" (UID: \"6c5de1c0-341f-411b-b68a-c95f34f52362\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.212189 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vdrz\" (UniqueName: \"kubernetes.io/projected/6c5de1c0-341f-411b-b68a-c95f34f52362-kube-api-access-8vdrz\") pod \"nova-scheduler-0\" (UID: \"6c5de1c0-341f-411b-b68a-c95f34f52362\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.212211 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.212231 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " 
pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.213281 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.214114 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-config\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.218372 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.219551 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.221588 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.225004 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5de1c0-341f-411b-b68a-c95f34f52362-config-data\") pod \"nova-scheduler-0\" (UID: \"6c5de1c0-341f-411b-b68a-c95f34f52362\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.232852 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5de1c0-341f-411b-b68a-c95f34f52362-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6c5de1c0-341f-411b-b68a-c95f34f52362\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.233490 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n5vr\" (UniqueName: \"kubernetes.io/projected/d260f950-8815-4b40-b2f7-5a27fca9690d-kube-api-access-4n5vr\") pod \"dnsmasq-dns-845d6d6f59-fmk4f\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.258475 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vdrz\" (UniqueName: \"kubernetes.io/projected/6c5de1c0-341f-411b-b68a-c95f34f52362-kube-api-access-8vdrz\") pod \"nova-scheduler-0\" (UID: \"6c5de1c0-341f-411b-b68a-c95f34f52362\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.278402 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.439980 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f382271-dcd5-4199-a98a-490ddb92e1b4","Type":"ContainerStarted","Data":"c5d7d2d79ad9957cb5b20466f8e1c91880dbebaa3656e4ecddf154e4b91af0d4"} Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.517000 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.537355 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.599714 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-2jv7w"] Oct 02 11:16:44 crc kubenswrapper[4766]: W1002 11:16:44.815714 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a14c59_9b2a_4470_af2a_918b717cd721.slice/crio-6a39c2bd2ba2986b38afeb570f10b11047a0315d9643d29a19811059090f5f55 WatchSource:0}: Error finding container 6a39c2bd2ba2986b38afeb570f10b11047a0315d9643d29a19811059090f5f55: Status 404 returned error can't find the container with id 6a39c2bd2ba2986b38afeb570f10b11047a0315d9643d29a19811059090f5f55 Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.829620 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-szvhz"] Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.877035 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:16:44 crc kubenswrapper[4766]: I1002 11:16:44.953520 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.186430 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.201043 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:16:45 crc kubenswrapper[4766]: W1002 11:16:45.226544 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c5de1c0_341f_411b_b68a_c95f34f52362.slice/crio-0ed389aaa15e8fa60d11da0b016d5b229fd7788b8cc585f211306f6a150ea98e WatchSource:0}: Error finding container 0ed389aaa15e8fa60d11da0b016d5b229fd7788b8cc585f211306f6a150ea98e: Status 404 returned error can't find the container with id 0ed389aaa15e8fa60d11da0b016d5b229fd7788b8cc585f211306f6a150ea98e Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.444995 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-fmk4f"] Oct 02 11:16:45 crc kubenswrapper[4766]: W1002 11:16:45.447428 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd260f950_8815_4b40_b2f7_5a27fca9690d.slice/crio-7322eaa43fb6c2ec5f705855324843ac52fa37fd7bfa23a73a366c72060bdad3 WatchSource:0}: Error finding container 7322eaa43fb6c2ec5f705855324843ac52fa37fd7bfa23a73a366c72060bdad3: Status 404 returned error can't find the container with id 7322eaa43fb6c2ec5f705855324843ac52fa37fd7bfa23a73a366c72060bdad3 Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.452091 4766 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-scheduler-0" event={"ID":"6c5de1c0-341f-411b-b68a-c95f34f52362","Type":"ContainerStarted","Data":"0ed389aaa15e8fa60d11da0b016d5b229fd7788b8cc585f211306f6a150ea98e"} Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.454983 4766 generic.go:334] "Generic (PLEG): container finished" podID="51a14c59-9b2a-4470-af2a-918b717cd721" containerID="aeaa160d71c90a2aa2e1cfca3faaf95a00b44d57bf733845e6206c1d24aca4c3" exitCode=0 Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.455054 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szvhz" event={"ID":"51a14c59-9b2a-4470-af2a-918b717cd721","Type":"ContainerDied","Data":"aeaa160d71c90a2aa2e1cfca3faaf95a00b44d57bf733845e6206c1d24aca4c3"} Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.455084 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szvhz" event={"ID":"51a14c59-9b2a-4470-af2a-918b717cd721","Type":"ContainerStarted","Data":"6a39c2bd2ba2986b38afeb570f10b11047a0315d9643d29a19811059090f5f55"} Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.457912 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2jv7w" event={"ID":"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f","Type":"ContainerStarted","Data":"3f01021d7015f6ce7b28a4464cf46b4c73b350981093792e92fec171df9fdb63"} Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.457948 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2jv7w" event={"ID":"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f","Type":"ContainerStarted","Data":"c94344fb3c90a120431952931b7bd2e2242e517d0a5c250099f91150fae722ee"} Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.459994 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c916a84f-283b-4dcf-b4d2-4324b24305c2","Type":"ContainerStarted","Data":"0c21cd108ceeb8d4b1142570a1adb39fe382cb0a294f08597959f0f978adede5"} Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.461266 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c","Type":"ContainerStarted","Data":"9b276da450b01b2110718bb0b8f43b4c7e78cc30480ee029918e56f9ee6662c4"} Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.466792 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"db100297-f73c-4f83-b6dc-2f4d9661123f","Type":"ContainerStarted","Data":"a19f6f6e2859b1a27c78e0da67baf01a6f896b96359feb12aac3a916f7d7cd71"} Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.488014 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f382271-dcd5-4199-a98a-490ddb92e1b4","Type":"ContainerStarted","Data":"793a807916e2716603f5252269b8f6ee91448251ec30bf50f028ad12d2deba0e"} Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.501095 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-2jv7w" podStartSLOduration=2.501077785 podStartE2EDuration="2.501077785s" podCreationTimestamp="2025-10-02 11:16:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:16:45.495882138 +0000 UTC m=+1520.438753092" watchObservedRunningTime="2025-10-02 11:16:45.501077785 +0000 UTC m=+1520.443948729" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 
11:16:45.598702 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b4dsb"] Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.600031 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-b4dsb" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.607527 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.607699 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.619770 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b4dsb"] Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.680557 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2mxk\" (UniqueName: \"kubernetes.io/projected/dc899043-5f53-453c-bc00-0cc214647667-kube-api-access-x2mxk\") pod \"nova-cell1-conductor-db-sync-b4dsb\" (UID: \"dc899043-5f53-453c-bc00-0cc214647667\") " pod="openstack/nova-cell1-conductor-db-sync-b4dsb" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.680966 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-b4dsb\" (UID: \"dc899043-5f53-453c-bc00-0cc214647667\") " pod="openstack/nova-cell1-conductor-db-sync-b4dsb" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.681114 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-scripts\") pod \"nova-cell1-conductor-db-sync-b4dsb\" (UID: \"dc899043-5f53-453c-bc00-0cc214647667\") " pod="openstack/nova-cell1-conductor-db-sync-b4dsb" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.681143 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-config-data\") pod \"nova-cell1-conductor-db-sync-b4dsb\" (UID: \"dc899043-5f53-453c-bc00-0cc214647667\") " pod="openstack/nova-cell1-conductor-db-sync-b4dsb" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.784333 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-config-data\") pod \"nova-cell1-conductor-db-sync-b4dsb\" (UID: \"dc899043-5f53-453c-bc00-0cc214647667\") " pod="openstack/nova-cell1-conductor-db-sync-b4dsb" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.784494 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2mxk\" (UniqueName: \"kubernetes.io/projected/dc899043-5f53-453c-bc00-0cc214647667-kube-api-access-x2mxk\") pod \"nova-cell1-conductor-db-sync-b4dsb\" (UID: \"dc899043-5f53-453c-bc00-0cc214647667\") " pod="openstack/nova-cell1-conductor-db-sync-b4dsb" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.784563 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-b4dsb\" (UID: \"dc899043-5f53-453c-bc00-0cc214647667\") " pod="openstack/nova-cell1-conductor-db-sync-b4dsb" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.784618 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-scripts\") pod \"nova-cell1-conductor-db-sync-b4dsb\" (UID: \"dc899043-5f53-453c-bc00-0cc214647667\") " pod="openstack/nova-cell1-conductor-db-sync-b4dsb" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.790554 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-b4dsb\" (UID: \"dc899043-5f53-453c-bc00-0cc214647667\") " pod="openstack/nova-cell1-conductor-db-sync-b4dsb" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.794568 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-config-data\") pod \"nova-cell1-conductor-db-sync-b4dsb\" (UID: \"dc899043-5f53-453c-bc00-0cc214647667\") " pod="openstack/nova-cell1-conductor-db-sync-b4dsb" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.798428 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-scripts\") pod \"nova-cell1-conductor-db-sync-b4dsb\" (UID: \"dc899043-5f53-453c-bc00-0cc214647667\") " pod="openstack/nova-cell1-conductor-db-sync-b4dsb" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.820556 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2mxk\" (UniqueName: \"kubernetes.io/projected/dc899043-5f53-453c-bc00-0cc214647667-kube-api-access-x2mxk\") pod \"nova-cell1-conductor-db-sync-b4dsb\" (UID: \"dc899043-5f53-453c-bc00-0cc214647667\") " pod="openstack/nova-cell1-conductor-db-sync-b4dsb" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.961327 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fqng4" Oct 02 11:16:45 crc kubenswrapper[4766]: I1002 11:16:45.968829 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-b4dsb" Oct 02 11:16:46 crc kubenswrapper[4766]: I1002 11:16:46.043089 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fqng4" Oct 02 11:16:46 crc kubenswrapper[4766]: I1002 11:16:46.537058 4766 generic.go:334] "Generic (PLEG): container finished" podID="d260f950-8815-4b40-b2f7-5a27fca9690d" containerID="2758ac0da3918349e774e56df0a688cfe547b1d5b11b4b63da67dfac7a2ee514" exitCode=0 Oct 02 11:16:46 crc kubenswrapper[4766]: I1002 11:16:46.538734 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" event={"ID":"d260f950-8815-4b40-b2f7-5a27fca9690d","Type":"ContainerDied","Data":"2758ac0da3918349e774e56df0a688cfe547b1d5b11b4b63da67dfac7a2ee514"} Oct 02 11:16:46 crc kubenswrapper[4766]: I1002 11:16:46.538999 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" event={"ID":"d260f950-8815-4b40-b2f7-5a27fca9690d","Type":"ContainerStarted","Data":"7322eaa43fb6c2ec5f705855324843ac52fa37fd7bfa23a73a366c72060bdad3"} Oct 02 11:16:46 crc kubenswrapper[4766]: I1002 11:16:46.632877 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b4dsb"] Oct 02 11:16:47 crc kubenswrapper[4766]: I1002 11:16:47.555221 4766 generic.go:334] "Generic (PLEG): container finished" podID="51a14c59-9b2a-4470-af2a-918b717cd721" containerID="70e30137bc22caa61668617e1b1ac8762f9459d159ae604241ffbe4ea42a4b2e" exitCode=0 Oct 02 11:16:47 crc kubenswrapper[4766]: I1002 11:16:47.555591 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szvhz" event={"ID":"51a14c59-9b2a-4470-af2a-918b717cd721","Type":"ContainerDied","Data":"70e30137bc22caa61668617e1b1ac8762f9459d159ae604241ffbe4ea42a4b2e"} Oct 02 11:16:47 crc kubenswrapper[4766]: I1002 11:16:47.566975 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" event={"ID":"d260f950-8815-4b40-b2f7-5a27fca9690d","Type":"ContainerStarted","Data":"64f328e55f1eb3dace68499ae2f4a77afaef7496ecfc3efc0ef9ce8d165784dc"} Oct 02 11:16:47 crc kubenswrapper[4766]: I1002 11:16:47.567292 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:47 crc kubenswrapper[4766]: I1002 11:16:47.578640 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f382271-dcd5-4199-a98a-490ddb92e1b4","Type":"ContainerStarted","Data":"d68b8a196b3bc7566293f1f4bf2cfaa3d995d0598f5e72de076da6feb093d26b"} Oct 02 11:16:47 crc kubenswrapper[4766]: I1002 11:16:47.603643 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-b4dsb" event={"ID":"dc899043-5f53-453c-bc00-0cc214647667","Type":"ContainerStarted","Data":"5be38962640da76b9046534d30afa195b17c52471dba02787f5ee4b789836031"} Oct 02 11:16:47 crc kubenswrapper[4766]: I1002 11:16:47.603690 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-b4dsb" event={"ID":"dc899043-5f53-453c-bc00-0cc214647667","Type":"ContainerStarted","Data":"f7dbfd4a8d38956ea1477eae8dbf28083bf0c9a70c97331dc29578def72e2b60"} Oct 02 11:16:47 crc kubenswrapper[4766]: I1002 11:16:47.618069 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" podStartSLOduration=4.6180471149999995 
podStartE2EDuration="4.618047115s" podCreationTimestamp="2025-10-02 11:16:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:16:47.612159617 +0000 UTC m=+1522.555030571" watchObservedRunningTime="2025-10-02 11:16:47.618047115 +0000 UTC m=+1522.560918059" Oct 02 11:16:48 crc kubenswrapper[4766]: I1002 11:16:48.128872 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fqng4"] Oct 02 11:16:48 crc kubenswrapper[4766]: I1002 11:16:48.129198 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fqng4" podUID="fe991bdf-f974-4959-bb0b-e10001c1c380" containerName="registry-server" containerID="cri-o://f9ae8d1bd91af50c0e00f98e594ee4555f7951d9a6144bf49b2eb1154160076b" gracePeriod=2 Oct 02 11:16:48 crc kubenswrapper[4766]: I1002 11:16:48.433552 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:16:48 crc kubenswrapper[4766]: I1002 11:16:48.443151 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:48 crc kubenswrapper[4766]: I1002 11:16:48.620510 4766 generic.go:334] "Generic (PLEG): container finished" podID="fe991bdf-f974-4959-bb0b-e10001c1c380" containerID="f9ae8d1bd91af50c0e00f98e594ee4555f7951d9a6144bf49b2eb1154160076b" exitCode=0 Oct 02 11:16:48 crc kubenswrapper[4766]: I1002 11:16:48.620591 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqng4" event={"ID":"fe991bdf-f974-4959-bb0b-e10001c1c380","Type":"ContainerDied","Data":"f9ae8d1bd91af50c0e00f98e594ee4555f7951d9a6144bf49b2eb1154160076b"} Oct 02 11:16:48 crc kubenswrapper[4766]: I1002 11:16:48.620935 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:16:48 crc kubenswrapper[4766]: I1002 11:16:48.646372 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.016407947 podStartE2EDuration="11.646351514s" podCreationTimestamp="2025-10-02 11:16:37 +0000 UTC" firstStartedPulling="2025-10-02 11:16:38.245466319 +0000 UTC m=+1513.188337263" lastFinishedPulling="2025-10-02 11:16:46.875409886 +0000 UTC m=+1521.818280830" observedRunningTime="2025-10-02 11:16:48.640102414 +0000 UTC m=+1523.582973378" watchObservedRunningTime="2025-10-02 11:16:48.646351514 +0000 UTC m=+1523.589222458" Oct 02 11:16:48 crc kubenswrapper[4766]: I1002 11:16:48.685148 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-b4dsb" podStartSLOduration=3.685117375 podStartE2EDuration="3.685117375s" podCreationTimestamp="2025-10-02 11:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:16:48.668258845 +0000 UTC m=+1523.611129789" watchObservedRunningTime="2025-10-02 11:16:48.685117375 +0000 UTC m=+1523.627988319" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.081900 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fqng4" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.149852 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe991bdf-f974-4959-bb0b-e10001c1c380-utilities\") pod \"fe991bdf-f974-4959-bb0b-e10001c1c380\" (UID: \"fe991bdf-f974-4959-bb0b-e10001c1c380\") " Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.150040 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn2s8\" (UniqueName: \"kubernetes.io/projected/fe991bdf-f974-4959-bb0b-e10001c1c380-kube-api-access-qn2s8\") pod \"fe991bdf-f974-4959-bb0b-e10001c1c380\" (UID: \"fe991bdf-f974-4959-bb0b-e10001c1c380\") " Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.150106 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe991bdf-f974-4959-bb0b-e10001c1c380-catalog-content\") pod \"fe991bdf-f974-4959-bb0b-e10001c1c380\" (UID: \"fe991bdf-f974-4959-bb0b-e10001c1c380\") " Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.150733 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe991bdf-f974-4959-bb0b-e10001c1c380-utilities" (OuterVolumeSpecName: "utilities") pod "fe991bdf-f974-4959-bb0b-e10001c1c380" (UID: "fe991bdf-f974-4959-bb0b-e10001c1c380"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.158759 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe991bdf-f974-4959-bb0b-e10001c1c380-kube-api-access-qn2s8" (OuterVolumeSpecName: "kube-api-access-qn2s8") pod "fe991bdf-f974-4959-bb0b-e10001c1c380" (UID: "fe991bdf-f974-4959-bb0b-e10001c1c380"). InnerVolumeSpecName "kube-api-access-qn2s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.243407 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe991bdf-f974-4959-bb0b-e10001c1c380-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe991bdf-f974-4959-bb0b-e10001c1c380" (UID: "fe991bdf-f974-4959-bb0b-e10001c1c380"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.252914 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe991bdf-f974-4959-bb0b-e10001c1c380-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.253031 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe991bdf-f974-4959-bb0b-e10001c1c380-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.253110 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn2s8\" (UniqueName: \"kubernetes.io/projected/fe991bdf-f974-4959-bb0b-e10001c1c380-kube-api-access-qn2s8\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.649820 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fqng4" event={"ID":"fe991bdf-f974-4959-bb0b-e10001c1c380","Type":"ContainerDied","Data":"398fd8a860f21b451c92daf46b1c80ca5fc15cec28904cf2021f7c2098353beb"} Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.650080 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fqng4" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.650113 4766 scope.go:117] "RemoveContainer" containerID="f9ae8d1bd91af50c0e00f98e594ee4555f7951d9a6144bf49b2eb1154160076b" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.658803 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c916a84f-283b-4dcf-b4d2-4324b24305c2","Type":"ContainerStarted","Data":"9ed5169edf1a2fb32bd47a1caf150cd24766b58c122d3a390c7723f5f0b79cd6"} Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.658843 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c916a84f-283b-4dcf-b4d2-4324b24305c2","Type":"ContainerStarted","Data":"8879d8aceba87260854a150fd7bd38d8150c42379def04081d786951f3f1911a"} Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.664109 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c","Type":"ContainerStarted","Data":"cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b"} Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.664162 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c","Type":"ContainerStarted","Data":"cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281"} Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.664335 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d5bd3ac9-66fe-4121-85d6-33c0651f9b9c" containerName="nova-metadata-log" containerID="cri-o://cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281" gracePeriod=30 Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.664462 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d5bd3ac9-66fe-4121-85d6-33c0651f9b9c" containerName="nova-metadata-metadata" containerID="cri-o://cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b" gracePeriod=30 Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.681353 4766 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"db100297-f73c-4f83-b6dc-2f4d9661123f","Type":"ContainerStarted","Data":"b6b20d6770b52679ed1ddfc4195e5c7c6a0b2ccd8ae24888f88c97c12c80f74f"} Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.681563 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="db100297-f73c-4f83-b6dc-2f4d9661123f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b6b20d6770b52679ed1ddfc4195e5c7c6a0b2ccd8ae24888f88c97c12c80f74f" gracePeriod=30 Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.683522 4766 scope.go:117] "RemoveContainer" containerID="72843f220341ecac2f876b6c9522e5107012e11c3baa2c75917d081cea48e947" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.684962 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.8409597189999998 podStartE2EDuration="8.684945671s" podCreationTimestamp="2025-10-02 11:16:43 +0000 UTC" firstStartedPulling="2025-10-02 11:16:44.845697347 +0000 UTC m=+1519.788568291" lastFinishedPulling="2025-10-02 11:16:50.689683299 +0000 UTC m=+1525.632554243" observedRunningTime="2025-10-02 11:16:51.678207475 +0000 UTC m=+1526.621078419" watchObservedRunningTime="2025-10-02 11:16:51.684945671 +0000 UTC m=+1526.627816615" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.697833 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6c5de1c0-341f-411b-b68a-c95f34f52362","Type":"ContainerStarted","Data":"0655f7e6c6a04cef7c1d5254e573657b1fb792ab814da3fda8e05b1969570ef7"} Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.711066 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fqng4"] Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.727867 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fqng4"] Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.729798 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szvhz" event={"ID":"51a14c59-9b2a-4470-af2a-918b717cd721","Type":"ContainerStarted","Data":"b40d0a1d0332dbd4545ef9b931c18a6d54125227d972ed85b2d13d8a295b7afc"} Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.734833 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.22229785 podStartE2EDuration="8.734813837s" podCreationTimestamp="2025-10-02 11:16:43 +0000 UTC" firstStartedPulling="2025-10-02 11:16:45.183955569 +0000 UTC m=+1520.126826513" lastFinishedPulling="2025-10-02 11:16:50.696471556 +0000 UTC m=+1525.639342500" observedRunningTime="2025-10-02 11:16:51.716367487 +0000 UTC m=+1526.659238431" watchObservedRunningTime="2025-10-02 11:16:51.734813837 +0000 UTC m=+1526.677684781" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.747713 4766 scope.go:117] "RemoveContainer" containerID="a1563d2c6cc67fbfc0c650c00907947ca89e6f47afa6a831431516b6cbf09748" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.750117 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.332300418 podStartE2EDuration="8.750100605s" podCreationTimestamp="2025-10-02 11:16:43 +0000 UTC" firstStartedPulling="2025-10-02 11:16:45.230595201 +0000 UTC m=+1520.173466145" lastFinishedPulling="2025-10-02 
11:16:50.648395388 +0000 UTC m=+1525.591266332" observedRunningTime="2025-10-02 11:16:51.739097463 +0000 UTC m=+1526.681968407" watchObservedRunningTime="2025-10-02 11:16:51.750100605 +0000 UTC m=+1526.692971559" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.764765 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.109604243 podStartE2EDuration="8.764746774s" podCreationTimestamp="2025-10-02 11:16:43 +0000 UTC" firstStartedPulling="2025-10-02 11:16:44.993246597 +0000 UTC m=+1519.936117541" lastFinishedPulling="2025-10-02 11:16:50.648389118 +0000 UTC m=+1525.591260072" observedRunningTime="2025-10-02 11:16:51.761234802 +0000 UTC m=+1526.704105746" watchObservedRunningTime="2025-10-02 11:16:51.764746774 +0000 UTC m=+1526.707617718" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.794450 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-szvhz" podStartSLOduration=3.390246672 podStartE2EDuration="8.794431724s" podCreationTimestamp="2025-10-02 11:16:43 +0000 UTC" firstStartedPulling="2025-10-02 11:16:45.457907203 +0000 UTC m=+1520.400778147" lastFinishedPulling="2025-10-02 11:16:50.862092255 +0000 UTC m=+1525.804963199" observedRunningTime="2025-10-02 11:16:51.782383249 +0000 UTC m=+1526.725254193" watchObservedRunningTime="2025-10-02 11:16:51.794431724 +0000 UTC m=+1526.737302668" Oct 02 11:16:51 crc kubenswrapper[4766]: I1002 11:16:51.896042 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe991bdf-f974-4959-bb0b-e10001c1c380" path="/var/lib/kubelet/pods/fe991bdf-f974-4959-bb0b-e10001c1c380/volumes" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.471876 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.589533 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-logs\") pod \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\" (UID: \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\") " Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.589617 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-config-data\") pod \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\" (UID: \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\") " Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.589770 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-combined-ca-bundle\") pod \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\" (UID: \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\") " Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.589923 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-logs" (OuterVolumeSpecName: "logs") pod "d5bd3ac9-66fe-4121-85d6-33c0651f9b9c" (UID: "d5bd3ac9-66fe-4121-85d6-33c0651f9b9c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.590637 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpgsf\" (UniqueName: \"kubernetes.io/projected/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-kube-api-access-lpgsf\") pod \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\" (UID: \"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c\") " Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.591163 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.595541 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-kube-api-access-lpgsf" (OuterVolumeSpecName: "kube-api-access-lpgsf") pod "d5bd3ac9-66fe-4121-85d6-33c0651f9b9c" (UID: "d5bd3ac9-66fe-4121-85d6-33c0651f9b9c"). InnerVolumeSpecName "kube-api-access-lpgsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.620778 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-config-data" (OuterVolumeSpecName: "config-data") pod "d5bd3ac9-66fe-4121-85d6-33c0651f9b9c" (UID: "d5bd3ac9-66fe-4121-85d6-33c0651f9b9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.624588 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5bd3ac9-66fe-4121-85d6-33c0651f9b9c" (UID: "d5bd3ac9-66fe-4121-85d6-33c0651f9b9c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.692526 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.692562 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.692587 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpgsf\" (UniqueName: \"kubernetes.io/projected/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c-kube-api-access-lpgsf\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.741727 4766 generic.go:334] "Generic (PLEG): container finished" podID="d5bd3ac9-66fe-4121-85d6-33c0651f9b9c" containerID="cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b" exitCode=0 Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.742076 4766 generic.go:334] "Generic (PLEG): container finished" podID="d5bd3ac9-66fe-4121-85d6-33c0651f9b9c" containerID="cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281" exitCode=143 Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.742130 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c","Type":"ContainerDied","Data":"cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b"} Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.742163 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c","Type":"ContainerDied","Data":"cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281"} Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.742178 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5bd3ac9-66fe-4121-85d6-33c0651f9b9c","Type":"ContainerDied","Data":"9b276da450b01b2110718bb0b8f43b4c7e78cc30480ee029918e56f9ee6662c4"} Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.742198 4766 scope.go:117] "RemoveContainer" containerID="cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.742357 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.773780 4766 scope.go:117] "RemoveContainer" containerID="cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.776684 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.790460 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.800220 4766 scope.go:117] "RemoveContainer" containerID="cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b" Oct 02 11:16:52 crc kubenswrapper[4766]: E1002 11:16:52.801862 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b\": container with ID starting with cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b not found: ID does not exist" containerID="cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.801896 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b"} err="failed to get container status \"cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b\": rpc error: code = NotFound desc = could not find container \"cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b\": container with ID starting with cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b not found: ID does not exist" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.801917 4766 scope.go:117] "RemoveContainer" containerID="cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281" Oct 02 11:16:52 crc kubenswrapper[4766]: E1002 11:16:52.808544 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281\": container with ID starting with cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281 not found: ID does not exist" containerID="cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.808595 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281"} err="failed to get container status \"cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281\": rpc error: code = NotFound desc = could not find container \"cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281\": container with ID starting with cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281 not found: ID does not exist" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.808633 4766 scope.go:117] "RemoveContainer" containerID="cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.808947 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b"} err="failed to get container status \"cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b\": rpc error: code = NotFound 
desc = could not find container \"cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b\": container with ID starting with cf38ebac6195931e004790f6b3f8772589d09799cd3b2e702f0dbb81c120ff3b not found: ID does not exist" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.808976 4766 scope.go:117] "RemoveContainer" containerID="cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.809544 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281"} err="failed to get container status \"cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281\": rpc error: code = NotFound desc = could not find container \"cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281\": container with ID starting with cca8df2ffc4583b130c50c28dbb45acee1c6e328e40caa804042402005dea281 not found: ID does not exist" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.816627 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:52 crc kubenswrapper[4766]: E1002 11:16:52.817034 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5bd3ac9-66fe-4121-85d6-33c0651f9b9c" containerName="nova-metadata-log" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.817049 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5bd3ac9-66fe-4121-85d6-33c0651f9b9c" containerName="nova-metadata-log" Oct 02 11:16:52 crc kubenswrapper[4766]: E1002 11:16:52.817062 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe991bdf-f974-4959-bb0b-e10001c1c380" containerName="extract-utilities" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.817068 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe991bdf-f974-4959-bb0b-e10001c1c380" containerName="extract-utilities" Oct 02 11:16:52 crc kubenswrapper[4766]: E1002 11:16:52.817082 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5bd3ac9-66fe-4121-85d6-33c0651f9b9c" containerName="nova-metadata-metadata" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.817088 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5bd3ac9-66fe-4121-85d6-33c0651f9b9c" containerName="nova-metadata-metadata" Oct 02 11:16:52 crc kubenswrapper[4766]: E1002 11:16:52.817108 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe991bdf-f974-4959-bb0b-e10001c1c380" containerName="extract-content" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.817115 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe991bdf-f974-4959-bb0b-e10001c1c380" containerName="extract-content" Oct 02 11:16:52 crc kubenswrapper[4766]: E1002 11:16:52.817134 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe991bdf-f974-4959-bb0b-e10001c1c380" containerName="registry-server" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.817147 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe991bdf-f974-4959-bb0b-e10001c1c380" containerName="registry-server" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.817311 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5bd3ac9-66fe-4121-85d6-33c0651f9b9c" containerName="nova-metadata-metadata" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.817338 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5bd3ac9-66fe-4121-85d6-33c0651f9b9c" 
containerName="nova-metadata-log" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.817351 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe991bdf-f974-4959-bb0b-e10001c1c380" containerName="registry-server" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.818560 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.823897 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.824694 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.832443 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.898413 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-config-data\") pod \"nova-metadata-0\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " pod="openstack/nova-metadata-0" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.898534 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b00c0dab-8c50-4401-9e82-40234dd5d6d5-logs\") pod \"nova-metadata-0\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " pod="openstack/nova-metadata-0" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.898600 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " pod="openstack/nova-metadata-0" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.898667 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " pod="openstack/nova-metadata-0" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.898721 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88mg8\" (UniqueName: \"kubernetes.io/projected/b00c0dab-8c50-4401-9e82-40234dd5d6d5-kube-api-access-88mg8\") pod \"nova-metadata-0\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " pod="openstack/nova-metadata-0" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.999827 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " pod="openstack/nova-metadata-0" Oct 02 11:16:52 crc kubenswrapper[4766]: I1002 11:16:52.999903 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88mg8\" (UniqueName: \"kubernetes.io/projected/b00c0dab-8c50-4401-9e82-40234dd5d6d5-kube-api-access-88mg8\") pod \"nova-metadata-0\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " 
pod="openstack/nova-metadata-0" Oct 02 11:16:53 crc kubenswrapper[4766]: I1002 11:16:52.999977 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-config-data\") pod \"nova-metadata-0\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " pod="openstack/nova-metadata-0" Oct 02 11:16:53 crc kubenswrapper[4766]: I1002 11:16:53.000032 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b00c0dab-8c50-4401-9e82-40234dd5d6d5-logs\") pod \"nova-metadata-0\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " pod="openstack/nova-metadata-0" Oct 02 11:16:53 crc kubenswrapper[4766]: I1002 11:16:53.000267 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " pod="openstack/nova-metadata-0" Oct 02 11:16:53 crc kubenswrapper[4766]: I1002 11:16:53.001723 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b00c0dab-8c50-4401-9e82-40234dd5d6d5-logs\") pod \"nova-metadata-0\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " pod="openstack/nova-metadata-0" Oct 02 11:16:53 crc kubenswrapper[4766]: I1002 11:16:53.006321 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " pod="openstack/nova-metadata-0" Oct 02 11:16:53 crc kubenswrapper[4766]: I1002 11:16:53.011204 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-config-data\") pod \"nova-metadata-0\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " pod="openstack/nova-metadata-0" Oct 02 11:16:53 crc kubenswrapper[4766]: I1002 11:16:53.020266 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " pod="openstack/nova-metadata-0" Oct 02 11:16:53 crc kubenswrapper[4766]: I1002 11:16:53.023257 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88mg8\" (UniqueName: \"kubernetes.io/projected/b00c0dab-8c50-4401-9e82-40234dd5d6d5-kube-api-access-88mg8\") pod \"nova-metadata-0\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " pod="openstack/nova-metadata-0" Oct 02 11:16:53 crc kubenswrapper[4766]: I1002 11:16:53.155290 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:16:53 crc kubenswrapper[4766]: I1002 11:16:53.609118 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:53 crc kubenswrapper[4766]: W1002 11:16:53.611614 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb00c0dab_8c50_4401_9e82_40234dd5d6d5.slice/crio-0c1937236e56a04004a39c9fb36888b15bf5d897cd46a4caa5bc3c494f07c411 WatchSource:0}: Error finding container 0c1937236e56a04004a39c9fb36888b15bf5d897cd46a4caa5bc3c494f07c411: Status 404 returned error can't find the container with id 0c1937236e56a04004a39c9fb36888b15bf5d897cd46a4caa5bc3c494f07c411 Oct 02 11:16:53 crc kubenswrapper[4766]: I1002 11:16:53.759563 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b00c0dab-8c50-4401-9e82-40234dd5d6d5","Type":"ContainerStarted","Data":"0c1937236e56a04004a39c9fb36888b15bf5d897cd46a4caa5bc3c494f07c411"} Oct 02 11:16:53 crc kubenswrapper[4766]: I1002 11:16:53.768000 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:16:53 crc kubenswrapper[4766]: I1002 11:16:53.768291 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:16:53 crc kubenswrapper[4766]: I1002 11:16:53.892911 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5bd3ac9-66fe-4121-85d6-33c0651f9b9c" path="/var/lib/kubelet/pods/d5bd3ac9-66fe-4121-85d6-33c0651f9b9c/volumes" Oct 02 11:16:53 crc kubenswrapper[4766]: I1002 11:16:53.953623 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.072368 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-szvhz" Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.072419 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-szvhz" Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.432317 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.432384 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.432432 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.433045 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, 
will be restarted" Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.433113 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" gracePeriod=600 Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.518015 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.519362 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.575715 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:16:54 crc kubenswrapper[4766]: E1002 11:16:54.581599 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.588281 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.671075 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-ljnft"] Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.671341 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-ljnft" podUID="e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2" containerName="dnsmasq-dns" containerID="cri-o://20b9b7aff7b3a88aadfaa1c77608b6649cd13f567278c0707c1122d97f8a9f6d" gracePeriod=10 Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.777295 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" exitCode=0 Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.777354 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e"} Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.777387 4766 scope.go:117] "RemoveContainer" containerID="586f742ea27e273779868792840bda390cd263c60dd6b64b6d933d49d83569e4" Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.777987 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:16:54 crc kubenswrapper[4766]: E1002 11:16:54.778211 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" 
podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.795537 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b00c0dab-8c50-4401-9e82-40234dd5d6d5","Type":"ContainerStarted","Data":"6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a"} Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.795587 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b00c0dab-8c50-4401-9e82-40234dd5d6d5","Type":"ContainerStarted","Data":"4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df"} Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.805098 4766 generic.go:334] "Generic (PLEG): container finished" podID="0ecd07b2-f47c-44c2-8c54-943c8c91ef0f" containerID="3f01021d7015f6ce7b28a4464cf46b4c73b350981093792e92fec171df9fdb63" exitCode=0 Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.805434 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2jv7w" event={"ID":"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f","Type":"ContainerDied","Data":"3f01021d7015f6ce7b28a4464cf46b4c73b350981093792e92fec171df9fdb63"} Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.854047 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c916a84f-283b-4dcf-b4d2-4324b24305c2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.854288 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c916a84f-283b-4dcf-b4d2-4324b24305c2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.868299 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.868276569 podStartE2EDuration="2.868276569s" podCreationTimestamp="2025-10-02 11:16:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:16:54.845112547 +0000 UTC m=+1529.787983491" watchObservedRunningTime="2025-10-02 11:16:54.868276569 +0000 UTC m=+1529.811147523" Oct 02 11:16:54 crc kubenswrapper[4766]: I1002 11:16:54.898275 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.144375 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-szvhz" podUID="51a14c59-9b2a-4470-af2a-918b717cd721" containerName="registry-server" probeResult="failure" output=< Oct 02 11:16:55 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Oct 02 11:16:55 crc kubenswrapper[4766]: > Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.283897 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.359232 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snmnf\" (UniqueName: \"kubernetes.io/projected/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-kube-api-access-snmnf\") pod \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.359986 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-ovsdbserver-nb\") pod \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.360945 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-dns-svc\") pod \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.361038 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-ovsdbserver-sb\") pod \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.361120 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-config\") pod \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.361168 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-dns-swift-storage-0\") pod \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\" (UID: \"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2\") " Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.388760 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-kube-api-access-snmnf" (OuterVolumeSpecName: "kube-api-access-snmnf") pod "e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2" (UID: "e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2"). InnerVolumeSpecName "kube-api-access-snmnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.466797 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snmnf\" (UniqueName: \"kubernetes.io/projected/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-kube-api-access-snmnf\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.485267 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2" (UID: "e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.494683 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2" (UID: "e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.518195 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2" (UID: "e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.523645 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-config" (OuterVolumeSpecName: "config") pod "e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2" (UID: "e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.539617 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2" (UID: "e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.568318 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.568360 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.568375 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.568387 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.568398 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.817650 4766 generic.go:334] "Generic (PLEG): container finished" podID="e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2" containerID="20b9b7aff7b3a88aadfaa1c77608b6649cd13f567278c0707c1122d97f8a9f6d" exitCode=0 Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.817956 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-ljnft" 
event={"ID":"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2","Type":"ContainerDied","Data":"20b9b7aff7b3a88aadfaa1c77608b6649cd13f567278c0707c1122d97f8a9f6d"} Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.818015 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-ljnft" event={"ID":"e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2","Type":"ContainerDied","Data":"b8afa8f0e538fbf83f36c50f1f32969785af0f308a07b7113c9159f9e90f60b5"} Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.818049 4766 scope.go:117] "RemoveContainer" containerID="20b9b7aff7b3a88aadfaa1c77608b6649cd13f567278c0707c1122d97f8a9f6d" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.818250 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-ljnft" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.854565 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-ljnft"] Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.866568 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-ljnft"] Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.866832 4766 scope.go:117] "RemoveContainer" containerID="ec31f616b968fc330e876b786269e916f1f32e69467c6916e8777e3aeaaeeff2" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.897647 4766 scope.go:117] "RemoveContainer" containerID="20b9b7aff7b3a88aadfaa1c77608b6649cd13f567278c0707c1122d97f8a9f6d" Oct 02 11:16:55 crc kubenswrapper[4766]: E1002 11:16:55.901862 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20b9b7aff7b3a88aadfaa1c77608b6649cd13f567278c0707c1122d97f8a9f6d\": container with ID starting with 20b9b7aff7b3a88aadfaa1c77608b6649cd13f567278c0707c1122d97f8a9f6d not found: ID does not exist" containerID="20b9b7aff7b3a88aadfaa1c77608b6649cd13f567278c0707c1122d97f8a9f6d" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.901903 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20b9b7aff7b3a88aadfaa1c77608b6649cd13f567278c0707c1122d97f8a9f6d"} err="failed to get container status \"20b9b7aff7b3a88aadfaa1c77608b6649cd13f567278c0707c1122d97f8a9f6d\": rpc error: code = NotFound desc = could not find container \"20b9b7aff7b3a88aadfaa1c77608b6649cd13f567278c0707c1122d97f8a9f6d\": container with ID starting with 20b9b7aff7b3a88aadfaa1c77608b6649cd13f567278c0707c1122d97f8a9f6d not found: ID does not exist" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.901927 4766 scope.go:117] "RemoveContainer" containerID="ec31f616b968fc330e876b786269e916f1f32e69467c6916e8777e3aeaaeeff2" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.904595 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2" path="/var/lib/kubelet/pods/e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2/volumes" Oct 02 11:16:55 crc kubenswrapper[4766]: E1002 11:16:55.905867 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec31f616b968fc330e876b786269e916f1f32e69467c6916e8777e3aeaaeeff2\": container with ID starting with ec31f616b968fc330e876b786269e916f1f32e69467c6916e8777e3aeaaeeff2 not found: ID does not exist" containerID="ec31f616b968fc330e876b786269e916f1f32e69467c6916e8777e3aeaaeeff2" Oct 02 11:16:55 crc kubenswrapper[4766]: I1002 11:16:55.905893 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec31f616b968fc330e876b786269e916f1f32e69467c6916e8777e3aeaaeeff2"} err="failed to get container status \"ec31f616b968fc330e876b786269e916f1f32e69467c6916e8777e3aeaaeeff2\": rpc error: code = NotFound desc = could not find container \"ec31f616b968fc330e876b786269e916f1f32e69467c6916e8777e3aeaaeeff2\": container with ID starting with ec31f616b968fc330e876b786269e916f1f32e69467c6916e8777e3aeaaeeff2 not found: ID does not exist" Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.308166 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2jv7w" Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.382698 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-config-data\") pod \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\" (UID: \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\") " Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.382882 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt68p\" (UniqueName: \"kubernetes.io/projected/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-kube-api-access-kt68p\") pod \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\" (UID: \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\") " Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.382977 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-combined-ca-bundle\") pod \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\" (UID: \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\") " Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.382996 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-scripts\") pod \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\" (UID: \"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f\") " Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.390755 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-kube-api-access-kt68p" (OuterVolumeSpecName: "kube-api-access-kt68p") pod "0ecd07b2-f47c-44c2-8c54-943c8c91ef0f" (UID: "0ecd07b2-f47c-44c2-8c54-943c8c91ef0f"). InnerVolumeSpecName "kube-api-access-kt68p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.403167 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-scripts" (OuterVolumeSpecName: "scripts") pod "0ecd07b2-f47c-44c2-8c54-943c8c91ef0f" (UID: "0ecd07b2-f47c-44c2-8c54-943c8c91ef0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.413898 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ecd07b2-f47c-44c2-8c54-943c8c91ef0f" (UID: "0ecd07b2-f47c-44c2-8c54-943c8c91ef0f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.417300 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-config-data" (OuterVolumeSpecName: "config-data") pod "0ecd07b2-f47c-44c2-8c54-943c8c91ef0f" (UID: "0ecd07b2-f47c-44c2-8c54-943c8c91ef0f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.484989 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt68p\" (UniqueName: \"kubernetes.io/projected/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-kube-api-access-kt68p\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.485032 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.485041 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.485053 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.830458 4766 generic.go:334] "Generic (PLEG): container finished" podID="dc899043-5f53-453c-bc00-0cc214647667" containerID="5be38962640da76b9046534d30afa195b17c52471dba02787f5ee4b789836031" exitCode=0 Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.830539 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-b4dsb" event={"ID":"dc899043-5f53-453c-bc00-0cc214647667","Type":"ContainerDied","Data":"5be38962640da76b9046534d30afa195b17c52471dba02787f5ee4b789836031"} Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.833052 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-2jv7w" event={"ID":"0ecd07b2-f47c-44c2-8c54-943c8c91ef0f","Type":"ContainerDied","Data":"c94344fb3c90a120431952931b7bd2e2242e517d0a5c250099f91150fae722ee"} Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.833097 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c94344fb3c90a120431952931b7bd2e2242e517d0a5c250099f91150fae722ee" Oct 02 11:16:56 crc kubenswrapper[4766]: I1002 11:16:56.833068 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-2jv7w" Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.056157 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.067377 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.067733 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c916a84f-283b-4dcf-b4d2-4324b24305c2" containerName="nova-api-api" containerID="cri-o://9ed5169edf1a2fb32bd47a1caf150cd24766b58c122d3a390c7723f5f0b79cd6" gracePeriod=30 Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.067719 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c916a84f-283b-4dcf-b4d2-4324b24305c2" containerName="nova-api-log" containerID="cri-o://8879d8aceba87260854a150fd7bd38d8150c42379def04081d786951f3f1911a" gracePeriod=30 Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.088056 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.088321 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b00c0dab-8c50-4401-9e82-40234dd5d6d5" containerName="nova-metadata-log" containerID="cri-o://4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df" gracePeriod=30 Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.088378 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b00c0dab-8c50-4401-9e82-40234dd5d6d5" containerName="nova-metadata-metadata" containerID="cri-o://6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a" gracePeriod=30 Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.706681 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.810941 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-combined-ca-bundle\") pod \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.811071 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88mg8\" (UniqueName: \"kubernetes.io/projected/b00c0dab-8c50-4401-9e82-40234dd5d6d5-kube-api-access-88mg8\") pod \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.811109 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-config-data\") pod \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.811838 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-nova-metadata-tls-certs\") pod \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.811979 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b00c0dab-8c50-4401-9e82-40234dd5d6d5-logs\") pod \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\" (UID: \"b00c0dab-8c50-4401-9e82-40234dd5d6d5\") " Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.816665 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b00c0dab-8c50-4401-9e82-40234dd5d6d5-logs" (OuterVolumeSpecName: "logs") pod "b00c0dab-8c50-4401-9e82-40234dd5d6d5" (UID: "b00c0dab-8c50-4401-9e82-40234dd5d6d5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.819841 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b00c0dab-8c50-4401-9e82-40234dd5d6d5-kube-api-access-88mg8" (OuterVolumeSpecName: "kube-api-access-88mg8") pod "b00c0dab-8c50-4401-9e82-40234dd5d6d5" (UID: "b00c0dab-8c50-4401-9e82-40234dd5d6d5"). InnerVolumeSpecName "kube-api-access-88mg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.861233 4766 generic.go:334] "Generic (PLEG): container finished" podID="c916a84f-283b-4dcf-b4d2-4324b24305c2" containerID="8879d8aceba87260854a150fd7bd38d8150c42379def04081d786951f3f1911a" exitCode=143 Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.861315 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c916a84f-283b-4dcf-b4d2-4324b24305c2","Type":"ContainerDied","Data":"8879d8aceba87260854a150fd7bd38d8150c42379def04081d786951f3f1911a"} Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.869167 4766 generic.go:334] "Generic (PLEG): container finished" podID="b00c0dab-8c50-4401-9e82-40234dd5d6d5" containerID="6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a" exitCode=0 Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.869209 4766 generic.go:334] "Generic (PLEG): container finished" podID="b00c0dab-8c50-4401-9e82-40234dd5d6d5" containerID="4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df" exitCode=143 Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.869229 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b00c0dab-8c50-4401-9e82-40234dd5d6d5" (UID: "b00c0dab-8c50-4401-9e82-40234dd5d6d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.869281 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b00c0dab-8c50-4401-9e82-40234dd5d6d5","Type":"ContainerDied","Data":"6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a"} Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.869318 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b00c0dab-8c50-4401-9e82-40234dd5d6d5","Type":"ContainerDied","Data":"4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df"} Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.869328 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b00c0dab-8c50-4401-9e82-40234dd5d6d5","Type":"ContainerDied","Data":"0c1937236e56a04004a39c9fb36888b15bf5d897cd46a4caa5bc3c494f07c411"} Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.869346 4766 scope.go:117] "RemoveContainer" containerID="6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a" Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.869543 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.869872 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6c5de1c0-341f-411b-b68a-c95f34f52362" containerName="nova-scheduler-scheduler" containerID="cri-o://0655f7e6c6a04cef7c1d5254e573657b1fb792ab814da3fda8e05b1969570ef7" gracePeriod=30 Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.878953 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-config-data" (OuterVolumeSpecName: "config-data") pod "b00c0dab-8c50-4401-9e82-40234dd5d6d5" (UID: "b00c0dab-8c50-4401-9e82-40234dd5d6d5"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.903009 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b00c0dab-8c50-4401-9e82-40234dd5d6d5" (UID: "b00c0dab-8c50-4401-9e82-40234dd5d6d5"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.914301 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88mg8\" (UniqueName: \"kubernetes.io/projected/b00c0dab-8c50-4401-9e82-40234dd5d6d5-kube-api-access-88mg8\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.914361 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.914377 4766 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.914390 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b00c0dab-8c50-4401-9e82-40234dd5d6d5-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.914402 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b00c0dab-8c50-4401-9e82-40234dd5d6d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:57 crc kubenswrapper[4766]: I1002 11:16:57.995291 4766 scope.go:117] "RemoveContainer" containerID="4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.014711 4766 scope.go:117] "RemoveContainer" containerID="6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a" Oct 02 11:16:58 crc kubenswrapper[4766]: E1002 11:16:58.016918 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a\": container with ID starting with 6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a not found: ID does not exist" containerID="6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.016967 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a"} err="failed to get container status \"6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a\": rpc error: code = NotFound desc = could not find container \"6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a\": container with ID starting with 6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a not found: ID does not exist" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.016998 4766 scope.go:117] "RemoveContainer" containerID="4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df" Oct 02 11:16:58 crc kubenswrapper[4766]: E1002 11:16:58.017528 
4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df\": container with ID starting with 4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df not found: ID does not exist" containerID="4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.017576 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df"} err="failed to get container status \"4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df\": rpc error: code = NotFound desc = could not find container \"4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df\": container with ID starting with 4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df not found: ID does not exist" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.017622 4766 scope.go:117] "RemoveContainer" containerID="6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.021037 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a"} err="failed to get container status \"6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a\": rpc error: code = NotFound desc = could not find container \"6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a\": container with ID starting with 6c49f6dd046ee619395c2f623a6fca3c7061660cc438883a139a06c55af6e22a not found: ID does not exist" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.021075 4766 scope.go:117] "RemoveContainer" containerID="4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.025972 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df"} err="failed to get container status \"4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df\": rpc error: code = NotFound desc = could not find container \"4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df\": container with ID starting with 4e7cd171473eb5edc4b75cc4e7c50df1e4825fdb401bdbde3221dca4651893df not found: ID does not exist" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.168909 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-b4dsb" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.208879 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.219091 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-scripts\") pod \"dc899043-5f53-453c-bc00-0cc214647667\" (UID: \"dc899043-5f53-453c-bc00-0cc214647667\") " Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.219248 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2mxk\" (UniqueName: \"kubernetes.io/projected/dc899043-5f53-453c-bc00-0cc214647667-kube-api-access-x2mxk\") pod \"dc899043-5f53-453c-bc00-0cc214647667\" (UID: \"dc899043-5f53-453c-bc00-0cc214647667\") " Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.219318 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-combined-ca-bundle\") pod \"dc899043-5f53-453c-bc00-0cc214647667\" (UID: \"dc899043-5f53-453c-bc00-0cc214647667\") " Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.219414 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-config-data\") pod \"dc899043-5f53-453c-bc00-0cc214647667\" (UID: \"dc899043-5f53-453c-bc00-0cc214647667\") " Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.221600 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.239835 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-scripts" (OuterVolumeSpecName: "scripts") pod "dc899043-5f53-453c-bc00-0cc214647667" (UID: "dc899043-5f53-453c-bc00-0cc214647667"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.240305 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc899043-5f53-453c-bc00-0cc214647667-kube-api-access-x2mxk" (OuterVolumeSpecName: "kube-api-access-x2mxk") pod "dc899043-5f53-453c-bc00-0cc214647667" (UID: "dc899043-5f53-453c-bc00-0cc214647667"). InnerVolumeSpecName "kube-api-access-x2mxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.250663 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:58 crc kubenswrapper[4766]: E1002 11:16:58.251562 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc899043-5f53-453c-bc00-0cc214647667" containerName="nova-cell1-conductor-db-sync" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.251598 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc899043-5f53-453c-bc00-0cc214647667" containerName="nova-cell1-conductor-db-sync" Oct 02 11:16:58 crc kubenswrapper[4766]: E1002 11:16:58.251640 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00c0dab-8c50-4401-9e82-40234dd5d6d5" containerName="nova-metadata-metadata" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.251650 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00c0dab-8c50-4401-9e82-40234dd5d6d5" containerName="nova-metadata-metadata" Oct 02 11:16:58 crc kubenswrapper[4766]: E1002 11:16:58.251667 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2" containerName="dnsmasq-dns" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.251677 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2" containerName="dnsmasq-dns" Oct 02 11:16:58 crc kubenswrapper[4766]: E1002 11:16:58.251712 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2" containerName="init" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.251723 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2" containerName="init" Oct 02 11:16:58 crc kubenswrapper[4766]: E1002 11:16:58.251749 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b00c0dab-8c50-4401-9e82-40234dd5d6d5" containerName="nova-metadata-log" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.251759 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b00c0dab-8c50-4401-9e82-40234dd5d6d5" containerName="nova-metadata-log" Oct 02 11:16:58 crc kubenswrapper[4766]: E1002 11:16:58.251773 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ecd07b2-f47c-44c2-8c54-943c8c91ef0f" containerName="nova-manage" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.251780 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ecd07b2-f47c-44c2-8c54-943c8c91ef0f" containerName="nova-manage" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.252022 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc899043-5f53-453c-bc00-0cc214647667" containerName="nova-cell1-conductor-db-sync" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.252039 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b00c0dab-8c50-4401-9e82-40234dd5d6d5" containerName="nova-metadata-log" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.252048 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b00c0dab-8c50-4401-9e82-40234dd5d6d5" containerName="nova-metadata-metadata" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.252060 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4f37fd8-7e80-4c40-a7d2-7e7d78e60cb2" containerName="dnsmasq-dns" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.252074 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0ecd07b2-f47c-44c2-8c54-943c8c91ef0f" containerName="nova-manage" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.253468 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.259091 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.259897 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.260720 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-config-data" (OuterVolumeSpecName: "config-data") pod "dc899043-5f53-453c-bc00-0cc214647667" (UID: "dc899043-5f53-453c-bc00-0cc214647667"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.263550 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.281920 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc899043-5f53-453c-bc00-0cc214647667" (UID: "dc899043-5f53-453c-bc00-0cc214647667"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.321638 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d8b9b37-94aa-4611-b9aa-08d55c42987b-logs\") pod \"nova-metadata-0\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.321826 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.321868 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrcpt\" (UniqueName: \"kubernetes.io/projected/1d8b9b37-94aa-4611-b9aa-08d55c42987b-kube-api-access-jrcpt\") pod \"nova-metadata-0\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.321951 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.322094 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-config-data\") pod \"nova-metadata-0\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc 
kubenswrapper[4766]: I1002 11:16:58.322270 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.322297 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.322309 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc899043-5f53-453c-bc00-0cc214647667-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.322321 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2mxk\" (UniqueName: \"kubernetes.io/projected/dc899043-5f53-453c-bc00-0cc214647667-kube-api-access-x2mxk\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.424117 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.424168 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrcpt\" (UniqueName: \"kubernetes.io/projected/1d8b9b37-94aa-4611-b9aa-08d55c42987b-kube-api-access-jrcpt\") pod \"nova-metadata-0\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.424197 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.424323 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-config-data\") pod \"nova-metadata-0\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.424376 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d8b9b37-94aa-4611-b9aa-08d55c42987b-logs\") pod \"nova-metadata-0\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.424752 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d8b9b37-94aa-4611-b9aa-08d55c42987b-logs\") pod \"nova-metadata-0\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.428158 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " 
pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.439030 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-config-data\") pod \"nova-metadata-0\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.439296 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.441465 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrcpt\" (UniqueName: \"kubernetes.io/projected/1d8b9b37-94aa-4611-b9aa-08d55c42987b-kube-api-access-jrcpt\") pod \"nova-metadata-0\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.580983 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.760550 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.831579 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vdrz\" (UniqueName: \"kubernetes.io/projected/6c5de1c0-341f-411b-b68a-c95f34f52362-kube-api-access-8vdrz\") pod \"6c5de1c0-341f-411b-b68a-c95f34f52362\" (UID: \"6c5de1c0-341f-411b-b68a-c95f34f52362\") " Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.831701 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5de1c0-341f-411b-b68a-c95f34f52362-combined-ca-bundle\") pod \"6c5de1c0-341f-411b-b68a-c95f34f52362\" (UID: \"6c5de1c0-341f-411b-b68a-c95f34f52362\") " Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.831769 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5de1c0-341f-411b-b68a-c95f34f52362-config-data\") pod \"6c5de1c0-341f-411b-b68a-c95f34f52362\" (UID: \"6c5de1c0-341f-411b-b68a-c95f34f52362\") " Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.837436 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c5de1c0-341f-411b-b68a-c95f34f52362-kube-api-access-8vdrz" (OuterVolumeSpecName: "kube-api-access-8vdrz") pod "6c5de1c0-341f-411b-b68a-c95f34f52362" (UID: "6c5de1c0-341f-411b-b68a-c95f34f52362"). InnerVolumeSpecName "kube-api-access-8vdrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.861852 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5de1c0-341f-411b-b68a-c95f34f52362-config-data" (OuterVolumeSpecName: "config-data") pod "6c5de1c0-341f-411b-b68a-c95f34f52362" (UID: "6c5de1c0-341f-411b-b68a-c95f34f52362"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.883865 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5de1c0-341f-411b-b68a-c95f34f52362-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c5de1c0-341f-411b-b68a-c95f34f52362" (UID: "6c5de1c0-341f-411b-b68a-c95f34f52362"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.888232 4766 generic.go:334] "Generic (PLEG): container finished" podID="6c5de1c0-341f-411b-b68a-c95f34f52362" containerID="0655f7e6c6a04cef7c1d5254e573657b1fb792ab814da3fda8e05b1969570ef7" exitCode=0 Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.888296 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6c5de1c0-341f-411b-b68a-c95f34f52362","Type":"ContainerDied","Data":"0655f7e6c6a04cef7c1d5254e573657b1fb792ab814da3fda8e05b1969570ef7"} Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.888321 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6c5de1c0-341f-411b-b68a-c95f34f52362","Type":"ContainerDied","Data":"0ed389aaa15e8fa60d11da0b016d5b229fd7788b8cc585f211306f6a150ea98e"} Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.888338 4766 scope.go:117] "RemoveContainer" containerID="0655f7e6c6a04cef7c1d5254e573657b1fb792ab814da3fda8e05b1969570ef7" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.888343 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.896694 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-b4dsb" event={"ID":"dc899043-5f53-453c-bc00-0cc214647667","Type":"ContainerDied","Data":"f7dbfd4a8d38956ea1477eae8dbf28083bf0c9a70c97331dc29578def72e2b60"} Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.896760 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7dbfd4a8d38956ea1477eae8dbf28083bf0c9a70c97331dc29578def72e2b60" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.899039 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-b4dsb" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.931844 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:16:58 crc kubenswrapper[4766]: E1002 11:16:58.932451 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5de1c0-341f-411b-b68a-c95f34f52362" containerName="nova-scheduler-scheduler" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.932475 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5de1c0-341f-411b-b68a-c95f34f52362" containerName="nova-scheduler-scheduler" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.934053 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5de1c0-341f-411b-b68a-c95f34f52362-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.934080 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5de1c0-341f-411b-b68a-c95f34f52362-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.934093 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vdrz\" (UniqueName: \"kubernetes.io/projected/6c5de1c0-341f-411b-b68a-c95f34f52362-kube-api-access-8vdrz\") on node \"crc\" DevicePath \"\"" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.935231 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5de1c0-341f-411b-b68a-c95f34f52362" containerName="nova-scheduler-scheduler" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.936056 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.938200 4766 scope.go:117] "RemoveContainer" containerID="0655f7e6c6a04cef7c1d5254e573657b1fb792ab814da3fda8e05b1969570ef7" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.938802 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 02 11:16:58 crc kubenswrapper[4766]: E1002 11:16:58.939145 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0655f7e6c6a04cef7c1d5254e573657b1fb792ab814da3fda8e05b1969570ef7\": container with ID starting with 0655f7e6c6a04cef7c1d5254e573657b1fb792ab814da3fda8e05b1969570ef7 not found: ID does not exist" containerID="0655f7e6c6a04cef7c1d5254e573657b1fb792ab814da3fda8e05b1969570ef7" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.939242 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0655f7e6c6a04cef7c1d5254e573657b1fb792ab814da3fda8e05b1969570ef7"} err="failed to get container status \"0655f7e6c6a04cef7c1d5254e573657b1fb792ab814da3fda8e05b1969570ef7\": rpc error: code = NotFound desc = could not find container \"0655f7e6c6a04cef7c1d5254e573657b1fb792ab814da3fda8e05b1969570ef7\": container with ID starting with 0655f7e6c6a04cef7c1d5254e573657b1fb792ab814da3fda8e05b1969570ef7 not found: ID does not exist" Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.945615 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.968240 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:16:58 crc kubenswrapper[4766]: I1002 11:16:58.983113 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.004157 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.005757 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.007830 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.028391 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.035885 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxxf7\" (UniqueName: \"kubernetes.io/projected/0064fd48-390f-4a0f-abfe-9922c8c431f9-kube-api-access-lxxf7\") pod \"nova-cell1-conductor-0\" (UID: \"0064fd48-390f-4a0f-abfe-9922c8c431f9\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.035942 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0064fd48-390f-4a0f-abfe-9922c8c431f9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0064fd48-390f-4a0f-abfe-9922c8c431f9\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.035966 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0064fd48-390f-4a0f-abfe-9922c8c431f9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0064fd48-390f-4a0f-abfe-9922c8c431f9\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:59 crc kubenswrapper[4766]: W1002 11:16:59.040118 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d8b9b37_94aa_4611_b9aa_08d55c42987b.slice/crio-8ec81440daa9ffd72f347c06a028421425339bf133107a5b7404adf24b9fb3b7 WatchSource:0}: Error finding container 8ec81440daa9ffd72f347c06a028421425339bf133107a5b7404adf24b9fb3b7: Status 404 returned error can't find the container with id 8ec81440daa9ffd72f347c06a028421425339bf133107a5b7404adf24b9fb3b7 Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.049356 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.138072 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds9sb\" (UniqueName: \"kubernetes.io/projected/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-kube-api-access-ds9sb\") pod \"nova-scheduler-0\" (UID: \"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.138468 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxxf7\" (UniqueName: \"kubernetes.io/projected/0064fd48-390f-4a0f-abfe-9922c8c431f9-kube-api-access-lxxf7\") pod \"nova-cell1-conductor-0\" (UID: \"0064fd48-390f-4a0f-abfe-9922c8c431f9\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.138548 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.138574 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-config-data\") pod \"nova-scheduler-0\" (UID: \"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.138639 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0064fd48-390f-4a0f-abfe-9922c8c431f9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0064fd48-390f-4a0f-abfe-9922c8c431f9\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.138663 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0064fd48-390f-4a0f-abfe-9922c8c431f9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0064fd48-390f-4a0f-abfe-9922c8c431f9\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.143462 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0064fd48-390f-4a0f-abfe-9922c8c431f9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0064fd48-390f-4a0f-abfe-9922c8c431f9\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.143487 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0064fd48-390f-4a0f-abfe-9922c8c431f9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0064fd48-390f-4a0f-abfe-9922c8c431f9\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.159566 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxxf7\" (UniqueName: \"kubernetes.io/projected/0064fd48-390f-4a0f-abfe-9922c8c431f9-kube-api-access-lxxf7\") pod \"nova-cell1-conductor-0\" (UID: \"0064fd48-390f-4a0f-abfe-9922c8c431f9\") " pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.240604 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.240651 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-config-data\") pod \"nova-scheduler-0\" (UID: \"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.240778 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds9sb\" (UniqueName: \"kubernetes.io/projected/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-kube-api-access-ds9sb\") pod \"nova-scheduler-0\" (UID: \"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.246890 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-config-data\") pod \"nova-scheduler-0\" (UID: 
\"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.248115 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.259306 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds9sb\" (UniqueName: \"kubernetes.io/projected/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-kube-api-access-ds9sb\") pod \"nova-scheduler-0\" (UID: \"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7\") " pod="openstack/nova-scheduler-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.263185 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.324484 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.765477 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:16:59 crc kubenswrapper[4766]: W1002 11:16:59.768891 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0064fd48_390f_4a0f_abfe_9922c8c431f9.slice/crio-ba6115d8ae5ab6c8b285b8b221fe8db9027b76e8b2acca29732c54f957a5152b WatchSource:0}: Error finding container ba6115d8ae5ab6c8b285b8b221fe8db9027b76e8b2acca29732c54f957a5152b: Status 404 returned error can't find the container with id ba6115d8ae5ab6c8b285b8b221fe8db9027b76e8b2acca29732c54f957a5152b Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.902913 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c5de1c0-341f-411b-b68a-c95f34f52362" path="/var/lib/kubelet/pods/6c5de1c0-341f-411b-b68a-c95f34f52362/volumes" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.904473 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b00c0dab-8c50-4401-9e82-40234dd5d6d5" path="/var/lib/kubelet/pods/b00c0dab-8c50-4401-9e82-40234dd5d6d5/volumes" Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.905067 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.908521 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d8b9b37-94aa-4611-b9aa-08d55c42987b","Type":"ContainerStarted","Data":"0bc8ac2cb598eaa8aca752f18cd4ff6afdedadff6528d3ac77845ba3cd53f9dd"} Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.908549 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d8b9b37-94aa-4611-b9aa-08d55c42987b","Type":"ContainerStarted","Data":"a504f0dbaa5b2e2221ad476c96d49c3ede95d35904c8dc12baa5c5f199157005"} Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.908565 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d8b9b37-94aa-4611-b9aa-08d55c42987b","Type":"ContainerStarted","Data":"8ec81440daa9ffd72f347c06a028421425339bf133107a5b7404adf24b9fb3b7"} Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.909822 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-0" event={"ID":"0064fd48-390f-4a0f-abfe-9922c8c431f9","Type":"ContainerStarted","Data":"ba6115d8ae5ab6c8b285b8b221fe8db9027b76e8b2acca29732c54f957a5152b"} Oct 02 11:16:59 crc kubenswrapper[4766]: I1002 11:16:59.939046 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.9390180799999999 podStartE2EDuration="1.93901808s" podCreationTimestamp="2025-10-02 11:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:16:59.924641011 +0000 UTC m=+1534.867511955" watchObservedRunningTime="2025-10-02 11:16:59.93901808 +0000 UTC m=+1534.881889014" Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.589542 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.667241 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpp72\" (UniqueName: \"kubernetes.io/projected/c916a84f-283b-4dcf-b4d2-4324b24305c2-kube-api-access-jpp72\") pod \"c916a84f-283b-4dcf-b4d2-4324b24305c2\" (UID: \"c916a84f-283b-4dcf-b4d2-4324b24305c2\") " Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.667330 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c916a84f-283b-4dcf-b4d2-4324b24305c2-logs\") pod \"c916a84f-283b-4dcf-b4d2-4324b24305c2\" (UID: \"c916a84f-283b-4dcf-b4d2-4324b24305c2\") " Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.667721 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c916a84f-283b-4dcf-b4d2-4324b24305c2-combined-ca-bundle\") pod \"c916a84f-283b-4dcf-b4d2-4324b24305c2\" (UID: \"c916a84f-283b-4dcf-b4d2-4324b24305c2\") " Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.667748 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c916a84f-283b-4dcf-b4d2-4324b24305c2-config-data\") pod \"c916a84f-283b-4dcf-b4d2-4324b24305c2\" (UID: \"c916a84f-283b-4dcf-b4d2-4324b24305c2\") " Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.670293 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c916a84f-283b-4dcf-b4d2-4324b24305c2-logs" (OuterVolumeSpecName: "logs") pod "c916a84f-283b-4dcf-b4d2-4324b24305c2" (UID: "c916a84f-283b-4dcf-b4d2-4324b24305c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.678686 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c916a84f-283b-4dcf-b4d2-4324b24305c2-kube-api-access-jpp72" (OuterVolumeSpecName: "kube-api-access-jpp72") pod "c916a84f-283b-4dcf-b4d2-4324b24305c2" (UID: "c916a84f-283b-4dcf-b4d2-4324b24305c2"). InnerVolumeSpecName "kube-api-access-jpp72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.710550 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c916a84f-283b-4dcf-b4d2-4324b24305c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c916a84f-283b-4dcf-b4d2-4324b24305c2" (UID: "c916a84f-283b-4dcf-b4d2-4324b24305c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.713062 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c916a84f-283b-4dcf-b4d2-4324b24305c2-config-data" (OuterVolumeSpecName: "config-data") pod "c916a84f-283b-4dcf-b4d2-4324b24305c2" (UID: "c916a84f-283b-4dcf-b4d2-4324b24305c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.770058 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c916a84f-283b-4dcf-b4d2-4324b24305c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.770385 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c916a84f-283b-4dcf-b4d2-4324b24305c2-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.770394 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpp72\" (UniqueName: \"kubernetes.io/projected/c916a84f-283b-4dcf-b4d2-4324b24305c2-kube-api-access-jpp72\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.770406 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c916a84f-283b-4dcf-b4d2-4324b24305c2-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.920686 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7","Type":"ContainerStarted","Data":"3c3f1237df38c8b6ccd32b780a793babbd91e3830b9cb19cf87c41863f677752"} Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.920744 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7","Type":"ContainerStarted","Data":"4ed345a3119bdc9f5fc79c2fa7e147ad7c93cca94448758ed5a48df4771962c2"} Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.922096 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0064fd48-390f-4a0f-abfe-9922c8c431f9","Type":"ContainerStarted","Data":"95b502421d0b283b99e2e399ae912e61831216f1aad7cf4549c10685585bea25"} Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.922231 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.924013 4766 generic.go:334] "Generic (PLEG): container finished" podID="c916a84f-283b-4dcf-b4d2-4324b24305c2" containerID="9ed5169edf1a2fb32bd47a1caf150cd24766b58c122d3a390c7723f5f0b79cd6" exitCode=0 Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.924055 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.924094 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c916a84f-283b-4dcf-b4d2-4324b24305c2","Type":"ContainerDied","Data":"9ed5169edf1a2fb32bd47a1caf150cd24766b58c122d3a390c7723f5f0b79cd6"} Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.924181 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c916a84f-283b-4dcf-b4d2-4324b24305c2","Type":"ContainerDied","Data":"0c21cd108ceeb8d4b1142570a1adb39fe382cb0a294f08597959f0f978adede5"} Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.924252 4766 scope.go:117] "RemoveContainer" containerID="9ed5169edf1a2fb32bd47a1caf150cd24766b58c122d3a390c7723f5f0b79cd6" Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.948965 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.948905121 podStartE2EDuration="2.948905121s" podCreationTimestamp="2025-10-02 11:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:00.937138935 +0000 UTC m=+1535.880009879" watchObservedRunningTime="2025-10-02 11:17:00.948905121 +0000 UTC m=+1535.891776075" Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.958385 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.958365663 podStartE2EDuration="2.958365663s" podCreationTimestamp="2025-10-02 11:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:00.957335731 +0000 UTC m=+1535.900206675" watchObservedRunningTime="2025-10-02 11:17:00.958365663 +0000 UTC m=+1535.901236607" Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.962545 4766 scope.go:117] "RemoveContainer" containerID="8879d8aceba87260854a150fd7bd38d8150c42379def04081d786951f3f1911a" Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.979960 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:00 crc kubenswrapper[4766]: I1002 11:17:00.988668 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.012675 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.012681 4766 scope.go:117] "RemoveContainer" containerID="9ed5169edf1a2fb32bd47a1caf150cd24766b58c122d3a390c7723f5f0b79cd6" Oct 02 11:17:01 crc kubenswrapper[4766]: E1002 11:17:01.013161 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c916a84f-283b-4dcf-b4d2-4324b24305c2" containerName="nova-api-log" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.013185 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c916a84f-283b-4dcf-b4d2-4324b24305c2" containerName="nova-api-log" Oct 02 11:17:01 crc kubenswrapper[4766]: E1002 11:17:01.013205 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c916a84f-283b-4dcf-b4d2-4324b24305c2" containerName="nova-api-api" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.013215 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c916a84f-283b-4dcf-b4d2-4324b24305c2" containerName="nova-api-api" Oct 02 11:17:01 crc 
kubenswrapper[4766]: I1002 11:17:01.013448 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c916a84f-283b-4dcf-b4d2-4324b24305c2" containerName="nova-api-api" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.013479 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c916a84f-283b-4dcf-b4d2-4324b24305c2" containerName="nova-api-log" Oct 02 11:17:01 crc kubenswrapper[4766]: E1002 11:17:01.014413 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ed5169edf1a2fb32bd47a1caf150cd24766b58c122d3a390c7723f5f0b79cd6\": container with ID starting with 9ed5169edf1a2fb32bd47a1caf150cd24766b58c122d3a390c7723f5f0b79cd6 not found: ID does not exist" containerID="9ed5169edf1a2fb32bd47a1caf150cd24766b58c122d3a390c7723f5f0b79cd6" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.014448 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ed5169edf1a2fb32bd47a1caf150cd24766b58c122d3a390c7723f5f0b79cd6"} err="failed to get container status \"9ed5169edf1a2fb32bd47a1caf150cd24766b58c122d3a390c7723f5f0b79cd6\": rpc error: code = NotFound desc = could not find container \"9ed5169edf1a2fb32bd47a1caf150cd24766b58c122d3a390c7723f5f0b79cd6\": container with ID starting with 9ed5169edf1a2fb32bd47a1caf150cd24766b58c122d3a390c7723f5f0b79cd6 not found: ID does not exist" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.014473 4766 scope.go:117] "RemoveContainer" containerID="8879d8aceba87260854a150fd7bd38d8150c42379def04081d786951f3f1911a" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.014659 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:01 crc kubenswrapper[4766]: E1002 11:17:01.014734 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8879d8aceba87260854a150fd7bd38d8150c42379def04081d786951f3f1911a\": container with ID starting with 8879d8aceba87260854a150fd7bd38d8150c42379def04081d786951f3f1911a not found: ID does not exist" containerID="8879d8aceba87260854a150fd7bd38d8150c42379def04081d786951f3f1911a" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.014780 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8879d8aceba87260854a150fd7bd38d8150c42379def04081d786951f3f1911a"} err="failed to get container status \"8879d8aceba87260854a150fd7bd38d8150c42379def04081d786951f3f1911a\": rpc error: code = NotFound desc = could not find container \"8879d8aceba87260854a150fd7bd38d8150c42379def04081d786951f3f1911a\": container with ID starting with 8879d8aceba87260854a150fd7bd38d8150c42379def04081d786951f3f1911a not found: ID does not exist" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.018998 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.038393 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.075771 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz8tw\" (UniqueName: \"kubernetes.io/projected/99bf4282-354b-463e-a138-415852ac29be-kube-api-access-zz8tw\") pod \"nova-api-0\" (UID: \"99bf4282-354b-463e-a138-415852ac29be\") " pod="openstack/nova-api-0" Oct 02 11:17:01 
crc kubenswrapper[4766]: I1002 11:17:01.075866 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99bf4282-354b-463e-a138-415852ac29be-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"99bf4282-354b-463e-a138-415852ac29be\") " pod="openstack/nova-api-0" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.075922 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99bf4282-354b-463e-a138-415852ac29be-logs\") pod \"nova-api-0\" (UID: \"99bf4282-354b-463e-a138-415852ac29be\") " pod="openstack/nova-api-0" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.075947 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99bf4282-354b-463e-a138-415852ac29be-config-data\") pod \"nova-api-0\" (UID: \"99bf4282-354b-463e-a138-415852ac29be\") " pod="openstack/nova-api-0" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.177444 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99bf4282-354b-463e-a138-415852ac29be-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"99bf4282-354b-463e-a138-415852ac29be\") " pod="openstack/nova-api-0" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.177653 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99bf4282-354b-463e-a138-415852ac29be-logs\") pod \"nova-api-0\" (UID: \"99bf4282-354b-463e-a138-415852ac29be\") " pod="openstack/nova-api-0" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.177715 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99bf4282-354b-463e-a138-415852ac29be-config-data\") pod \"nova-api-0\" (UID: \"99bf4282-354b-463e-a138-415852ac29be\") " pod="openstack/nova-api-0" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.177829 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz8tw\" (UniqueName: \"kubernetes.io/projected/99bf4282-354b-463e-a138-415852ac29be-kube-api-access-zz8tw\") pod \"nova-api-0\" (UID: \"99bf4282-354b-463e-a138-415852ac29be\") " pod="openstack/nova-api-0" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.178271 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99bf4282-354b-463e-a138-415852ac29be-logs\") pod \"nova-api-0\" (UID: \"99bf4282-354b-463e-a138-415852ac29be\") " pod="openstack/nova-api-0" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.181614 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99bf4282-354b-463e-a138-415852ac29be-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"99bf4282-354b-463e-a138-415852ac29be\") " pod="openstack/nova-api-0" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.181840 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99bf4282-354b-463e-a138-415852ac29be-config-data\") pod \"nova-api-0\" (UID: \"99bf4282-354b-463e-a138-415852ac29be\") " pod="openstack/nova-api-0" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.200647 
4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz8tw\" (UniqueName: \"kubernetes.io/projected/99bf4282-354b-463e-a138-415852ac29be-kube-api-access-zz8tw\") pod \"nova-api-0\" (UID: \"99bf4282-354b-463e-a138-415852ac29be\") " pod="openstack/nova-api-0" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.343927 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.780839 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:01 crc kubenswrapper[4766]: W1002 11:17:01.785634 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99bf4282_354b_463e_a138_415852ac29be.slice/crio-b55f090029070eb58a2c7961d8675ae51ca4cc2d759aa2cca700eb184127e161 WatchSource:0}: Error finding container b55f090029070eb58a2c7961d8675ae51ca4cc2d759aa2cca700eb184127e161: Status 404 returned error can't find the container with id b55f090029070eb58a2c7961d8675ae51ca4cc2d759aa2cca700eb184127e161 Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.891877 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c916a84f-283b-4dcf-b4d2-4324b24305c2" path="/var/lib/kubelet/pods/c916a84f-283b-4dcf-b4d2-4324b24305c2/volumes" Oct 02 11:17:01 crc kubenswrapper[4766]: I1002 11:17:01.935248 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99bf4282-354b-463e-a138-415852ac29be","Type":"ContainerStarted","Data":"b55f090029070eb58a2c7961d8675ae51ca4cc2d759aa2cca700eb184127e161"} Oct 02 11:17:02 crc kubenswrapper[4766]: I1002 11:17:02.946604 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99bf4282-354b-463e-a138-415852ac29be","Type":"ContainerStarted","Data":"703b6cba446ff163628b04e87dc98e30942b0ccb5f64ffbea850955af7033b82"} Oct 02 11:17:02 crc kubenswrapper[4766]: I1002 11:17:02.946933 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99bf4282-354b-463e-a138-415852ac29be","Type":"ContainerStarted","Data":"ff96ac1157c6bcbb685c56fe7ed71d0865b0c6ed9b4090c4cc0dca200ed610d9"} Oct 02 11:17:02 crc kubenswrapper[4766]: I1002 11:17:02.972020 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.972001328 podStartE2EDuration="2.972001328s" podCreationTimestamp="2025-10-02 11:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:02.964049064 +0000 UTC m=+1537.906920008" watchObservedRunningTime="2025-10-02 11:17:02.972001328 +0000 UTC m=+1537.914872272" Oct 02 11:17:03 crc kubenswrapper[4766]: I1002 11:17:03.582141 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:17:03 crc kubenswrapper[4766]: I1002 11:17:03.583183 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:17:04 crc kubenswrapper[4766]: I1002 11:17:04.134795 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-szvhz" Oct 02 11:17:04 crc kubenswrapper[4766]: I1002 11:17:04.180275 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-szvhz" 
Oct 02 11:17:04 crc kubenswrapper[4766]: I1002 11:17:04.325858 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 02 11:17:04 crc kubenswrapper[4766]: I1002 11:17:04.383355 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-szvhz"]
Oct 02 11:17:05 crc kubenswrapper[4766]: I1002 11:17:05.974027 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-szvhz" podUID="51a14c59-9b2a-4470-af2a-918b717cd721" containerName="registry-server" containerID="cri-o://b40d0a1d0332dbd4545ef9b931c18a6d54125227d972ed85b2d13d8a295b7afc" gracePeriod=2
Oct 02 11:17:06 crc kubenswrapper[4766]: I1002 11:17:06.463176 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-szvhz"
Oct 02 11:17:06 crc kubenswrapper[4766]: I1002 11:17:06.584226 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf9qd\" (UniqueName: \"kubernetes.io/projected/51a14c59-9b2a-4470-af2a-918b717cd721-kube-api-access-cf9qd\") pod \"51a14c59-9b2a-4470-af2a-918b717cd721\" (UID: \"51a14c59-9b2a-4470-af2a-918b717cd721\") "
Oct 02 11:17:06 crc kubenswrapper[4766]: I1002 11:17:06.584328 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a14c59-9b2a-4470-af2a-918b717cd721-utilities\") pod \"51a14c59-9b2a-4470-af2a-918b717cd721\" (UID: \"51a14c59-9b2a-4470-af2a-918b717cd721\") "
Oct 02 11:17:06 crc kubenswrapper[4766]: I1002 11:17:06.584498 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a14c59-9b2a-4470-af2a-918b717cd721-catalog-content\") pod \"51a14c59-9b2a-4470-af2a-918b717cd721\" (UID: \"51a14c59-9b2a-4470-af2a-918b717cd721\") "
Oct 02 11:17:06 crc kubenswrapper[4766]: I1002 11:17:06.585066 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a14c59-9b2a-4470-af2a-918b717cd721-utilities" (OuterVolumeSpecName: "utilities") pod "51a14c59-9b2a-4470-af2a-918b717cd721" (UID: "51a14c59-9b2a-4470-af2a-918b717cd721"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:17:06 crc kubenswrapper[4766]: I1002 11:17:06.589466 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a14c59-9b2a-4470-af2a-918b717cd721-kube-api-access-cf9qd" (OuterVolumeSpecName: "kube-api-access-cf9qd") pod "51a14c59-9b2a-4470-af2a-918b717cd721" (UID: "51a14c59-9b2a-4470-af2a-918b717cd721"). InnerVolumeSpecName "kube-api-access-cf9qd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:17:06 crc kubenswrapper[4766]: I1002 11:17:06.627566 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a14c59-9b2a-4470-af2a-918b717cd721-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51a14c59-9b2a-4470-af2a-918b717cd721" (UID: "51a14c59-9b2a-4470-af2a-918b717cd721"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:17:06 crc kubenswrapper[4766]: I1002 11:17:06.686817 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a14c59-9b2a-4470-af2a-918b717cd721-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 11:17:06 crc kubenswrapper[4766]: I1002 11:17:06.686851 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a14c59-9b2a-4470-af2a-918b717cd721-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 11:17:06 crc kubenswrapper[4766]: I1002 11:17:06.686868 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf9qd\" (UniqueName: \"kubernetes.io/projected/51a14c59-9b2a-4470-af2a-918b717cd721-kube-api-access-cf9qd\") on node \"crc\" DevicePath \"\""
Oct 02 11:17:06 crc kubenswrapper[4766]: I1002 11:17:06.984031 4766 generic.go:334] "Generic (PLEG): container finished" podID="51a14c59-9b2a-4470-af2a-918b717cd721" containerID="b40d0a1d0332dbd4545ef9b931c18a6d54125227d972ed85b2d13d8a295b7afc" exitCode=0
Oct 02 11:17:06 crc kubenswrapper[4766]: I1002 11:17:06.984078 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szvhz" event={"ID":"51a14c59-9b2a-4470-af2a-918b717cd721","Type":"ContainerDied","Data":"b40d0a1d0332dbd4545ef9b931c18a6d54125227d972ed85b2d13d8a295b7afc"}
Oct 02 11:17:06 crc kubenswrapper[4766]: I1002 11:17:06.984105 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szvhz" event={"ID":"51a14c59-9b2a-4470-af2a-918b717cd721","Type":"ContainerDied","Data":"6a39c2bd2ba2986b38afeb570f10b11047a0315d9643d29a19811059090f5f55"}
Oct 02 11:17:06 crc kubenswrapper[4766]: I1002 11:17:06.984123 4766 scope.go:117] "RemoveContainer" containerID="b40d0a1d0332dbd4545ef9b931c18a6d54125227d972ed85b2d13d8a295b7afc"
Oct 02 11:17:06 crc kubenswrapper[4766]: I1002 11:17:06.984125 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-szvhz"
Oct 02 11:17:07 crc kubenswrapper[4766]: I1002 11:17:07.007059 4766 scope.go:117] "RemoveContainer" containerID="70e30137bc22caa61668617e1b1ac8762f9459d159ae604241ffbe4ea42a4b2e"
Oct 02 11:17:07 crc kubenswrapper[4766]: I1002 11:17:07.022674 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-szvhz"]
Oct 02 11:17:07 crc kubenswrapper[4766]: I1002 11:17:07.046645 4766 scope.go:117] "RemoveContainer" containerID="aeaa160d71c90a2aa2e1cfca3faaf95a00b44d57bf733845e6206c1d24aca4c3"
Oct 02 11:17:07 crc kubenswrapper[4766]: I1002 11:17:07.048999 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-szvhz"]
Oct 02 11:17:07 crc kubenswrapper[4766]: I1002 11:17:07.077932 4766 scope.go:117] "RemoveContainer" containerID="b40d0a1d0332dbd4545ef9b931c18a6d54125227d972ed85b2d13d8a295b7afc"
Oct 02 11:17:07 crc kubenswrapper[4766]: E1002 11:17:07.078414 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b40d0a1d0332dbd4545ef9b931c18a6d54125227d972ed85b2d13d8a295b7afc\": container with ID starting with b40d0a1d0332dbd4545ef9b931c18a6d54125227d972ed85b2d13d8a295b7afc not found: ID does not exist" containerID="b40d0a1d0332dbd4545ef9b931c18a6d54125227d972ed85b2d13d8a295b7afc"
Oct 02 11:17:07 crc kubenswrapper[4766]: I1002 11:17:07.078465 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b40d0a1d0332dbd4545ef9b931c18a6d54125227d972ed85b2d13d8a295b7afc"} err="failed to get container status \"b40d0a1d0332dbd4545ef9b931c18a6d54125227d972ed85b2d13d8a295b7afc\": rpc error: code = NotFound desc = could not find container \"b40d0a1d0332dbd4545ef9b931c18a6d54125227d972ed85b2d13d8a295b7afc\": container with ID starting with b40d0a1d0332dbd4545ef9b931c18a6d54125227d972ed85b2d13d8a295b7afc not found: ID does not exist"
Oct 02 11:17:07 crc kubenswrapper[4766]: I1002 11:17:07.078516 4766 scope.go:117] "RemoveContainer" containerID="70e30137bc22caa61668617e1b1ac8762f9459d159ae604241ffbe4ea42a4b2e"
Oct 02 11:17:07 crc kubenswrapper[4766]: E1002 11:17:07.078808 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70e30137bc22caa61668617e1b1ac8762f9459d159ae604241ffbe4ea42a4b2e\": container with ID starting with 70e30137bc22caa61668617e1b1ac8762f9459d159ae604241ffbe4ea42a4b2e not found: ID does not exist" containerID="70e30137bc22caa61668617e1b1ac8762f9459d159ae604241ffbe4ea42a4b2e"
Oct 02 11:17:07 crc kubenswrapper[4766]: I1002 11:17:07.078886 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70e30137bc22caa61668617e1b1ac8762f9459d159ae604241ffbe4ea42a4b2e"} err="failed to get container status \"70e30137bc22caa61668617e1b1ac8762f9459d159ae604241ffbe4ea42a4b2e\": rpc error: code = NotFound desc = could not find container \"70e30137bc22caa61668617e1b1ac8762f9459d159ae604241ffbe4ea42a4b2e\": container with ID starting with 70e30137bc22caa61668617e1b1ac8762f9459d159ae604241ffbe4ea42a4b2e not found: ID does not exist"
Oct 02 11:17:07 crc kubenswrapper[4766]: I1002 11:17:07.078909 4766 scope.go:117] "RemoveContainer" containerID="aeaa160d71c90a2aa2e1cfca3faaf95a00b44d57bf733845e6206c1d24aca4c3"
Oct 02 11:17:07 crc kubenswrapper[4766]: E1002 11:17:07.079216 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeaa160d71c90a2aa2e1cfca3faaf95a00b44d57bf733845e6206c1d24aca4c3\": container with ID starting with aeaa160d71c90a2aa2e1cfca3faaf95a00b44d57bf733845e6206c1d24aca4c3 not found: ID does not exist" containerID="aeaa160d71c90a2aa2e1cfca3faaf95a00b44d57bf733845e6206c1d24aca4c3"
Oct 02 11:17:07 crc kubenswrapper[4766]: I1002 11:17:07.079261 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeaa160d71c90a2aa2e1cfca3faaf95a00b44d57bf733845e6206c1d24aca4c3"} err="failed to get container status \"aeaa160d71c90a2aa2e1cfca3faaf95a00b44d57bf733845e6206c1d24aca4c3\": rpc error: code = NotFound desc = could not find container \"aeaa160d71c90a2aa2e1cfca3faaf95a00b44d57bf733845e6206c1d24aca4c3\": container with ID starting with aeaa160d71c90a2aa2e1cfca3faaf95a00b44d57bf733845e6206c1d24aca4c3 not found: ID does not exist"
Oct 02 11:17:07 crc kubenswrapper[4766]: I1002 11:17:07.788678 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Oct 02 11:17:07 crc kubenswrapper[4766]: I1002 11:17:07.891644 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a14c59-9b2a-4470-af2a-918b717cd721" path="/var/lib/kubelet/pods/51a14c59-9b2a-4470-af2a-918b717cd721/volumes"
Oct 02 11:17:08 crc kubenswrapper[4766]: I1002 11:17:08.581764 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 02 11:17:08 crc kubenswrapper[4766]: I1002 11:17:08.582947 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 02 11:17:08 crc kubenswrapper[4766]: I1002 11:17:08.881331 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e"
Oct 02 11:17:08 crc kubenswrapper[4766]: E1002 11:17:08.883170 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 11:17:09 crc kubenswrapper[4766]: I1002 11:17:09.300825 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Oct 02 11:17:09 crc kubenswrapper[4766]: I1002 11:17:09.325695 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 02 11:17:09 crc kubenswrapper[4766]: I1002 11:17:09.356217 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 02 11:17:09 crc kubenswrapper[4766]: I1002 11:17:09.587802 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1d8b9b37-94aa-4611-b9aa-08d55c42987b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 02 11:17:09 crc kubenswrapper[4766]: I1002 11:17:09.591745 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1d8b9b37-94aa-4611-b9aa-08d55c42987b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Oct 02 11:17:10 crc kubenswrapper[4766]: I1002 11:17:10.039665 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 02 11:17:11 crc kubenswrapper[4766]: I1002 11:17:11.344993 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 02 11:17:11 crc kubenswrapper[4766]: I1002 11:17:11.345253 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 02 11:17:12 crc kubenswrapper[4766]: I1002 11:17:12.427845 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="99bf4282-354b-463e-a138-415852ac29be" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 11:17:12 crc kubenswrapper[4766]: I1002 11:17:12.427883 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="99bf4282-354b-463e-a138-415852ac29be" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.196:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 11:17:18 crc kubenswrapper[4766]: I1002 11:17:18.586765 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 02 11:17:18 crc kubenswrapper[4766]: I1002 11:17:18.587306 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 02 11:17:18 crc kubenswrapper[4766]: I1002 11:17:18.594113 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 02 11:17:18 crc kubenswrapper[4766]: I1002 11:17:18.594445 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 02 11:17:21 crc kubenswrapper[4766]: I1002 11:17:21.347423 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 02 11:17:21 crc kubenswrapper[4766]: I1002 11:17:21.348115 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 02 11:17:21 crc kubenswrapper[4766]: I1002 11:17:21.351013 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 02 11:17:21 crc kubenswrapper[4766]: I1002 11:17:21.351149 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.096341 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.129378 4766 generic.go:334] "Generic (PLEG): container finished" podID="db100297-f73c-4f83-b6dc-2f4d9661123f" containerID="b6b20d6770b52679ed1ddfc4195e5c7c6a0b2ccd8ae24888f88c97c12c80f74f" exitCode=137
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.129423 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.129438 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"db100297-f73c-4f83-b6dc-2f4d9661123f","Type":"ContainerDied","Data":"b6b20d6770b52679ed1ddfc4195e5c7c6a0b2ccd8ae24888f88c97c12c80f74f"}
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.129791 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"db100297-f73c-4f83-b6dc-2f4d9661123f","Type":"ContainerDied","Data":"a19f6f6e2859b1a27c78e0da67baf01a6f896b96359feb12aac3a916f7d7cd71"}
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.129816 4766 scope.go:117] "RemoveContainer" containerID="b6b20d6770b52679ed1ddfc4195e5c7c6a0b2ccd8ae24888f88c97c12c80f74f"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.134107 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.140577 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.155365 4766 scope.go:117] "RemoveContainer" containerID="b6b20d6770b52679ed1ddfc4195e5c7c6a0b2ccd8ae24888f88c97c12c80f74f"
Oct 02 11:17:22 crc kubenswrapper[4766]: E1002 11:17:22.156777 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6b20d6770b52679ed1ddfc4195e5c7c6a0b2ccd8ae24888f88c97c12c80f74f\": container with ID starting with b6b20d6770b52679ed1ddfc4195e5c7c6a0b2ccd8ae24888f88c97c12c80f74f not found: ID does not exist" containerID="b6b20d6770b52679ed1ddfc4195e5c7c6a0b2ccd8ae24888f88c97c12c80f74f"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.156827 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6b20d6770b52679ed1ddfc4195e5c7c6a0b2ccd8ae24888f88c97c12c80f74f"} err="failed to get container status \"b6b20d6770b52679ed1ddfc4195e5c7c6a0b2ccd8ae24888f88c97c12c80f74f\": rpc error: code = NotFound desc = could not find container \"b6b20d6770b52679ed1ddfc4195e5c7c6a0b2ccd8ae24888f88c97c12c80f74f\": container with ID starting with b6b20d6770b52679ed1ddfc4195e5c7c6a0b2ccd8ae24888f88c97c12c80f74f not found: ID does not exist"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.202092 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db100297-f73c-4f83-b6dc-2f4d9661123f-config-data\") pod \"db100297-f73c-4f83-b6dc-2f4d9661123f\" (UID: \"db100297-f73c-4f83-b6dc-2f4d9661123f\") "
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.202185 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db100297-f73c-4f83-b6dc-2f4d9661123f-combined-ca-bundle\") pod \"db100297-f73c-4f83-b6dc-2f4d9661123f\" (UID: \"db100297-f73c-4f83-b6dc-2f4d9661123f\") "
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.202421 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g424s\" (UniqueName: \"kubernetes.io/projected/db100297-f73c-4f83-b6dc-2f4d9661123f-kube-api-access-g424s\") pod \"db100297-f73c-4f83-b6dc-2f4d9661123f\" (UID: \"db100297-f73c-4f83-b6dc-2f4d9661123f\") "
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.221888 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db100297-f73c-4f83-b6dc-2f4d9661123f-kube-api-access-g424s" (OuterVolumeSpecName: "kube-api-access-g424s") pod "db100297-f73c-4f83-b6dc-2f4d9661123f" (UID: "db100297-f73c-4f83-b6dc-2f4d9661123f"). InnerVolumeSpecName "kube-api-access-g424s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.271188 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db100297-f73c-4f83-b6dc-2f4d9661123f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db100297-f73c-4f83-b6dc-2f4d9661123f" (UID: "db100297-f73c-4f83-b6dc-2f4d9661123f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.274651 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db100297-f73c-4f83-b6dc-2f4d9661123f-config-data" (OuterVolumeSpecName: "config-data") pod "db100297-f73c-4f83-b6dc-2f4d9661123f" (UID: "db100297-f73c-4f83-b6dc-2f4d9661123f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.308540 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g424s\" (UniqueName: \"kubernetes.io/projected/db100297-f73c-4f83-b6dc-2f4d9661123f-kube-api-access-g424s\") on node \"crc\" DevicePath \"\""
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.308573 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db100297-f73c-4f83-b6dc-2f4d9661123f-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.308583 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db100297-f73c-4f83-b6dc-2f4d9661123f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.323457 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gbg65"]
Oct 02 11:17:22 crc kubenswrapper[4766]: E1002 11:17:22.325053 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a14c59-9b2a-4470-af2a-918b717cd721" containerName="extract-content"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.325091 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a14c59-9b2a-4470-af2a-918b717cd721" containerName="extract-content"
Oct 02 11:17:22 crc kubenswrapper[4766]: E1002 11:17:22.325110 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db100297-f73c-4f83-b6dc-2f4d9661123f" containerName="nova-cell1-novncproxy-novncproxy"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.325120 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="db100297-f73c-4f83-b6dc-2f4d9661123f" containerName="nova-cell1-novncproxy-novncproxy"
Oct 02 11:17:22 crc kubenswrapper[4766]: E1002 11:17:22.325143 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a14c59-9b2a-4470-af2a-918b717cd721" containerName="registry-server"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.325150 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a14c59-9b2a-4470-af2a-918b717cd721" containerName="registry-server"
Oct 02 11:17:22 crc kubenswrapper[4766]: E1002 11:17:22.325272 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a14c59-9b2a-4470-af2a-918b717cd721" containerName="extract-utilities"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.325286 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a14c59-9b2a-4470-af2a-918b717cd721" containerName="extract-utilities"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.326439 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a14c59-9b2a-4470-af2a-918b717cd721" containerName="registry-server"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.326467 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="db100297-f73c-4f83-b6dc-2f4d9661123f" containerName="nova-cell1-novncproxy-novncproxy"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.329265 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.337027 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gbg65"]
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.411340 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.411396 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-config\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.411786 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.412020 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jhsh\" (UniqueName: \"kubernetes.io/projected/9fdf37c9-9a32-4103-8418-198d45d14415-kube-api-access-7jhsh\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.412078 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.412104 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.468289 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.493923 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.509179 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.510897 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.513180 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.513882 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.514085 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.515881 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.515986 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jhsh\" (UniqueName: \"kubernetes.io/projected/9fdf37c9-9a32-4103-8418-198d45d14415-kube-api-access-7jhsh\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.516033 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.516067 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.516153 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.516207 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-config\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.518248 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.518441 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-config\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.519117 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.520319 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.520823 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.521629 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.539957 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jhsh\" (UniqueName: \"kubernetes.io/projected/9fdf37c9-9a32-4103-8418-198d45d14415-kube-api-access-7jhsh\") pod \"dnsmasq-dns-59cf4bdb65-gbg65\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.617562 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.617878 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8rzm\" (UniqueName: \"kubernetes.io/projected/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-kube-api-access-w8rzm\") pod \"nova-cell1-novncproxy-0\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.618015 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.618119 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.618218 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.688310 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.719802 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8rzm\" (UniqueName: \"kubernetes.io/projected/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-kube-api-access-w8rzm\") pod \"nova-cell1-novncproxy-0\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.719897 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.719945 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.719964 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.719998 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.724293 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.724293 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.724336 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.728805 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.743625 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8rzm\" (UniqueName: \"kubernetes.io/projected/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-kube-api-access-w8rzm\") pod \"nova-cell1-novncproxy-0\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:22 crc kubenswrapper[4766]: I1002 11:17:22.844791 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:23 crc kubenswrapper[4766]: I1002 11:17:23.005118 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gbg65"] Oct 02 11:17:23 crc kubenswrapper[4766]: I1002 11:17:23.150280 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65" event={"ID":"9fdf37c9-9a32-4103-8418-198d45d14415","Type":"ContainerStarted","Data":"ce489e596e9ddd769df973bc1233a9dc847025a10294fe124d70398d646e919e"} Oct 02 11:17:23 crc kubenswrapper[4766]: W1002 11:17:23.340756 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod123b65f7_a8e8_434b_baf1_e9b0d3a985d9.slice/crio-0f940b03ab75c092d0d62f5f74a4f6f7d16c0c06588333446c8859d2097afec1 WatchSource:0}: Error finding container 0f940b03ab75c092d0d62f5f74a4f6f7d16c0c06588333446c8859d2097afec1: Status 404 returned error can't find the container with id 0f940b03ab75c092d0d62f5f74a4f6f7d16c0c06588333446c8859d2097afec1 Oct 02 11:17:23 crc kubenswrapper[4766]: I1002 11:17:23.351039 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:17:23 crc kubenswrapper[4766]: I1002 11:17:23.881542 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:17:23 crc kubenswrapper[4766]: E1002 11:17:23.882137 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:17:23 crc kubenswrapper[4766]: I1002 11:17:23.894496 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db100297-f73c-4f83-b6dc-2f4d9661123f" 
path="/var/lib/kubelet/pods/db100297-f73c-4f83-b6dc-2f4d9661123f/volumes" Oct 02 11:17:24 crc kubenswrapper[4766]: I1002 11:17:24.158209 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"123b65f7-a8e8-434b-baf1-e9b0d3a985d9","Type":"ContainerStarted","Data":"805fa4defeec778eb8f810a670bb90a7f802c044adf651111138c5f240d5a4ad"} Oct 02 11:17:24 crc kubenswrapper[4766]: I1002 11:17:24.158251 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"123b65f7-a8e8-434b-baf1-e9b0d3a985d9","Type":"ContainerStarted","Data":"0f940b03ab75c092d0d62f5f74a4f6f7d16c0c06588333446c8859d2097afec1"} Oct 02 11:17:24 crc kubenswrapper[4766]: I1002 11:17:24.160054 4766 generic.go:334] "Generic (PLEG): container finished" podID="9fdf37c9-9a32-4103-8418-198d45d14415" containerID="2facd81316d0bb433c5134538dde848cf53f8502a7244939d8aa6ec3bc9bf2db" exitCode=0 Oct 02 11:17:24 crc kubenswrapper[4766]: I1002 11:17:24.160120 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65" event={"ID":"9fdf37c9-9a32-4103-8418-198d45d14415","Type":"ContainerDied","Data":"2facd81316d0bb433c5134538dde848cf53f8502a7244939d8aa6ec3bc9bf2db"} Oct 02 11:17:24 crc kubenswrapper[4766]: I1002 11:17:24.178533 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.178492107 podStartE2EDuration="2.178492107s" podCreationTimestamp="2025-10-02 11:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:24.177128764 +0000 UTC m=+1559.119999728" watchObservedRunningTime="2025-10-02 11:17:24.178492107 +0000 UTC m=+1559.121363051" Oct 02 11:17:24 crc kubenswrapper[4766]: I1002 11:17:24.348155 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:17:24 crc kubenswrapper[4766]: I1002 11:17:24.348726 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerName="ceilometer-central-agent" containerID="cri-o://46c2ed514e2bdf9ef031ae67c47aca7ce83582591d3d5ceb160b1a1f8a5d12fc" gracePeriod=30 Oct 02 11:17:24 crc kubenswrapper[4766]: I1002 11:17:24.348855 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerName="proxy-httpd" containerID="cri-o://d68b8a196b3bc7566293f1f4bf2cfaa3d995d0598f5e72de076da6feb093d26b" gracePeriod=30 Oct 02 11:17:24 crc kubenswrapper[4766]: I1002 11:17:24.348891 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerName="sg-core" containerID="cri-o://793a807916e2716603f5252269b8f6ee91448251ec30bf50f028ad12d2deba0e" gracePeriod=30 Oct 02 11:17:24 crc kubenswrapper[4766]: I1002 11:17:24.348920 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerName="ceilometer-notification-agent" containerID="cri-o://c5d7d2d79ad9957cb5b20466f8e1c91880dbebaa3656e4ecddf154e4b91af0d4" gracePeriod=30 Oct 02 11:17:24 crc kubenswrapper[4766]: I1002 11:17:24.811952 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:25 crc 
kubenswrapper[4766]: I1002 11:17:25.170829 4766 generic.go:334] "Generic (PLEG): container finished" podID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerID="d68b8a196b3bc7566293f1f4bf2cfaa3d995d0598f5e72de076da6feb093d26b" exitCode=0 Oct 02 11:17:25 crc kubenswrapper[4766]: I1002 11:17:25.170854 4766 generic.go:334] "Generic (PLEG): container finished" podID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerID="793a807916e2716603f5252269b8f6ee91448251ec30bf50f028ad12d2deba0e" exitCode=2 Oct 02 11:17:25 crc kubenswrapper[4766]: I1002 11:17:25.170861 4766 generic.go:334] "Generic (PLEG): container finished" podID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerID="46c2ed514e2bdf9ef031ae67c47aca7ce83582591d3d5ceb160b1a1f8a5d12fc" exitCode=0 Oct 02 11:17:25 crc kubenswrapper[4766]: I1002 11:17:25.170922 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f382271-dcd5-4199-a98a-490ddb92e1b4","Type":"ContainerDied","Data":"d68b8a196b3bc7566293f1f4bf2cfaa3d995d0598f5e72de076da6feb093d26b"} Oct 02 11:17:25 crc kubenswrapper[4766]: I1002 11:17:25.170947 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f382271-dcd5-4199-a98a-490ddb92e1b4","Type":"ContainerDied","Data":"793a807916e2716603f5252269b8f6ee91448251ec30bf50f028ad12d2deba0e"} Oct 02 11:17:25 crc kubenswrapper[4766]: I1002 11:17:25.170956 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f382271-dcd5-4199-a98a-490ddb92e1b4","Type":"ContainerDied","Data":"46c2ed514e2bdf9ef031ae67c47aca7ce83582591d3d5ceb160b1a1f8a5d12fc"} Oct 02 11:17:25 crc kubenswrapper[4766]: I1002 11:17:25.173972 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="99bf4282-354b-463e-a138-415852ac29be" containerName="nova-api-log" containerID="cri-o://ff96ac1157c6bcbb685c56fe7ed71d0865b0c6ed9b4090c4cc0dca200ed610d9" gracePeriod=30 Oct 02 11:17:25 crc kubenswrapper[4766]: I1002 11:17:25.174806 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65" event={"ID":"9fdf37c9-9a32-4103-8418-198d45d14415","Type":"ContainerStarted","Data":"a159f561aef681dec7a32ee79dcabb81731c25f79c9eafcbda290d83ab7ba093"} Oct 02 11:17:25 crc kubenswrapper[4766]: I1002 11:17:25.174835 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65" Oct 02 11:17:25 crc kubenswrapper[4766]: I1002 11:17:25.175457 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="99bf4282-354b-463e-a138-415852ac29be" containerName="nova-api-api" containerID="cri-o://703b6cba446ff163628b04e87dc98e30942b0ccb5f64ffbea850955af7033b82" gracePeriod=30 Oct 02 11:17:25 crc kubenswrapper[4766]: I1002 11:17:25.203392 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65" podStartSLOduration=3.203374347 podStartE2EDuration="3.203374347s" podCreationTimestamp="2025-10-02 11:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:25.195199206 +0000 UTC m=+1560.138070150" watchObservedRunningTime="2025-10-02 11:17:25.203374347 +0000 UTC m=+1560.146245291" Oct 02 11:17:26 crc kubenswrapper[4766]: I1002 11:17:26.185076 4766 generic.go:334] "Generic (PLEG): container finished" podID="99bf4282-354b-463e-a138-415852ac29be" 
containerID="ff96ac1157c6bcbb685c56fe7ed71d0865b0c6ed9b4090c4cc0dca200ed610d9" exitCode=143 Oct 02 11:17:26 crc kubenswrapper[4766]: I1002 11:17:26.185161 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99bf4282-354b-463e-a138-415852ac29be","Type":"ContainerDied","Data":"ff96ac1157c6bcbb685c56fe7ed71d0865b0c6ed9b4090c4cc0dca200ed610d9"} Oct 02 11:17:27 crc kubenswrapper[4766]: I1002 11:17:27.845799 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.208051 4766 generic.go:334] "Generic (PLEG): container finished" podID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerID="c5d7d2d79ad9957cb5b20466f8e1c91880dbebaa3656e4ecddf154e4b91af0d4" exitCode=0 Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.208091 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f382271-dcd5-4199-a98a-490ddb92e1b4","Type":"ContainerDied","Data":"c5d7d2d79ad9957cb5b20466f8e1c91880dbebaa3656e4ecddf154e4b91af0d4"} Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.523190 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.629713 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-combined-ca-bundle\") pod \"2f382271-dcd5-4199-a98a-490ddb92e1b4\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.629875 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f382271-dcd5-4199-a98a-490ddb92e1b4-log-httpd\") pod \"2f382271-dcd5-4199-a98a-490ddb92e1b4\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.629937 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-sg-core-conf-yaml\") pod \"2f382271-dcd5-4199-a98a-490ddb92e1b4\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.629961 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgfqj\" (UniqueName: \"kubernetes.io/projected/2f382271-dcd5-4199-a98a-490ddb92e1b4-kube-api-access-fgfqj\") pod \"2f382271-dcd5-4199-a98a-490ddb92e1b4\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.630037 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-scripts\") pod \"2f382271-dcd5-4199-a98a-490ddb92e1b4\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.630064 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f382271-dcd5-4199-a98a-490ddb92e1b4-run-httpd\") pod \"2f382271-dcd5-4199-a98a-490ddb92e1b4\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.630098 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-config-data\") pod \"2f382271-dcd5-4199-a98a-490ddb92e1b4\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.630122 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-ceilometer-tls-certs\") pod \"2f382271-dcd5-4199-a98a-490ddb92e1b4\" (UID: \"2f382271-dcd5-4199-a98a-490ddb92e1b4\") " Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.632038 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f382271-dcd5-4199-a98a-490ddb92e1b4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2f382271-dcd5-4199-a98a-490ddb92e1b4" (UID: "2f382271-dcd5-4199-a98a-490ddb92e1b4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.640002 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f382271-dcd5-4199-a98a-490ddb92e1b4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2f382271-dcd5-4199-a98a-490ddb92e1b4" (UID: "2f382271-dcd5-4199-a98a-490ddb92e1b4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.642647 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f382271-dcd5-4199-a98a-490ddb92e1b4-kube-api-access-fgfqj" (OuterVolumeSpecName: "kube-api-access-fgfqj") pod "2f382271-dcd5-4199-a98a-490ddb92e1b4" (UID: "2f382271-dcd5-4199-a98a-490ddb92e1b4"). InnerVolumeSpecName "kube-api-access-fgfqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.655065 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-scripts" (OuterVolumeSpecName: "scripts") pod "2f382271-dcd5-4199-a98a-490ddb92e1b4" (UID: "2f382271-dcd5-4199-a98a-490ddb92e1b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.710702 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2f382271-dcd5-4199-a98a-490ddb92e1b4" (UID: "2f382271-dcd5-4199-a98a-490ddb92e1b4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.710966 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2f382271-dcd5-4199-a98a-490ddb92e1b4" (UID: "2f382271-dcd5-4199-a98a-490ddb92e1b4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.733066 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f382271-dcd5-4199-a98a-490ddb92e1b4-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.733106 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.733120 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgfqj\" (UniqueName: \"kubernetes.io/projected/2f382271-dcd5-4199-a98a-490ddb92e1b4-kube-api-access-fgfqj\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.733131 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.733143 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f382271-dcd5-4199-a98a-490ddb92e1b4-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.733153 4766 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.741893 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f382271-dcd5-4199-a98a-490ddb92e1b4" (UID: "2f382271-dcd5-4199-a98a-490ddb92e1b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.765015 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.772796 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-config-data" (OuterVolumeSpecName: "config-data") pod "2f382271-dcd5-4199-a98a-490ddb92e1b4" (UID: "2f382271-dcd5-4199-a98a-490ddb92e1b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.835296 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.835613 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f382271-dcd5-4199-a98a-490ddb92e1b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.936738 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99bf4282-354b-463e-a138-415852ac29be-config-data\") pod \"99bf4282-354b-463e-a138-415852ac29be\" (UID: \"99bf4282-354b-463e-a138-415852ac29be\") " Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.936869 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99bf4282-354b-463e-a138-415852ac29be-combined-ca-bundle\") pod \"99bf4282-354b-463e-a138-415852ac29be\" (UID: \"99bf4282-354b-463e-a138-415852ac29be\") " Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.936964 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99bf4282-354b-463e-a138-415852ac29be-logs\") pod \"99bf4282-354b-463e-a138-415852ac29be\" (UID: \"99bf4282-354b-463e-a138-415852ac29be\") " Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.936995 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz8tw\" (UniqueName: \"kubernetes.io/projected/99bf4282-354b-463e-a138-415852ac29be-kube-api-access-zz8tw\") pod \"99bf4282-354b-463e-a138-415852ac29be\" (UID: \"99bf4282-354b-463e-a138-415852ac29be\") " Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.937517 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99bf4282-354b-463e-a138-415852ac29be-logs" (OuterVolumeSpecName: "logs") pod "99bf4282-354b-463e-a138-415852ac29be" (UID: "99bf4282-354b-463e-a138-415852ac29be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.947988 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99bf4282-354b-463e-a138-415852ac29be-kube-api-access-zz8tw" (OuterVolumeSpecName: "kube-api-access-zz8tw") pod "99bf4282-354b-463e-a138-415852ac29be" (UID: "99bf4282-354b-463e-a138-415852ac29be"). InnerVolumeSpecName "kube-api-access-zz8tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.986184 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99bf4282-354b-463e-a138-415852ac29be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99bf4282-354b-463e-a138-415852ac29be" (UID: "99bf4282-354b-463e-a138-415852ac29be"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:28 crc kubenswrapper[4766]: I1002 11:17:28.991703 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99bf4282-354b-463e-a138-415852ac29be-config-data" (OuterVolumeSpecName: "config-data") pod "99bf4282-354b-463e-a138-415852ac29be" (UID: "99bf4282-354b-463e-a138-415852ac29be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.039790 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99bf4282-354b-463e-a138-415852ac29be-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.039822 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz8tw\" (UniqueName: \"kubernetes.io/projected/99bf4282-354b-463e-a138-415852ac29be-kube-api-access-zz8tw\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.039834 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99bf4282-354b-463e-a138-415852ac29be-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.039843 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99bf4282-354b-463e-a138-415852ac29be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.218167 4766 generic.go:334] "Generic (PLEG): container finished" podID="99bf4282-354b-463e-a138-415852ac29be" containerID="703b6cba446ff163628b04e87dc98e30942b0ccb5f64ffbea850955af7033b82" exitCode=0 Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.218218 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99bf4282-354b-463e-a138-415852ac29be","Type":"ContainerDied","Data":"703b6cba446ff163628b04e87dc98e30942b0ccb5f64ffbea850955af7033b82"} Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.218243 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"99bf4282-354b-463e-a138-415852ac29be","Type":"ContainerDied","Data":"b55f090029070eb58a2c7961d8675ae51ca4cc2d759aa2cca700eb184127e161"} Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.218258 4766 scope.go:117] "RemoveContainer" containerID="703b6cba446ff163628b04e87dc98e30942b0ccb5f64ffbea850955af7033b82" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.218375 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.223812 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f382271-dcd5-4199-a98a-490ddb92e1b4","Type":"ContainerDied","Data":"ca87442b80b9703deb8c67562e8306f2317043e5ff8d5ab361ca3d1d2637110e"} Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.223822 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.310543 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.326728 4766 scope.go:117] "RemoveContainer" containerID="ff96ac1157c6bcbb685c56fe7ed71d0865b0c6ed9b4090c4cc0dca200ed610d9" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.358105 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.375371 4766 scope.go:117] "RemoveContainer" containerID="703b6cba446ff163628b04e87dc98e30942b0ccb5f64ffbea850955af7033b82" Oct 02 11:17:29 crc kubenswrapper[4766]: E1002 11:17:29.385692 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"703b6cba446ff163628b04e87dc98e30942b0ccb5f64ffbea850955af7033b82\": container with ID starting with 703b6cba446ff163628b04e87dc98e30942b0ccb5f64ffbea850955af7033b82 not found: ID does not exist" containerID="703b6cba446ff163628b04e87dc98e30942b0ccb5f64ffbea850955af7033b82" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.385743 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"703b6cba446ff163628b04e87dc98e30942b0ccb5f64ffbea850955af7033b82"} err="failed to get container status \"703b6cba446ff163628b04e87dc98e30942b0ccb5f64ffbea850955af7033b82\": rpc error: code = NotFound desc = could not find container \"703b6cba446ff163628b04e87dc98e30942b0ccb5f64ffbea850955af7033b82\": container with ID starting with 703b6cba446ff163628b04e87dc98e30942b0ccb5f64ffbea850955af7033b82 not found: ID does not exist" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.385769 4766 scope.go:117] "RemoveContainer" containerID="ff96ac1157c6bcbb685c56fe7ed71d0865b0c6ed9b4090c4cc0dca200ed610d9" Oct 02 11:17:29 crc kubenswrapper[4766]: E1002 11:17:29.391629 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff96ac1157c6bcbb685c56fe7ed71d0865b0c6ed9b4090c4cc0dca200ed610d9\": container with ID starting with ff96ac1157c6bcbb685c56fe7ed71d0865b0c6ed9b4090c4cc0dca200ed610d9 not found: ID does not exist" containerID="ff96ac1157c6bcbb685c56fe7ed71d0865b0c6ed9b4090c4cc0dca200ed610d9" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.391680 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff96ac1157c6bcbb685c56fe7ed71d0865b0c6ed9b4090c4cc0dca200ed610d9"} err="failed to get container status \"ff96ac1157c6bcbb685c56fe7ed71d0865b0c6ed9b4090c4cc0dca200ed610d9\": rpc error: code = NotFound desc = could not find container \"ff96ac1157c6bcbb685c56fe7ed71d0865b0c6ed9b4090c4cc0dca200ed610d9\": container with ID starting with ff96ac1157c6bcbb685c56fe7ed71d0865b0c6ed9b4090c4cc0dca200ed610d9 not found: ID does not exist" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.391711 4766 scope.go:117] "RemoveContainer" containerID="d68b8a196b3bc7566293f1f4bf2cfaa3d995d0598f5e72de076da6feb093d26b" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.399580 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.407998 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.418488 4766 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:29 crc kubenswrapper[4766]: E1002 11:17:29.419239 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerName="ceilometer-central-agent" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.419262 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerName="ceilometer-central-agent" Oct 02 11:17:29 crc kubenswrapper[4766]: E1002 11:17:29.419275 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bf4282-354b-463e-a138-415852ac29be" containerName="nova-api-api" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.419283 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bf4282-354b-463e-a138-415852ac29be" containerName="nova-api-api" Oct 02 11:17:29 crc kubenswrapper[4766]: E1002 11:17:29.419320 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerName="ceilometer-notification-agent" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.419328 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerName="ceilometer-notification-agent" Oct 02 11:17:29 crc kubenswrapper[4766]: E1002 11:17:29.419348 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerName="sg-core" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.419356 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerName="sg-core" Oct 02 11:17:29 crc kubenswrapper[4766]: E1002 11:17:29.419364 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bf4282-354b-463e-a138-415852ac29be" containerName="nova-api-log" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.419373 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bf4282-354b-463e-a138-415852ac29be" containerName="nova-api-log" Oct 02 11:17:29 crc kubenswrapper[4766]: E1002 11:17:29.419403 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerName="proxy-httpd" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.419411 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerName="proxy-httpd" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.419693 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerName="ceilometer-central-agent" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.419721 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerName="proxy-httpd" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.419745 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerName="ceilometer-notification-agent" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.419755 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" containerName="sg-core" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.419772 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bf4282-354b-463e-a138-415852ac29be" containerName="nova-api-log" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.419786 
4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bf4282-354b-463e-a138-415852ac29be" containerName="nova-api-api" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.421367 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.425120 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.425192 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.426240 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.430148 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.433673 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.435772 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.435810 4766 scope.go:117] "RemoveContainer" containerID="793a807916e2716603f5252269b8f6ee91448251ec30bf50f028ad12d2deba0e" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.436071 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.437489 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.442572 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.461650 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.490313 4766 scope.go:117] "RemoveContainer" containerID="c5d7d2d79ad9957cb5b20466f8e1c91880dbebaa3656e4ecddf154e4b91af0d4" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.529271 4766 scope.go:117] "RemoveContainer" containerID="46c2ed514e2bdf9ef031ae67c47aca7ce83582591d3d5ceb160b1a1f8a5d12fc" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.555844 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-scripts\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.555896 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-config-data\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.555919 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfzxs\" (UniqueName: \"kubernetes.io/projected/7b5ac374-df46-4a36-947d-de07af25426c-kube-api-access-vfzxs\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " 
pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.555945 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.555973 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.556001 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b5ac374-df46-4a36-947d-de07af25426c-log-httpd\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.556019 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-public-tls-certs\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.556063 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.556116 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-config-data\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.556154 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qchw\" (UniqueName: \"kubernetes.io/projected/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-kube-api-access-8qchw\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.556172 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.556218 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b5ac374-df46-4a36-947d-de07af25426c-run-httpd\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.556235 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.556250 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-logs\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.658182 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b5ac374-df46-4a36-947d-de07af25426c-run-httpd\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.658232 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.658248 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-logs\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.658291 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-scripts\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.658321 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-config-data\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.658345 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfzxs\" (UniqueName: \"kubernetes.io/projected/7b5ac374-df46-4a36-947d-de07af25426c-kube-api-access-vfzxs\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.658532 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.658558 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.658583 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b5ac374-df46-4a36-947d-de07af25426c-log-httpd\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.658598 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-public-tls-certs\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.658672 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.658697 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-config-data\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.658727 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qchw\" (UniqueName: \"kubernetes.io/projected/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-kube-api-access-8qchw\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.658741 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.659719 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b5ac374-df46-4a36-947d-de07af25426c-log-httpd\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.659805 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-logs\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.660266 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b5ac374-df46-4a36-947d-de07af25426c-run-httpd\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.662887 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-scripts\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.663805 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.664303 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.664309 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-public-tls-certs\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.664457 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-config-data\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.665605 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.665853 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.666951 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.669772 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-config-data\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.673635 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfzxs\" (UniqueName: \"kubernetes.io/projected/7b5ac374-df46-4a36-947d-de07af25426c-kube-api-access-vfzxs\") pod \"ceilometer-0\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.679466 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qchw\" (UniqueName: \"kubernetes.io/projected/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-kube-api-access-8qchw\") pod \"nova-api-0\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.755255 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.772646 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.895135 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f382271-dcd5-4199-a98a-490ddb92e1b4" path="/var/lib/kubelet/pods/2f382271-dcd5-4199-a98a-490ddb92e1b4/volumes" Oct 02 11:17:29 crc kubenswrapper[4766]: I1002 11:17:29.896783 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99bf4282-354b-463e-a138-415852ac29be" path="/var/lib/kubelet/pods/99bf4282-354b-463e-a138-415852ac29be/volumes" Oct 02 11:17:30 crc kubenswrapper[4766]: I1002 11:17:30.288416 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:30 crc kubenswrapper[4766]: W1002 11:17:30.290407 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0992e9f_7b92_48af_bd2e_69734c0c8dfd.slice/crio-648a3f943daa549207cb18d595afcc56ec41a0ff823cc259ae9b188da8f6ae27 WatchSource:0}: Error finding container 648a3f943daa549207cb18d595afcc56ec41a0ff823cc259ae9b188da8f6ae27: Status 404 returned error can't find the container with id 648a3f943daa549207cb18d595afcc56ec41a0ff823cc259ae9b188da8f6ae27 Oct 02 11:17:30 crc kubenswrapper[4766]: I1002 11:17:30.367788 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:17:31 crc kubenswrapper[4766]: I1002 11:17:31.247747 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d0992e9f-7b92-48af-bd2e-69734c0c8dfd","Type":"ContainerStarted","Data":"64dc6ab3ff732665409e85c5bc055f92613518acdef12f6479c6e0f4b4e38a2d"} Oct 02 11:17:31 crc kubenswrapper[4766]: I1002 11:17:31.248340 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d0992e9f-7b92-48af-bd2e-69734c0c8dfd","Type":"ContainerStarted","Data":"df0bfc3f5c98293d3fa0e218566f25d00376650787827c5ad6d6711d318929c0"} Oct 02 11:17:31 crc kubenswrapper[4766]: I1002 11:17:31.248357 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d0992e9f-7b92-48af-bd2e-69734c0c8dfd","Type":"ContainerStarted","Data":"648a3f943daa549207cb18d595afcc56ec41a0ff823cc259ae9b188da8f6ae27"} Oct 02 11:17:31 crc kubenswrapper[4766]: I1002 11:17:31.250058 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b5ac374-df46-4a36-947d-de07af25426c","Type":"ContainerStarted","Data":"e7e7c6bcc90173226bb27fb9e8539546111c6ddeb92e1200c4fa61b8a055e508"} Oct 02 11:17:31 crc kubenswrapper[4766]: I1002 11:17:31.250098 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b5ac374-df46-4a36-947d-de07af25426c","Type":"ContainerStarted","Data":"9528ae3cc414cdaf93226f68c3078cc8943cd1ba76abcd03af19c848321b06b5"} Oct 02 11:17:31 crc kubenswrapper[4766]: I1002 11:17:31.271251 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.271229502 podStartE2EDuration="2.271229502s" podCreationTimestamp="2025-10-02 11:17:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:31.264345551 +0000 UTC m=+1566.207216505" watchObservedRunningTime="2025-10-02 
11:17:31.271229502 +0000 UTC m=+1566.214100446" Oct 02 11:17:32 crc kubenswrapper[4766]: I1002 11:17:32.261198 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b5ac374-df46-4a36-947d-de07af25426c","Type":"ContainerStarted","Data":"6384876206ceb62e698ffec3b4534e9b368119d727684623f371cd380375cf5c"} Oct 02 11:17:32 crc kubenswrapper[4766]: I1002 11:17:32.690523 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65" Oct 02 11:17:32 crc kubenswrapper[4766]: I1002 11:17:32.747205 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-fmk4f"] Oct 02 11:17:32 crc kubenswrapper[4766]: I1002 11:17:32.747521 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" podUID="d260f950-8815-4b40-b2f7-5a27fca9690d" containerName="dnsmasq-dns" containerID="cri-o://64f328e55f1eb3dace68499ae2f4a77afaef7496ecfc3efc0ef9ce8d165784dc" gracePeriod=10 Oct 02 11:17:32 crc kubenswrapper[4766]: I1002 11:17:32.846110 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:32 crc kubenswrapper[4766]: I1002 11:17:32.865834 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.216945 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.249225 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-config\") pod \"d260f950-8815-4b40-b2f7-5a27fca9690d\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.249320 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-ovsdbserver-sb\") pod \"d260f950-8815-4b40-b2f7-5a27fca9690d\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.249547 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-dns-swift-storage-0\") pod \"d260f950-8815-4b40-b2f7-5a27fca9690d\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.249623 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-ovsdbserver-nb\") pod \"d260f950-8815-4b40-b2f7-5a27fca9690d\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.249644 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n5vr\" (UniqueName: \"kubernetes.io/projected/d260f950-8815-4b40-b2f7-5a27fca9690d-kube-api-access-4n5vr\") pod \"d260f950-8815-4b40-b2f7-5a27fca9690d\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.249666 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-dns-svc\") pod \"d260f950-8815-4b40-b2f7-5a27fca9690d\" (UID: \"d260f950-8815-4b40-b2f7-5a27fca9690d\") " Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.256300 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d260f950-8815-4b40-b2f7-5a27fca9690d-kube-api-access-4n5vr" (OuterVolumeSpecName: "kube-api-access-4n5vr") pod "d260f950-8815-4b40-b2f7-5a27fca9690d" (UID: "d260f950-8815-4b40-b2f7-5a27fca9690d"). InnerVolumeSpecName "kube-api-access-4n5vr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.287384 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b5ac374-df46-4a36-947d-de07af25426c","Type":"ContainerStarted","Data":"76a65e0801ebae9343bc82e73244baaf4bf39324be0c967cb15033840367ed33"} Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.291404 4766 generic.go:334] "Generic (PLEG): container finished" podID="d260f950-8815-4b40-b2f7-5a27fca9690d" containerID="64f328e55f1eb3dace68499ae2f4a77afaef7496ecfc3efc0ef9ce8d165784dc" exitCode=0 Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.291486 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.291546 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" event={"ID":"d260f950-8815-4b40-b2f7-5a27fca9690d","Type":"ContainerDied","Data":"64f328e55f1eb3dace68499ae2f4a77afaef7496ecfc3efc0ef9ce8d165784dc"} Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.291580 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-fmk4f" event={"ID":"d260f950-8815-4b40-b2f7-5a27fca9690d","Type":"ContainerDied","Data":"7322eaa43fb6c2ec5f705855324843ac52fa37fd7bfa23a73a366c72060bdad3"} Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.291605 4766 scope.go:117] "RemoveContainer" containerID="64f328e55f1eb3dace68499ae2f4a77afaef7496ecfc3efc0ef9ce8d165784dc" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.317020 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.320832 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d260f950-8815-4b40-b2f7-5a27fca9690d" (UID: "d260f950-8815-4b40-b2f7-5a27fca9690d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.321887 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d260f950-8815-4b40-b2f7-5a27fca9690d" (UID: "d260f950-8815-4b40-b2f7-5a27fca9690d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.348443 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d260f950-8815-4b40-b2f7-5a27fca9690d" (UID: "d260f950-8815-4b40-b2f7-5a27fca9690d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.352275 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.352298 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4n5vr\" (UniqueName: \"kubernetes.io/projected/d260f950-8815-4b40-b2f7-5a27fca9690d-kube-api-access-4n5vr\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.352310 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.352319 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.359010 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-config" (OuterVolumeSpecName: "config") pod "d260f950-8815-4b40-b2f7-5a27fca9690d" (UID: "d260f950-8815-4b40-b2f7-5a27fca9690d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.379778 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d260f950-8815-4b40-b2f7-5a27fca9690d" (UID: "d260f950-8815-4b40-b2f7-5a27fca9690d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.453787 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.453831 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d260f950-8815-4b40-b2f7-5a27fca9690d-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.457061 4766 scope.go:117] "RemoveContainer" containerID="2758ac0da3918349e774e56df0a688cfe547b1d5b11b4b63da67dfac7a2ee514" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.480039 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-qxksg"] Oct 02 11:17:33 crc kubenswrapper[4766]: E1002 11:17:33.480370 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d260f950-8815-4b40-b2f7-5a27fca9690d" containerName="dnsmasq-dns" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.480381 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d260f950-8815-4b40-b2f7-5a27fca9690d" containerName="dnsmasq-dns" Oct 02 11:17:33 crc kubenswrapper[4766]: E1002 11:17:33.480418 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d260f950-8815-4b40-b2f7-5a27fca9690d" containerName="init" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.480424 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d260f950-8815-4b40-b2f7-5a27fca9690d" containerName="init" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.480967 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d260f950-8815-4b40-b2f7-5a27fca9690d" containerName="dnsmasq-dns" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.481556 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qxksg" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.484538 4766 scope.go:117] "RemoveContainer" containerID="64f328e55f1eb3dace68499ae2f4a77afaef7496ecfc3efc0ef9ce8d165784dc" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.484574 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 02 11:17:33 crc kubenswrapper[4766]: E1002 11:17:33.485520 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64f328e55f1eb3dace68499ae2f4a77afaef7496ecfc3efc0ef9ce8d165784dc\": container with ID starting with 64f328e55f1eb3dace68499ae2f4a77afaef7496ecfc3efc0ef9ce8d165784dc not found: ID does not exist" containerID="64f328e55f1eb3dace68499ae2f4a77afaef7496ecfc3efc0ef9ce8d165784dc" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.485561 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f328e55f1eb3dace68499ae2f4a77afaef7496ecfc3efc0ef9ce8d165784dc"} err="failed to get container status \"64f328e55f1eb3dace68499ae2f4a77afaef7496ecfc3efc0ef9ce8d165784dc\": rpc error: code = NotFound desc = could not find container \"64f328e55f1eb3dace68499ae2f4a77afaef7496ecfc3efc0ef9ce8d165784dc\": container with ID starting with 64f328e55f1eb3dace68499ae2f4a77afaef7496ecfc3efc0ef9ce8d165784dc not found: ID does not exist" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.485595 4766 scope.go:117] "RemoveContainer" containerID="2758ac0da3918349e774e56df0a688cfe547b1d5b11b4b63da67dfac7a2ee514" Oct 02 11:17:33 crc kubenswrapper[4766]: E1002 11:17:33.486211 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2758ac0da3918349e774e56df0a688cfe547b1d5b11b4b63da67dfac7a2ee514\": container with ID starting with 2758ac0da3918349e774e56df0a688cfe547b1d5b11b4b63da67dfac7a2ee514 not found: ID does not exist" containerID="2758ac0da3918349e774e56df0a688cfe547b1d5b11b4b63da67dfac7a2ee514" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.486268 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2758ac0da3918349e774e56df0a688cfe547b1d5b11b4b63da67dfac7a2ee514"} err="failed to get container status \"2758ac0da3918349e774e56df0a688cfe547b1d5b11b4b63da67dfac7a2ee514\": rpc error: code = NotFound desc = could not find container \"2758ac0da3918349e774e56df0a688cfe547b1d5b11b4b63da67dfac7a2ee514\": container with ID starting with 2758ac0da3918349e774e56df0a688cfe547b1d5b11b4b63da67dfac7a2ee514 not found: ID does not exist" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.490436 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.493851 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-qxksg"] Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.554954 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-config-data\") pod \"nova-cell1-cell-mapping-qxksg\" (UID: \"da4d9512-afe1-45d8-ba7c-77c398a7955d\") " pod="openstack/nova-cell1-cell-mapping-qxksg" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.554996 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-qxksg\" (UID: \"da4d9512-afe1-45d8-ba7c-77c398a7955d\") " pod="openstack/nova-cell1-cell-mapping-qxksg" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.555595 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-scripts\") pod \"nova-cell1-cell-mapping-qxksg\" (UID: \"da4d9512-afe1-45d8-ba7c-77c398a7955d\") " pod="openstack/nova-cell1-cell-mapping-qxksg" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.555694 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hljx7\" (UniqueName: \"kubernetes.io/projected/da4d9512-afe1-45d8-ba7c-77c398a7955d-kube-api-access-hljx7\") pod \"nova-cell1-cell-mapping-qxksg\" (UID: \"da4d9512-afe1-45d8-ba7c-77c398a7955d\") " pod="openstack/nova-cell1-cell-mapping-qxksg" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.641554 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-fmk4f"] Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.655911 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-fmk4f"] Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.656990 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-scripts\") pod \"nova-cell1-cell-mapping-qxksg\" (UID: \"da4d9512-afe1-45d8-ba7c-77c398a7955d\") " pod="openstack/nova-cell1-cell-mapping-qxksg" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.657039 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hljx7\" (UniqueName: \"kubernetes.io/projected/da4d9512-afe1-45d8-ba7c-77c398a7955d-kube-api-access-hljx7\") pod \"nova-cell1-cell-mapping-qxksg\" (UID: \"da4d9512-afe1-45d8-ba7c-77c398a7955d\") " pod="openstack/nova-cell1-cell-mapping-qxksg" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.657166 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-config-data\") pod \"nova-cell1-cell-mapping-qxksg\" (UID: \"da4d9512-afe1-45d8-ba7c-77c398a7955d\") " pod="openstack/nova-cell1-cell-mapping-qxksg" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.657197 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-qxksg\" (UID: \"da4d9512-afe1-45d8-ba7c-77c398a7955d\") " pod="openstack/nova-cell1-cell-mapping-qxksg" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.661373 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-qxksg\" (UID: \"da4d9512-afe1-45d8-ba7c-77c398a7955d\") " pod="openstack/nova-cell1-cell-mapping-qxksg" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.661425 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-config-data\") pod \"nova-cell1-cell-mapping-qxksg\" (UID: \"da4d9512-afe1-45d8-ba7c-77c398a7955d\") " pod="openstack/nova-cell1-cell-mapping-qxksg" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.671910 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-scripts\") pod \"nova-cell1-cell-mapping-qxksg\" (UID: \"da4d9512-afe1-45d8-ba7c-77c398a7955d\") " pod="openstack/nova-cell1-cell-mapping-qxksg" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.678091 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hljx7\" (UniqueName: \"kubernetes.io/projected/da4d9512-afe1-45d8-ba7c-77c398a7955d-kube-api-access-hljx7\") pod \"nova-cell1-cell-mapping-qxksg\" (UID: \"da4d9512-afe1-45d8-ba7c-77c398a7955d\") " pod="openstack/nova-cell1-cell-mapping-qxksg" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.824439 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qxksg" Oct 02 11:17:33 crc kubenswrapper[4766]: I1002 11:17:33.900934 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d260f950-8815-4b40-b2f7-5a27fca9690d" path="/var/lib/kubelet/pods/d260f950-8815-4b40-b2f7-5a27fca9690d/volumes" Oct 02 11:17:34 crc kubenswrapper[4766]: I1002 11:17:34.275712 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-qxksg"] Oct 02 11:17:34 crc kubenswrapper[4766]: I1002 11:17:34.318805 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qxksg" event={"ID":"da4d9512-afe1-45d8-ba7c-77c398a7955d","Type":"ContainerStarted","Data":"0ff9905c22b0aff7cea40974939513b8952c46e0a3006ac0e37a4ce141c74035"} Oct 02 11:17:34 crc kubenswrapper[4766]: I1002 11:17:34.321893 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b5ac374-df46-4a36-947d-de07af25426c","Type":"ContainerStarted","Data":"56bd28360a526e730981b495f3d32920b04b2ee9b375daef8362b798d8067eaf"} Oct 02 11:17:34 crc kubenswrapper[4766]: I1002 11:17:34.323126 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 11:17:34 crc kubenswrapper[4766]: I1002 11:17:34.350958 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.860570174 podStartE2EDuration="5.350937274s" podCreationTimestamp="2025-10-02 11:17:29 +0000 UTC" firstStartedPulling="2025-10-02 11:17:30.3738325 +0000 UTC m=+1565.316703444" lastFinishedPulling="2025-10-02 11:17:33.8641996 +0000 UTC m=+1568.807070544" observedRunningTime="2025-10-02 11:17:34.340580422 +0000 UTC m=+1569.283451366" watchObservedRunningTime="2025-10-02 11:17:34.350937274 +0000 UTC m=+1569.293808218" Oct 02 11:17:35 crc kubenswrapper[4766]: I1002 11:17:35.335609 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qxksg" event={"ID":"da4d9512-afe1-45d8-ba7c-77c398a7955d","Type":"ContainerStarted","Data":"e79b5f1b193fe730d66df4b38d682b12876fcbb70e51c1029f4b2b498b6ededd"} Oct 02 11:17:35 crc kubenswrapper[4766]: I1002 11:17:35.355202 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-qxksg" podStartSLOduration=2.355183843 podStartE2EDuration="2.355183843s" 
podCreationTimestamp="2025-10-02 11:17:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:35.354871263 +0000 UTC m=+1570.297742207" watchObservedRunningTime="2025-10-02 11:17:35.355183843 +0000 UTC m=+1570.298054787" Oct 02 11:17:36 crc kubenswrapper[4766]: I1002 11:17:36.881557 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:17:36 crc kubenswrapper[4766]: E1002 11:17:36.882183 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:17:39 crc kubenswrapper[4766]: I1002 11:17:39.373731 4766 generic.go:334] "Generic (PLEG): container finished" podID="da4d9512-afe1-45d8-ba7c-77c398a7955d" containerID="e79b5f1b193fe730d66df4b38d682b12876fcbb70e51c1029f4b2b498b6ededd" exitCode=0 Oct 02 11:17:39 crc kubenswrapper[4766]: I1002 11:17:39.373833 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qxksg" event={"ID":"da4d9512-afe1-45d8-ba7c-77c398a7955d","Type":"ContainerDied","Data":"e79b5f1b193fe730d66df4b38d682b12876fcbb70e51c1029f4b2b498b6ededd"} Oct 02 11:17:39 crc kubenswrapper[4766]: I1002 11:17:39.756237 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:17:39 crc kubenswrapper[4766]: I1002 11:17:39.756296 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:17:40 crc kubenswrapper[4766]: I1002 11:17:40.749083 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qxksg" Oct 02 11:17:40 crc kubenswrapper[4766]: I1002 11:17:40.768707 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d0992e9f-7b92-48af-bd2e-69734c0c8dfd" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:17:40 crc kubenswrapper[4766]: I1002 11:17:40.768728 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d0992e9f-7b92-48af-bd2e-69734c0c8dfd" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:17:40 crc kubenswrapper[4766]: I1002 11:17:40.937205 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-scripts\") pod \"da4d9512-afe1-45d8-ba7c-77c398a7955d\" (UID: \"da4d9512-afe1-45d8-ba7c-77c398a7955d\") " Oct 02 11:17:40 crc kubenswrapper[4766]: I1002 11:17:40.937372 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-combined-ca-bundle\") pod \"da4d9512-afe1-45d8-ba7c-77c398a7955d\" (UID: \"da4d9512-afe1-45d8-ba7c-77c398a7955d\") " Oct 02 11:17:40 crc kubenswrapper[4766]: I1002 11:17:40.937442 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-config-data\") pod \"da4d9512-afe1-45d8-ba7c-77c398a7955d\" (UID: \"da4d9512-afe1-45d8-ba7c-77c398a7955d\") " Oct 02 11:17:40 crc kubenswrapper[4766]: I1002 11:17:40.937475 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hljx7\" (UniqueName: \"kubernetes.io/projected/da4d9512-afe1-45d8-ba7c-77c398a7955d-kube-api-access-hljx7\") pod \"da4d9512-afe1-45d8-ba7c-77c398a7955d\" (UID: \"da4d9512-afe1-45d8-ba7c-77c398a7955d\") " Oct 02 11:17:40 crc kubenswrapper[4766]: I1002 11:17:40.943189 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-scripts" (OuterVolumeSpecName: "scripts") pod "da4d9512-afe1-45d8-ba7c-77c398a7955d" (UID: "da4d9512-afe1-45d8-ba7c-77c398a7955d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:40 crc kubenswrapper[4766]: I1002 11:17:40.943663 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da4d9512-afe1-45d8-ba7c-77c398a7955d-kube-api-access-hljx7" (OuterVolumeSpecName: "kube-api-access-hljx7") pod "da4d9512-afe1-45d8-ba7c-77c398a7955d" (UID: "da4d9512-afe1-45d8-ba7c-77c398a7955d"). InnerVolumeSpecName "kube-api-access-hljx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:40 crc kubenswrapper[4766]: I1002 11:17:40.964999 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da4d9512-afe1-45d8-ba7c-77c398a7955d" (UID: "da4d9512-afe1-45d8-ba7c-77c398a7955d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:40 crc kubenswrapper[4766]: I1002 11:17:40.968816 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-config-data" (OuterVolumeSpecName: "config-data") pod "da4d9512-afe1-45d8-ba7c-77c398a7955d" (UID: "da4d9512-afe1-45d8-ba7c-77c398a7955d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:41 crc kubenswrapper[4766]: I1002 11:17:41.040710 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hljx7\" (UniqueName: \"kubernetes.io/projected/da4d9512-afe1-45d8-ba7c-77c398a7955d-kube-api-access-hljx7\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:41 crc kubenswrapper[4766]: I1002 11:17:41.041262 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:41 crc kubenswrapper[4766]: I1002 11:17:41.041330 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:41 crc kubenswrapper[4766]: I1002 11:17:41.041353 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da4d9512-afe1-45d8-ba7c-77c398a7955d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:41 crc kubenswrapper[4766]: I1002 11:17:41.392766 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-qxksg" event={"ID":"da4d9512-afe1-45d8-ba7c-77c398a7955d","Type":"ContainerDied","Data":"0ff9905c22b0aff7cea40974939513b8952c46e0a3006ac0e37a4ce141c74035"} Oct 02 11:17:41 crc kubenswrapper[4766]: I1002 11:17:41.392799 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ff9905c22b0aff7cea40974939513b8952c46e0a3006ac0e37a4ce141c74035" Oct 02 11:17:41 crc kubenswrapper[4766]: I1002 11:17:41.392874 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-qxksg" Oct 02 11:17:41 crc kubenswrapper[4766]: I1002 11:17:41.580073 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:41 crc kubenswrapper[4766]: I1002 11:17:41.580387 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d0992e9f-7b92-48af-bd2e-69734c0c8dfd" containerName="nova-api-log" containerID="cri-o://df0bfc3f5c98293d3fa0e218566f25d00376650787827c5ad6d6711d318929c0" gracePeriod=30 Oct 02 11:17:41 crc kubenswrapper[4766]: I1002 11:17:41.580550 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d0992e9f-7b92-48af-bd2e-69734c0c8dfd" containerName="nova-api-api" containerID="cri-o://64dc6ab3ff732665409e85c5bc055f92613518acdef12f6479c6e0f4b4e38a2d" gracePeriod=30 Oct 02 11:17:41 crc kubenswrapper[4766]: I1002 11:17:41.606424 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:41 crc kubenswrapper[4766]: I1002 11:17:41.612751 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7" containerName="nova-scheduler-scheduler" containerID="cri-o://3c3f1237df38c8b6ccd32b780a793babbd91e3830b9cb19cf87c41863f677752" gracePeriod=30 Oct 02 11:17:41 crc kubenswrapper[4766]: I1002 11:17:41.642028 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:41 crc kubenswrapper[4766]: I1002 11:17:41.642765 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1d8b9b37-94aa-4611-b9aa-08d55c42987b" containerName="nova-metadata-log" containerID="cri-o://a504f0dbaa5b2e2221ad476c96d49c3ede95d35904c8dc12baa5c5f199157005" gracePeriod=30 Oct 02 11:17:41 crc kubenswrapper[4766]: I1002 11:17:41.642903 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1d8b9b37-94aa-4611-b9aa-08d55c42987b" containerName="nova-metadata-metadata" containerID="cri-o://0bc8ac2cb598eaa8aca752f18cd4ff6afdedadff6528d3ac77845ba3cd53f9dd" gracePeriod=30 Oct 02 11:17:42 crc kubenswrapper[4766]: I1002 11:17:42.402804 4766 generic.go:334] "Generic (PLEG): container finished" podID="d0992e9f-7b92-48af-bd2e-69734c0c8dfd" containerID="df0bfc3f5c98293d3fa0e218566f25d00376650787827c5ad6d6711d318929c0" exitCode=143 Oct 02 11:17:42 crc kubenswrapper[4766]: I1002 11:17:42.402879 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d0992e9f-7b92-48af-bd2e-69734c0c8dfd","Type":"ContainerDied","Data":"df0bfc3f5c98293d3fa0e218566f25d00376650787827c5ad6d6711d318929c0"} Oct 02 11:17:42 crc kubenswrapper[4766]: I1002 11:17:42.405079 4766 generic.go:334] "Generic (PLEG): container finished" podID="1d8b9b37-94aa-4611-b9aa-08d55c42987b" containerID="a504f0dbaa5b2e2221ad476c96d49c3ede95d35904c8dc12baa5c5f199157005" exitCode=143 Oct 02 11:17:42 crc kubenswrapper[4766]: I1002 11:17:42.405154 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d8b9b37-94aa-4611-b9aa-08d55c42987b","Type":"ContainerDied","Data":"a504f0dbaa5b2e2221ad476c96d49c3ede95d35904c8dc12baa5c5f199157005"} Oct 02 11:17:43 crc kubenswrapper[4766]: I1002 11:17:43.416140 4766 generic.go:334] "Generic (PLEG): container finished" podID="651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7" 
containerID="3c3f1237df38c8b6ccd32b780a793babbd91e3830b9cb19cf87c41863f677752" exitCode=0 Oct 02 11:17:43 crc kubenswrapper[4766]: I1002 11:17:43.416291 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7","Type":"ContainerDied","Data":"3c3f1237df38c8b6ccd32b780a793babbd91e3830b9cb19cf87c41863f677752"} Oct 02 11:17:43 crc kubenswrapper[4766]: I1002 11:17:43.416474 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7","Type":"ContainerDied","Data":"4ed345a3119bdc9f5fc79c2fa7e147ad7c93cca94448758ed5a48df4771962c2"} Oct 02 11:17:43 crc kubenswrapper[4766]: I1002 11:17:43.416493 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ed345a3119bdc9f5fc79c2fa7e147ad7c93cca94448758ed5a48df4771962c2" Oct 02 11:17:43 crc kubenswrapper[4766]: I1002 11:17:43.433819 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:17:43 crc kubenswrapper[4766]: I1002 11:17:43.592606 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-config-data\") pod \"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7\" (UID: \"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7\") " Oct 02 11:17:43 crc kubenswrapper[4766]: I1002 11:17:43.592814 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds9sb\" (UniqueName: \"kubernetes.io/projected/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-kube-api-access-ds9sb\") pod \"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7\" (UID: \"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7\") " Oct 02 11:17:43 crc kubenswrapper[4766]: I1002 11:17:43.592940 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-combined-ca-bundle\") pod \"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7\" (UID: \"651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7\") " Oct 02 11:17:43 crc kubenswrapper[4766]: I1002 11:17:43.598736 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-kube-api-access-ds9sb" (OuterVolumeSpecName: "kube-api-access-ds9sb") pod "651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7" (UID: "651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7"). InnerVolumeSpecName "kube-api-access-ds9sb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:43 crc kubenswrapper[4766]: I1002 11:17:43.618354 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-config-data" (OuterVolumeSpecName: "config-data") pod "651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7" (UID: "651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:43 crc kubenswrapper[4766]: I1002 11:17:43.620546 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7" (UID: "651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:43 crc kubenswrapper[4766]: I1002 11:17:43.694936 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:43 crc kubenswrapper[4766]: I1002 11:17:43.694987 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds9sb\" (UniqueName: \"kubernetes.io/projected/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-kube-api-access-ds9sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:43 crc kubenswrapper[4766]: I1002 11:17:43.695002 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.424463 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.445896 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.457881 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.469898 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:44 crc kubenswrapper[4766]: E1002 11:17:44.470426 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da4d9512-afe1-45d8-ba7c-77c398a7955d" containerName="nova-manage" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.470491 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="da4d9512-afe1-45d8-ba7c-77c398a7955d" containerName="nova-manage" Oct 02 11:17:44 crc kubenswrapper[4766]: E1002 11:17:44.470598 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7" containerName="nova-scheduler-scheduler" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.470645 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7" containerName="nova-scheduler-scheduler" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.470994 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7" containerName="nova-scheduler-scheduler" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.471071 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="da4d9512-afe1-45d8-ba7c-77c398a7955d" containerName="nova-manage" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.471694 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.476934 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.477146 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.612822 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g84r8\" (UniqueName: \"kubernetes.io/projected/7dec5495-e66b-4e5e-90b6-82ee673ab269-kube-api-access-g84r8\") pod \"nova-scheduler-0\" (UID: \"7dec5495-e66b-4e5e-90b6-82ee673ab269\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.613166 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dec5495-e66b-4e5e-90b6-82ee673ab269-config-data\") pod \"nova-scheduler-0\" (UID: \"7dec5495-e66b-4e5e-90b6-82ee673ab269\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.613188 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dec5495-e66b-4e5e-90b6-82ee673ab269-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7dec5495-e66b-4e5e-90b6-82ee673ab269\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.714562 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g84r8\" (UniqueName: \"kubernetes.io/projected/7dec5495-e66b-4e5e-90b6-82ee673ab269-kube-api-access-g84r8\") pod \"nova-scheduler-0\" (UID: \"7dec5495-e66b-4e5e-90b6-82ee673ab269\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.714874 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dec5495-e66b-4e5e-90b6-82ee673ab269-config-data\") pod \"nova-scheduler-0\" (UID: \"7dec5495-e66b-4e5e-90b6-82ee673ab269\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.714986 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dec5495-e66b-4e5e-90b6-82ee673ab269-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7dec5495-e66b-4e5e-90b6-82ee673ab269\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.718600 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dec5495-e66b-4e5e-90b6-82ee673ab269-config-data\") pod \"nova-scheduler-0\" (UID: \"7dec5495-e66b-4e5e-90b6-82ee673ab269\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.721218 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dec5495-e66b-4e5e-90b6-82ee673ab269-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7dec5495-e66b-4e5e-90b6-82ee673ab269\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.731318 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g84r8\" (UniqueName: 
\"kubernetes.io/projected/7dec5495-e66b-4e5e-90b6-82ee673ab269-kube-api-access-g84r8\") pod \"nova-scheduler-0\" (UID: \"7dec5495-e66b-4e5e-90b6-82ee673ab269\") " pod="openstack/nova-scheduler-0" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.769749 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1d8b9b37-94aa-4611-b9aa-08d55c42987b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:57726->10.217.0.193:8775: read: connection reset by peer" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.769753 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1d8b9b37-94aa-4611-b9aa-08d55c42987b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:57718->10.217.0.193:8775: read: connection reset by peer" Oct 02 11:17:44 crc kubenswrapper[4766]: I1002 11:17:44.793067 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.219020 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.240976 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.330334 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-nova-metadata-tls-certs\") pod \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.330415 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d8b9b37-94aa-4611-b9aa-08d55c42987b-logs\") pod \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.330530 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrcpt\" (UniqueName: \"kubernetes.io/projected/1d8b9b37-94aa-4611-b9aa-08d55c42987b-kube-api-access-jrcpt\") pod \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.330574 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-combined-ca-bundle\") pod \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.330645 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-config-data\") pod \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\" (UID: \"1d8b9b37-94aa-4611-b9aa-08d55c42987b\") " Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.331182 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d8b9b37-94aa-4611-b9aa-08d55c42987b-logs" (OuterVolumeSpecName: "logs") pod "1d8b9b37-94aa-4611-b9aa-08d55c42987b" (UID: 
"1d8b9b37-94aa-4611-b9aa-08d55c42987b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.333828 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d8b9b37-94aa-4611-b9aa-08d55c42987b-kube-api-access-jrcpt" (OuterVolumeSpecName: "kube-api-access-jrcpt") pod "1d8b9b37-94aa-4611-b9aa-08d55c42987b" (UID: "1d8b9b37-94aa-4611-b9aa-08d55c42987b"). InnerVolumeSpecName "kube-api-access-jrcpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.370555 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-config-data" (OuterVolumeSpecName: "config-data") pod "1d8b9b37-94aa-4611-b9aa-08d55c42987b" (UID: "1d8b9b37-94aa-4611-b9aa-08d55c42987b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.409616 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d8b9b37-94aa-4611-b9aa-08d55c42987b" (UID: "1d8b9b37-94aa-4611-b9aa-08d55c42987b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.417445 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "1d8b9b37-94aa-4611-b9aa-08d55c42987b" (UID: "1d8b9b37-94aa-4611-b9aa-08d55c42987b"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.432640 4766 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.432669 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d8b9b37-94aa-4611-b9aa-08d55c42987b-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.432710 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrcpt\" (UniqueName: \"kubernetes.io/projected/1d8b9b37-94aa-4611-b9aa-08d55c42987b-kube-api-access-jrcpt\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.432721 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.432729 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d8b9b37-94aa-4611-b9aa-08d55c42987b-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.433810 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7dec5495-e66b-4e5e-90b6-82ee673ab269","Type":"ContainerStarted","Data":"cecd0986302cf9ff4c82dc99a37140782713856423ffb4e9875e68db83729200"} Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.436204 4766 generic.go:334] "Generic (PLEG): container finished" podID="1d8b9b37-94aa-4611-b9aa-08d55c42987b" containerID="0bc8ac2cb598eaa8aca752f18cd4ff6afdedadff6528d3ac77845ba3cd53f9dd" exitCode=0 Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.436247 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.436252 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d8b9b37-94aa-4611-b9aa-08d55c42987b","Type":"ContainerDied","Data":"0bc8ac2cb598eaa8aca752f18cd4ff6afdedadff6528d3ac77845ba3cd53f9dd"} Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.436282 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1d8b9b37-94aa-4611-b9aa-08d55c42987b","Type":"ContainerDied","Data":"8ec81440daa9ffd72f347c06a028421425339bf133107a5b7404adf24b9fb3b7"} Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.436300 4766 scope.go:117] "RemoveContainer" containerID="0bc8ac2cb598eaa8aca752f18cd4ff6afdedadff6528d3ac77845ba3cd53f9dd" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.463325 4766 scope.go:117] "RemoveContainer" containerID="a504f0dbaa5b2e2221ad476c96d49c3ede95d35904c8dc12baa5c5f199157005" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.470318 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.483555 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.484278 4766 scope.go:117] "RemoveContainer" containerID="0bc8ac2cb598eaa8aca752f18cd4ff6afdedadff6528d3ac77845ba3cd53f9dd" Oct 02 11:17:45 crc kubenswrapper[4766]: E1002 11:17:45.484873 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bc8ac2cb598eaa8aca752f18cd4ff6afdedadff6528d3ac77845ba3cd53f9dd\": container with ID starting with 0bc8ac2cb598eaa8aca752f18cd4ff6afdedadff6528d3ac77845ba3cd53f9dd not found: ID does not exist" containerID="0bc8ac2cb598eaa8aca752f18cd4ff6afdedadff6528d3ac77845ba3cd53f9dd" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.484920 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bc8ac2cb598eaa8aca752f18cd4ff6afdedadff6528d3ac77845ba3cd53f9dd"} err="failed to get container status \"0bc8ac2cb598eaa8aca752f18cd4ff6afdedadff6528d3ac77845ba3cd53f9dd\": rpc error: code = NotFound desc = could not find container \"0bc8ac2cb598eaa8aca752f18cd4ff6afdedadff6528d3ac77845ba3cd53f9dd\": container with ID starting with 0bc8ac2cb598eaa8aca752f18cd4ff6afdedadff6528d3ac77845ba3cd53f9dd not found: ID does not exist" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.484951 4766 scope.go:117] "RemoveContainer" containerID="a504f0dbaa5b2e2221ad476c96d49c3ede95d35904c8dc12baa5c5f199157005" Oct 02 11:17:45 crc kubenswrapper[4766]: E1002 11:17:45.487128 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a504f0dbaa5b2e2221ad476c96d49c3ede95d35904c8dc12baa5c5f199157005\": container with ID starting with a504f0dbaa5b2e2221ad476c96d49c3ede95d35904c8dc12baa5c5f199157005 not found: ID does not exist" containerID="a504f0dbaa5b2e2221ad476c96d49c3ede95d35904c8dc12baa5c5f199157005" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.487166 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a504f0dbaa5b2e2221ad476c96d49c3ede95d35904c8dc12baa5c5f199157005"} err="failed to get container status \"a504f0dbaa5b2e2221ad476c96d49c3ede95d35904c8dc12baa5c5f199157005\": rpc error: code = 
Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.495028 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:45 crc kubenswrapper[4766]: E1002 11:17:45.495561 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d8b9b37-94aa-4611-b9aa-08d55c42987b" containerName="nova-metadata-metadata" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.495586 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d8b9b37-94aa-4611-b9aa-08d55c42987b" containerName="nova-metadata-metadata" Oct 02 11:17:45 crc kubenswrapper[4766]: E1002 11:17:45.495628 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d8b9b37-94aa-4611-b9aa-08d55c42987b" containerName="nova-metadata-log" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.495636 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d8b9b37-94aa-4611-b9aa-08d55c42987b" containerName="nova-metadata-log" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.496052 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d8b9b37-94aa-4611-b9aa-08d55c42987b" containerName="nova-metadata-log" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.496093 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d8b9b37-94aa-4611-b9aa-08d55c42987b" containerName="nova-metadata-metadata" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.497323 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.499008 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.499400 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.506169 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.533797 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmkdr\" (UniqueName: \"kubernetes.io/projected/bbb13294-05c1-4a20-8265-5144efcd91cf-kube-api-access-rmkdr\") pod \"nova-metadata-0\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.533910 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbb13294-05c1-4a20-8265-5144efcd91cf-logs\") pod \"nova-metadata-0\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.533944 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-config-data\") pod \"nova-metadata-0\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.533975 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.534194 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.635750 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.635820 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmkdr\" (UniqueName: \"kubernetes.io/projected/bbb13294-05c1-4a20-8265-5144efcd91cf-kube-api-access-rmkdr\") pod \"nova-metadata-0\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.635904 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbb13294-05c1-4a20-8265-5144efcd91cf-logs\") pod \"nova-metadata-0\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.635941 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-config-data\") pod \"nova-metadata-0\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.635969 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.636785 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbb13294-05c1-4a20-8265-5144efcd91cf-logs\") pod \"nova-metadata-0\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.639129 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.641084 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " 
pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.641982 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-config-data\") pod \"nova-metadata-0\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.655345 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmkdr\" (UniqueName: \"kubernetes.io/projected/bbb13294-05c1-4a20-8265-5144efcd91cf-kube-api-access-rmkdr\") pod \"nova-metadata-0\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.821307 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.896227 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d8b9b37-94aa-4611-b9aa-08d55c42987b" path="/var/lib/kubelet/pods/1d8b9b37-94aa-4611-b9aa-08d55c42987b/volumes" Oct 02 11:17:45 crc kubenswrapper[4766]: I1002 11:17:45.896948 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7" path="/var/lib/kubelet/pods/651ab82f-8cd9-49a4-a3d8-5f2ccf2595b7/volumes" Oct 02 11:17:46 crc kubenswrapper[4766]: I1002 11:17:46.253313 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:17:46 crc kubenswrapper[4766]: W1002 11:17:46.253998 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbb13294_05c1_4a20_8265_5144efcd91cf.slice/crio-fcfd252f8c9de1f953452d449be9ba00a0d24514484b0499918450f28b305d8f WatchSource:0}: Error finding container fcfd252f8c9de1f953452d449be9ba00a0d24514484b0499918450f28b305d8f: Status 404 returned error can't find the container with id fcfd252f8c9de1f953452d449be9ba00a0d24514484b0499918450f28b305d8f Oct 02 11:17:46 crc kubenswrapper[4766]: I1002 11:17:46.445962 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bbb13294-05c1-4a20-8265-5144efcd91cf","Type":"ContainerStarted","Data":"bd5acfd9f8b6e410882799f1e42f4fb64c5bdc13db56d36a3ca2d32eb57ddec0"} Oct 02 11:17:46 crc kubenswrapper[4766]: I1002 11:17:46.446022 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bbb13294-05c1-4a20-8265-5144efcd91cf","Type":"ContainerStarted","Data":"fcfd252f8c9de1f953452d449be9ba00a0d24514484b0499918450f28b305d8f"} Oct 02 11:17:46 crc kubenswrapper[4766]: I1002 11:17:46.447959 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7dec5495-e66b-4e5e-90b6-82ee673ab269","Type":"ContainerStarted","Data":"07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936"} Oct 02 11:17:46 crc kubenswrapper[4766]: I1002 11:17:46.450984 4766 generic.go:334] "Generic (PLEG): container finished" podID="d0992e9f-7b92-48af-bd2e-69734c0c8dfd" containerID="64dc6ab3ff732665409e85c5bc055f92613518acdef12f6479c6e0f4b4e38a2d" exitCode=0 Oct 02 11:17:46 crc kubenswrapper[4766]: I1002 11:17:46.451042 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d0992e9f-7b92-48af-bd2e-69734c0c8dfd","Type":"ContainerDied","Data":"64dc6ab3ff732665409e85c5bc055f92613518acdef12f6479c6e0f4b4e38a2d"} Oct 02 11:17:46 crc kubenswrapper[4766]: I1002 11:17:46.923999 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:46 crc kubenswrapper[4766]: I1002 11:17:46.955054 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.954994336 podStartE2EDuration="2.954994336s" podCreationTimestamp="2025-10-02 11:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:46.467621474 +0000 UTC m=+1581.410492418" watchObservedRunningTime="2025-10-02 11:17:46.954994336 +0000 UTC m=+1581.897865270" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.055114 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-config-data\") pod \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.055298 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qchw\" (UniqueName: \"kubernetes.io/projected/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-kube-api-access-8qchw\") pod \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.055637 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-public-tls-certs\") pod \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.057152 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-logs\") pod \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.057228 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-internal-tls-certs\") pod \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.057422 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-combined-ca-bundle\") pod \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\" (UID: \"d0992e9f-7b92-48af-bd2e-69734c0c8dfd\") " Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.057692 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-logs" (OuterVolumeSpecName: "logs") pod "d0992e9f-7b92-48af-bd2e-69734c0c8dfd" (UID: "d0992e9f-7b92-48af-bd2e-69734c0c8dfd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.058701 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.060771 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-kube-api-access-8qchw" (OuterVolumeSpecName: "kube-api-access-8qchw") pod "d0992e9f-7b92-48af-bd2e-69734c0c8dfd" (UID: "d0992e9f-7b92-48af-bd2e-69734c0c8dfd"). InnerVolumeSpecName "kube-api-access-8qchw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.097815 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0992e9f-7b92-48af-bd2e-69734c0c8dfd" (UID: "d0992e9f-7b92-48af-bd2e-69734c0c8dfd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.103492 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d0992e9f-7b92-48af-bd2e-69734c0c8dfd" (UID: "d0992e9f-7b92-48af-bd2e-69734c0c8dfd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.123829 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-config-data" (OuterVolumeSpecName: "config-data") pod "d0992e9f-7b92-48af-bd2e-69734c0c8dfd" (UID: "d0992e9f-7b92-48af-bd2e-69734c0c8dfd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.135795 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d0992e9f-7b92-48af-bd2e-69734c0c8dfd" (UID: "d0992e9f-7b92-48af-bd2e-69734c0c8dfd"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.161082 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.161120 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.161131 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.161140 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.161149 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qchw\" (UniqueName: \"kubernetes.io/projected/d0992e9f-7b92-48af-bd2e-69734c0c8dfd-kube-api-access-8qchw\") on node \"crc\" DevicePath \"\"" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.464105 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bbb13294-05c1-4a20-8265-5144efcd91cf","Type":"ContainerStarted","Data":"f6cd0095fa2a61271a1c6b2812af96732964a92ed3e96a78c81919c9ef11e724"} Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.467403 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d0992e9f-7b92-48af-bd2e-69734c0c8dfd","Type":"ContainerDied","Data":"648a3f943daa549207cb18d595afcc56ec41a0ff823cc259ae9b188da8f6ae27"} Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.467457 4766 scope.go:117] "RemoveContainer" containerID="64dc6ab3ff732665409e85c5bc055f92613518acdef12f6479c6e0f4b4e38a2d" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.467491 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.488223 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.488200056 podStartE2EDuration="2.488200056s" podCreationTimestamp="2025-10-02 11:17:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:47.485965214 +0000 UTC m=+1582.428836158" watchObservedRunningTime="2025-10-02 11:17:47.488200056 +0000 UTC m=+1582.431071010" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.496090 4766 scope.go:117] "RemoveContainer" containerID="df0bfc3f5c98293d3fa0e218566f25d00376650787827c5ad6d6711d318929c0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.521255 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.530175 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.537838 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:47 crc kubenswrapper[4766]: E1002 11:17:47.538393 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0992e9f-7b92-48af-bd2e-69734c0c8dfd" containerName="nova-api-log" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.538421 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0992e9f-7b92-48af-bd2e-69734c0c8dfd" containerName="nova-api-log" Oct 02 11:17:47 crc kubenswrapper[4766]: E1002 11:17:47.538447 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0992e9f-7b92-48af-bd2e-69734c0c8dfd" containerName="nova-api-api" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.538456 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0992e9f-7b92-48af-bd2e-69734c0c8dfd" containerName="nova-api-api" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.538687 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0992e9f-7b92-48af-bd2e-69734c0c8dfd" containerName="nova-api-log" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.538715 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0992e9f-7b92-48af-bd2e-69734c0c8dfd" containerName="nova-api-api" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.540064 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.542047 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.543218 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.543891 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.549537 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.669967 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.670083 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b869942b-07a4-4a08-b312-2b09cee2abf1-logs\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.670138 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-public-tls-certs\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.670343 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-config-data\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.670599 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.670633 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9fz2\" (UniqueName: \"kubernetes.io/projected/b869942b-07a4-4a08-b312-2b09cee2abf1-kube-api-access-m9fz2\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.772543 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.772613 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9fz2\" (UniqueName: \"kubernetes.io/projected/b869942b-07a4-4a08-b312-2b09cee2abf1-kube-api-access-m9fz2\") 
pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.772670 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.772696 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b869942b-07a4-4a08-b312-2b09cee2abf1-logs\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.772757 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-public-tls-certs\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.772848 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-config-data\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.773985 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b869942b-07a4-4a08-b312-2b09cee2abf1-logs\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.783586 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-public-tls-certs\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.783735 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-config-data\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.786682 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.793382 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.795423 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9fz2\" (UniqueName: \"kubernetes.io/projected/b869942b-07a4-4a08-b312-2b09cee2abf1-kube-api-access-m9fz2\") pod \"nova-api-0\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " pod="openstack/nova-api-0" Oct 
02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.864245 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.890140 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:17:47 crc kubenswrapper[4766]: E1002 11:17:47.890428 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:17:47 crc kubenswrapper[4766]: I1002 11:17:47.900108 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0992e9f-7b92-48af-bd2e-69734c0c8dfd" path="/var/lib/kubelet/pods/d0992e9f-7b92-48af-bd2e-69734c0c8dfd/volumes" Oct 02 11:17:48 crc kubenswrapper[4766]: I1002 11:17:48.297419 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:17:48 crc kubenswrapper[4766]: W1002 11:17:48.302797 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb869942b_07a4_4a08_b312_2b09cee2abf1.slice/crio-91dd9bbd80d43b9ac805fd99dc0edc144c30f8299ab2c8710b8b373644193427 WatchSource:0}: Error finding container 91dd9bbd80d43b9ac805fd99dc0edc144c30f8299ab2c8710b8b373644193427: Status 404 returned error can't find the container with id 91dd9bbd80d43b9ac805fd99dc0edc144c30f8299ab2c8710b8b373644193427 Oct 02 11:17:48 crc kubenswrapper[4766]: I1002 11:17:48.478396 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b869942b-07a4-4a08-b312-2b09cee2abf1","Type":"ContainerStarted","Data":"1aee3ffe3cc0a7ee677e4497600a6cf21df562e9f835edca4df6d4e59c78e1d2"} Oct 02 11:17:48 crc kubenswrapper[4766]: I1002 11:17:48.479582 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b869942b-07a4-4a08-b312-2b09cee2abf1","Type":"ContainerStarted","Data":"91dd9bbd80d43b9ac805fd99dc0edc144c30f8299ab2c8710b8b373644193427"} Oct 02 11:17:49 crc kubenswrapper[4766]: I1002 11:17:49.490308 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b869942b-07a4-4a08-b312-2b09cee2abf1","Type":"ContainerStarted","Data":"814f365245b0829017a26fc43722610ee727739807d4d801998edff378def268"} Oct 02 11:17:49 crc kubenswrapper[4766]: I1002 11:17:49.516432 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.516401316 podStartE2EDuration="2.516401316s" podCreationTimestamp="2025-10-02 11:17:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:17:49.507618635 +0000 UTC m=+1584.450489589" watchObservedRunningTime="2025-10-02 11:17:49.516401316 +0000 UTC m=+1584.459272270" Oct 02 11:17:49 crc kubenswrapper[4766]: I1002 11:17:49.794242 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 11:17:50 crc kubenswrapper[4766]: I1002 11:17:50.821534 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 
11:17:50 crc kubenswrapper[4766]: I1002 11:17:50.821839 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 11:17:54 crc kubenswrapper[4766]: I1002 11:17:54.794168 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 11:17:54 crc kubenswrapper[4766]: I1002 11:17:54.825634 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 11:17:55 crc kubenswrapper[4766]: I1002 11:17:55.579023 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 11:17:55 crc kubenswrapper[4766]: I1002 11:17:55.822714 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 11:17:55 crc kubenswrapper[4766]: I1002 11:17:55.822756 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 11:17:56 crc kubenswrapper[4766]: I1002 11:17:56.836665 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bbb13294-05c1-4a20-8265-5144efcd91cf" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:17:56 crc kubenswrapper[4766]: I1002 11:17:56.836694 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bbb13294-05c1-4a20-8265-5144efcd91cf" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:17:57 crc kubenswrapper[4766]: I1002 11:17:57.865816 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:17:57 crc kubenswrapper[4766]: I1002 11:17:57.865869 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 11:17:58 crc kubenswrapper[4766]: I1002 11:17:58.877855 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b869942b-07a4-4a08-b312-2b09cee2abf1" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:17:58 crc kubenswrapper[4766]: I1002 11:17:58.880265 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b869942b-07a4-4a08-b312-2b09cee2abf1" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.204:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 02 11:17:59 crc kubenswrapper[4766]: I1002 11:17:59.781720 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 11:18:02 crc kubenswrapper[4766]: I1002 11:18:02.881266 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e"
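The startup-probe failures above are ordinary HTTP client timeouts: the kubelet gave up waiting for response headers from https://10.217.0.203:8775/ and https://10.217.0.204:8774/ while the nova services were still initializing, and the probes flip to started/ready about ten seconds later. The same error is easy to reproduce with a stdlib client; this is a sketch against a local test server, not the kubelet prober, and newer Go versions word the error "context deadline exceeded (...)" where this older log says "net/http: request canceled (...)":

```go
package main

import (
	"fmt"
	"net/http"
	"net/http/httptest"
	"time"
)

func main() {
	// A server that answers more slowly than the client is willing to
	// wait, standing in for a service that is still initializing.
	slow := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		time.Sleep(2 * time.Second)
	}))
	defer slow.Close()

	client := &http.Client{Timeout: 500 * time.Millisecond}
	_, err := client.Get(slow.URL)
	fmt.Println(err)
	// Get "http://127.0.0.1:...": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
}
```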
pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:18:05 crc kubenswrapper[4766]: I1002 11:18:05.826798 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 11:18:05 crc kubenswrapper[4766]: I1002 11:18:05.827722 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 11:18:05 crc kubenswrapper[4766]: I1002 11:18:05.831425 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 11:18:06 crc kubenswrapper[4766]: I1002 11:18:06.651287 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 11:18:07 crc kubenswrapper[4766]: I1002 11:18:07.872674 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 11:18:07 crc kubenswrapper[4766]: I1002 11:18:07.873693 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 11:18:07 crc kubenswrapper[4766]: I1002 11:18:07.874716 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 11:18:07 crc kubenswrapper[4766]: I1002 11:18:07.879950 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 11:18:08 crc kubenswrapper[4766]: I1002 11:18:08.661269 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 11:18:08 crc kubenswrapper[4766]: I1002 11:18:08.668215 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 11:18:13 crc kubenswrapper[4766]: I1002 11:18:13.868600 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j8gq7"] Oct 02 11:18:13 crc kubenswrapper[4766]: I1002 11:18:13.871931 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:13 crc kubenswrapper[4766]: I1002 11:18:13.892416 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j8gq7"] Oct 02 11:18:13 crc kubenswrapper[4766]: I1002 11:18:13.971099 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mkdd\" (UniqueName: \"kubernetes.io/projected/0be46caa-0351-4f60-b16b-a258b9874a6f-kube-api-access-8mkdd\") pod \"community-operators-j8gq7\" (UID: \"0be46caa-0351-4f60-b16b-a258b9874a6f\") " pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:13 crc kubenswrapper[4766]: I1002 11:18:13.971419 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0be46caa-0351-4f60-b16b-a258b9874a6f-utilities\") pod \"community-operators-j8gq7\" (UID: \"0be46caa-0351-4f60-b16b-a258b9874a6f\") " pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:13 crc kubenswrapper[4766]: I1002 11:18:13.971488 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0be46caa-0351-4f60-b16b-a258b9874a6f-catalog-content\") pod \"community-operators-j8gq7\" (UID: \"0be46caa-0351-4f60-b16b-a258b9874a6f\") " pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:14 crc kubenswrapper[4766]: I1002 11:18:14.073053 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mkdd\" (UniqueName: \"kubernetes.io/projected/0be46caa-0351-4f60-b16b-a258b9874a6f-kube-api-access-8mkdd\") pod \"community-operators-j8gq7\" (UID: \"0be46caa-0351-4f60-b16b-a258b9874a6f\") " pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:14 crc kubenswrapper[4766]: I1002 11:18:14.073278 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0be46caa-0351-4f60-b16b-a258b9874a6f-utilities\") pod \"community-operators-j8gq7\" (UID: \"0be46caa-0351-4f60-b16b-a258b9874a6f\") " pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:14 crc kubenswrapper[4766]: I1002 11:18:14.073331 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0be46caa-0351-4f60-b16b-a258b9874a6f-catalog-content\") pod \"community-operators-j8gq7\" (UID: \"0be46caa-0351-4f60-b16b-a258b9874a6f\") " pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:14 crc kubenswrapper[4766]: I1002 11:18:14.073862 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0be46caa-0351-4f60-b16b-a258b9874a6f-utilities\") pod \"community-operators-j8gq7\" (UID: \"0be46caa-0351-4f60-b16b-a258b9874a6f\") " pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:14 crc kubenswrapper[4766]: I1002 11:18:14.073916 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0be46caa-0351-4f60-b16b-a258b9874a6f-catalog-content\") pod \"community-operators-j8gq7\" (UID: \"0be46caa-0351-4f60-b16b-a258b9874a6f\") " pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:14 crc kubenswrapper[4766]: I1002 11:18:14.093200 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8mkdd\" (UniqueName: \"kubernetes.io/projected/0be46caa-0351-4f60-b16b-a258b9874a6f-kube-api-access-8mkdd\") pod \"community-operators-j8gq7\" (UID: \"0be46caa-0351-4f60-b16b-a258b9874a6f\") " pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:14 crc kubenswrapper[4766]: I1002 11:18:14.205921 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:14 crc kubenswrapper[4766]: I1002 11:18:14.697383 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j8gq7"] Oct 02 11:18:14 crc kubenswrapper[4766]: I1002 11:18:14.729309 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j8gq7" event={"ID":"0be46caa-0351-4f60-b16b-a258b9874a6f","Type":"ContainerStarted","Data":"00a0052c7e29165f32eaa5c30c84a415bee37c4d36c746885bcd91e60ff7a3ac"} Oct 02 11:18:14 crc kubenswrapper[4766]: I1002 11:18:14.881585 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:18:14 crc kubenswrapper[4766]: E1002 11:18:14.882130 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:18:15 crc kubenswrapper[4766]: I1002 11:18:15.748744 4766 generic.go:334] "Generic (PLEG): container finished" podID="0be46caa-0351-4f60-b16b-a258b9874a6f" containerID="8b6a6f87bc8fecfe651496c29a84fcb9e8cce1f6c6be0ce4f876d4b4d799b5de" exitCode=0 Oct 02 11:18:15 crc kubenswrapper[4766]: I1002 11:18:15.748812 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j8gq7" event={"ID":"0be46caa-0351-4f60-b16b-a258b9874a6f","Type":"ContainerDied","Data":"8b6a6f87bc8fecfe651496c29a84fcb9e8cce1f6c6be0ce4f876d4b4d799b5de"} Oct 02 11:18:17 crc kubenswrapper[4766]: I1002 11:18:17.768255 4766 generic.go:334] "Generic (PLEG): container finished" podID="0be46caa-0351-4f60-b16b-a258b9874a6f" containerID="38264158d4d2ca7725c3821356e1e5e055e67de3a4b5aef6daa927a1cadd9932" exitCode=0 Oct 02 11:18:17 crc kubenswrapper[4766]: I1002 11:18:17.768323 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j8gq7" event={"ID":"0be46caa-0351-4f60-b16b-a258b9874a6f","Type":"ContainerDied","Data":"38264158d4d2ca7725c3821356e1e5e055e67de3a4b5aef6daa927a1cadd9932"} Oct 02 11:18:18 crc kubenswrapper[4766]: I1002 11:18:18.781195 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j8gq7" event={"ID":"0be46caa-0351-4f60-b16b-a258b9874a6f","Type":"ContainerStarted","Data":"5116c39cd5e1f3749deb2a96cecbb5c060c4b374035872069ea7256e1fb94c1e"} Oct 02 11:18:18 crc kubenswrapper[4766]: I1002 11:18:18.811656 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j8gq7" podStartSLOduration=3.39823203 podStartE2EDuration="5.811639935s" podCreationTimestamp="2025-10-02 11:18:13 +0000 UTC" firstStartedPulling="2025-10-02 11:18:15.752449609 +0000 UTC m=+1610.695320553" 
lastFinishedPulling="2025-10-02 11:18:18.165857514 +0000 UTC m=+1613.108728458" observedRunningTime="2025-10-02 11:18:18.805513809 +0000 UTC m=+1613.748384763" watchObservedRunningTime="2025-10-02 11:18:18.811639935 +0000 UTC m=+1613.754510879" Oct 02 11:18:24 crc kubenswrapper[4766]: I1002 11:18:24.207419 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:24 crc kubenswrapper[4766]: I1002 11:18:24.207987 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:24 crc kubenswrapper[4766]: I1002 11:18:24.254604 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:24 crc kubenswrapper[4766]: I1002 11:18:24.869714 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:24 crc kubenswrapper[4766]: I1002 11:18:24.914405 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j8gq7"] Oct 02 11:18:26 crc kubenswrapper[4766]: I1002 11:18:26.299143 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 02 11:18:26 crc kubenswrapper[4766]: I1002 11:18:26.300039 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="d896308d-0b8a-4cfc-ad92-311521c2e417" containerName="openstackclient" containerID="cri-o://3f59b786bb34a976dbad318bdbfed8db8ed47518b0fa788de2b6cb253622cf0d" gracePeriod=2 Oct 02 11:18:26 crc kubenswrapper[4766]: I1002 11:18:26.317437 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 02 11:18:26 crc kubenswrapper[4766]: I1002 11:18:26.789619 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder891f-account-delete-2hm8h"] Oct 02 11:18:26 crc kubenswrapper[4766]: E1002 11:18:26.790110 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d896308d-0b8a-4cfc-ad92-311521c2e417" containerName="openstackclient" Oct 02 11:18:26 crc kubenswrapper[4766]: I1002 11:18:26.790147 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d896308d-0b8a-4cfc-ad92-311521c2e417" containerName="openstackclient" Oct 02 11:18:26 crc kubenswrapper[4766]: I1002 11:18:26.790372 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d896308d-0b8a-4cfc-ad92-311521c2e417" containerName="openstackclient" Oct 02 11:18:26 crc kubenswrapper[4766]: I1002 11:18:26.797886 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder891f-account-delete-2hm8h" Oct 02 11:18:26 crc kubenswrapper[4766]: I1002 11:18:26.813697 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder891f-account-delete-2hm8h"] Oct 02 11:18:26 crc kubenswrapper[4766]: I1002 11:18:26.851715 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j8gq7" podUID="0be46caa-0351-4f60-b16b-a258b9874a6f" containerName="registry-server" containerID="cri-o://5116c39cd5e1f3749deb2a96cecbb5c060c4b374035872069ea7256e1fb94c1e" gracePeriod=2 Oct 02 11:18:26 crc kubenswrapper[4766]: I1002 11:18:26.860587 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:18:26 crc kubenswrapper[4766]: I1002 11:18:26.885728 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-s5j64"] Oct 02 11:18:26 crc kubenswrapper[4766]: I1002 11:18:26.895137 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-s5j64"] Oct 02 11:18:26 crc kubenswrapper[4766]: I1002 11:18:26.912839 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-tmhfd"] Oct 02 11:18:26 crc kubenswrapper[4766]: I1002 11:18:26.913049 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-tmhfd" podUID="95842554-1651-4c34-b934-d4eb21c6c52d" containerName="openstack-network-exporter" containerID="cri-o://85d92d1d1bf439fc0a8bfe76963371dba146b935670f36dd525197b9b9036410" gracePeriod=30 Oct 02 11:18:26 crc kubenswrapper[4766]: I1002 11:18:26.951722 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xd7k\" (UniqueName: \"kubernetes.io/projected/3fac2bf1-fb0f-4031-bfc8-34090cc90c8a-kube-api-access-2xd7k\") pod \"cinder891f-account-delete-2hm8h\" (UID: \"3fac2bf1-fb0f-4031-bfc8-34090cc90c8a\") " pod="openstack/cinder891f-account-delete-2hm8h" Oct 02 11:18:26 crc kubenswrapper[4766]: I1002 11:18:26.952098 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dp6x5"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.018177 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-8wzw9"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.061016 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xd7k\" (UniqueName: \"kubernetes.io/projected/3fac2bf1-fb0f-4031-bfc8-34090cc90c8a-kube-api-access-2xd7k\") pod \"cinder891f-account-delete-2hm8h\" (UID: \"3fac2bf1-fb0f-4031-bfc8-34090cc90c8a\") " pod="openstack/cinder891f-account-delete-2hm8h" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.093786 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placementd1c9-account-delete-j6zwl"] Oct 02 11:18:27 crc kubenswrapper[4766]: E1002 11:18:27.096732 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 02 11:18:27 crc kubenswrapper[4766]: E1002 11:18:27.100763 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-config-data podName:874d062e-d2f8-462c-95b3-8f630b7120af nodeName:}" failed. No retries permitted until 2025-10-02 11:18:27.600729526 +0000 UTC m=+1622.543600470 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-config-data") pod "rabbitmq-cell1-server-0" (UID: "874d062e-d2f8-462c-95b3-8f630b7120af") : configmap "rabbitmq-cell1-config-data" not found Oct 02 11:18:27 crc kubenswrapper[4766]: E1002 11:18:27.097396 4766 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 02 11:18:27 crc kubenswrapper[4766]: E1002 11:18:27.101023 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data podName:d9339929-4331-4cd9-89bc-8350ef2f55f5 nodeName:}" failed. No retries permitted until 2025-10-02 11:18:27.601014935 +0000 UTC m=+1622.543885879 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data") pod "barbican-keystone-listener-664d98ccd8-hh5xk" (UID: "d9339929-4331-4cd9-89bc-8350ef2f55f5") : secret "barbican-config-data" not found Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.115820 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placementd1c9-account-delete-j6zwl" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.147255 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placementd1c9-account-delete-j6zwl"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.150669 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xd7k\" (UniqueName: \"kubernetes.io/projected/3fac2bf1-fb0f-4031-bfc8-34090cc90c8a-kube-api-access-2xd7k\") pod \"cinder891f-account-delete-2hm8h\" (UID: \"3fac2bf1-fb0f-4031-bfc8-34090cc90c8a\") " pod="openstack/cinder891f-account-delete-2hm8h" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.175576 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gbg65"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.176207 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65" podUID="9fdf37c9-9a32-4103-8418-198d45d14415" containerName="dnsmasq-dns" containerID="cri-o://a159f561aef681dec7a32ee79dcabb81731c25f79c9eafcbda290d83ab7ba093" gracePeriod=10 Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.200741 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g86jj\" (UniqueName: \"kubernetes.io/projected/52bdcfda-75b3-450b-9db4-1a443be18fa3-kube-api-access-g86jj\") pod \"placementd1c9-account-delete-j6zwl\" (UID: \"52bdcfda-75b3-450b-9db4-1a443be18fa3\") " pod="openstack/placementd1c9-account-delete-j6zwl" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.240522 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.240874 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="7498a37c-33a3-4a3a-9c72-64a0c533282c" containerName="ovn-northd" containerID="cri-o://84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca" gracePeriod=30
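The nestedpendingoperations failures above show the volume manager's retry clock doubling: the first miss on the deleted rabbitmq-cell1-config-data configmap and barbican-config-data secret schedules a retry after 500ms, and the same operations fail again below with durationBeforeRetry 1s. A minimal sketch of that exponential retry; only the 500ms to 1s progression is from the log, while the doubling policy and the cap are assumptions:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed volume-manager style retry: the wait before the next
	// MountVolume.SetUp attempt doubles on each consecutive failure
	// (500ms above, 1s on the next attempt below) until the missing
	// secret/configmap appears and SetUp succeeds.
	const maxBackoff = 2 * time.Minute // illustrative cap, not from the log
	backoff := 500 * time.Millisecond
	for attempt := 1; attempt <= 6; attempt++ {
		fmt.Printf("attempt %d failed: next retry in %v\n", attempt, backoff)
		backoff *= 2
		if backoff > maxBackoff {
			backoff = maxBackoff
		}
	}
}
```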
containerName="openstack-network-exporter" containerID="cri-o://73791752d4b908acb0c2c84f95d0aaba2c3fbd09a79daaa50e9a88de0dd2d032" gracePeriod=30 Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.297989 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance0d4d-account-delete-7cb92"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.299667 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance0d4d-account-delete-7cb92" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.302416 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g86jj\" (UniqueName: \"kubernetes.io/projected/52bdcfda-75b3-450b-9db4-1a443be18fa3-kube-api-access-g86jj\") pod \"placementd1c9-account-delete-j6zwl\" (UID: \"52bdcfda-75b3-450b-9db4-1a443be18fa3\") " pod="openstack/placementd1c9-account-delete-j6zwl" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.311756 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-wjzg5"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.335617 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g86jj\" (UniqueName: \"kubernetes.io/projected/52bdcfda-75b3-450b-9db4-1a443be18fa3-kube-api-access-g86jj\") pod \"placementd1c9-account-delete-j6zwl\" (UID: \"52bdcfda-75b3-450b-9db4-1a443be18fa3\") " pod="openstack/placementd1c9-account-delete-j6zwl" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.348771 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-wjzg5"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.383088 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance0d4d-account-delete-7cb92"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.392067 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-xc8gb"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.404839 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhfrf\" (UniqueName: \"kubernetes.io/projected/3fd6760c-af87-4e2a-adcd-5fe3ca636fef-kube-api-access-mhfrf\") pod \"glance0d4d-account-delete-7cb92\" (UID: \"3fd6760c-af87-4e2a-adcd-5fe3ca636fef\") " pod="openstack/glance0d4d-account-delete-7cb92" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.432052 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder891f-account-delete-2hm8h" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.436004 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-xc8gb"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.452053 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbicance01-account-delete-jrx2r"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.453578 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbicance01-account-delete-jrx2r" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.474911 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.475623 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" containerName="cinder-scheduler" containerID="cri-o://e5e102f20a63a8d722e965baa46260978733d5085519f5210483d94b2bb40281" gracePeriod=30 Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.475718 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" containerName="probe" containerID="cri-o://4fa69755c8413dc8a6865886029bf8ec092fa254162c708d148185b7305c545c" gracePeriod=30 Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.511697 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhfrf\" (UniqueName: \"kubernetes.io/projected/3fd6760c-af87-4e2a-adcd-5fe3ca636fef-kube-api-access-mhfrf\") pod \"glance0d4d-account-delete-7cb92\" (UID: \"3fd6760c-af87-4e2a-adcd-5fe3ca636fef\") " pod="openstack/glance0d4d-account-delete-7cb92" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.550002 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbicance01-account-delete-jrx2r"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.563148 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhfrf\" (UniqueName: \"kubernetes.io/projected/3fd6760c-af87-4e2a-adcd-5fe3ca636fef-kube-api-access-mhfrf\") pod \"glance0d4d-account-delete-7cb92\" (UID: \"3fd6760c-af87-4e2a-adcd-5fe3ca636fef\") " pod="openstack/glance0d4d-account-delete-7cb92" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.569188 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementd1c9-account-delete-j6zwl" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.598712 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.599254 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d4d0079a-03e3-4e5f-81a2-81f5bceb795c" containerName="cinder-api-log" containerID="cri-o://11c17e3e9a13cd8d34cb7338544e7b564639e6d7cc5158ff63319715efa5d8b5" gracePeriod=30 Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.599850 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d4d0079a-03e3-4e5f-81a2-81f5bceb795c" containerName="cinder-api" containerID="cri-o://a12103d025e03ad958180a996161105d2c14a35f5ae1e7d1aabe48aab8387391" gracePeriod=30 Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.618009 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmmgg\" (UniqueName: \"kubernetes.io/projected/1a739206-d877-4212-9242-47a59c440b40-kube-api-access-xmmgg\") pod \"barbicance01-account-delete-jrx2r\" (UID: \"1a739206-d877-4212-9242-47a59c440b40\") " pod="openstack/barbicance01-account-delete-jrx2r" Oct 02 11:18:27 crc kubenswrapper[4766]: E1002 11:18:27.618725 4766 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 02 11:18:27 crc kubenswrapper[4766]: E1002 11:18:27.618768 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data podName:d9339929-4331-4cd9-89bc-8350ef2f55f5 nodeName:}" failed. No retries permitted until 2025-10-02 11:18:28.618754721 +0000 UTC m=+1623.561625655 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data") pod "barbican-keystone-listener-664d98ccd8-hh5xk" (UID: "d9339929-4331-4cd9-89bc-8350ef2f55f5") : secret "barbican-config-data" not found Oct 02 11:18:27 crc kubenswrapper[4766]: E1002 11:18:27.618821 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 02 11:18:27 crc kubenswrapper[4766]: E1002 11:18:27.618838 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-config-data podName:874d062e-d2f8-462c-95b3-8f630b7120af nodeName:}" failed. No retries permitted until 2025-10-02 11:18:28.618832903 +0000 UTC m=+1623.561703847 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-config-data") pod "rabbitmq-cell1-server-0" (UID: "874d062e-d2f8-462c-95b3-8f630b7120af") : configmap "rabbitmq-cell1-config-data" not found Oct 02 11:18:27 crc kubenswrapper[4766]: E1002 11:18:27.654142 4766 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-dp6x5" message=< Oct 02 11:18:27 crc kubenswrapper[4766]: Exiting ovn-controller (1) [ OK ] Oct 02 11:18:27 crc kubenswrapper[4766]: > Oct 02 11:18:27 crc kubenswrapper[4766]: E1002 11:18:27.654317 4766 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-dp6x5" podUID="9860354f-7494-4b02-bca3-adc731683f7f" containerName="ovn-controller" containerID="cri-o://ff40e39793793465f75fb22c204001262aaca34bc6d55d604cabba72ae0c9eb7" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.654410 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-dp6x5" podUID="9860354f-7494-4b02-bca3-adc731683f7f" containerName="ovn-controller" containerID="cri-o://ff40e39793793465f75fb22c204001262aaca34bc6d55d604cabba72ae0c9eb7" gracePeriod=30 Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.683579 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell16336-account-delete-8q58s"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.688828 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell16336-account-delete-8q58s" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.696001 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65" podUID="9fdf37c9-9a32-4103-8418-198d45d14415" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.197:5353: connect: connection refused" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.710714 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell16336-account-delete-8q58s"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.720201 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmmgg\" (UniqueName: \"kubernetes.io/projected/1a739206-d877-4212-9242-47a59c440b40-kube-api-access-xmmgg\") pod \"barbicance01-account-delete-jrx2r\" (UID: \"1a739206-d877-4212-9242-47a59c440b40\") " pod="openstack/barbicance01-account-delete-jrx2r" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.742477 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.787073 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-k4x48"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.788336 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmmgg\" (UniqueName: \"kubernetes.io/projected/1a739206-d877-4212-9242-47a59c440b40-kube-api-access-xmmgg\") pod \"barbicance01-account-delete-jrx2r\" (UID: \"1a739206-d877-4212-9242-47a59c440b40\") " pod="openstack/barbicance01-account-delete-jrx2r" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.817543 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance0d4d-account-delete-7cb92" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.821578 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-k4x48"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.831863 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2chb\" (UniqueName: \"kubernetes.io/projected/3ae83487-5f24-4934-aba5-9ee2ca6ca657-kube-api-access-h2chb\") pod \"novacell16336-account-delete-8q58s\" (UID: \"3ae83487-5f24-4934-aba5-9ee2ca6ca657\") " pod="openstack/novacell16336-account-delete-8q58s" Oct 02 11:18:27 crc kubenswrapper[4766]: E1002 11:18:27.893655 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 02 11:18:27 crc kubenswrapper[4766]: E1002 11:18:27.900076 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.926281 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbicance01-account-delete-jrx2r" Oct 02 11:18:27 crc kubenswrapper[4766]: E1002 11:18:27.928221 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 02 11:18:27 crc kubenswrapper[4766]: E1002 11:18:27.928282 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="7498a37c-33a3-4a3a-9c72-64a0c533282c" containerName="ovn-northd" Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.928448 4766 generic.go:334] "Generic (PLEG): container finished" podID="0be46caa-0351-4f60-b16b-a258b9874a6f" containerID="5116c39cd5e1f3749deb2a96cecbb5c060c4b374035872069ea7256e1fb94c1e" exitCode=0 Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.946448 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2chb\" (UniqueName: \"kubernetes.io/projected/3ae83487-5f24-4934-aba5-9ee2ca6ca657-kube-api-access-h2chb\") pod \"novacell16336-account-delete-8q58s\" (UID: \"3ae83487-5f24-4934-aba5-9ee2ca6ca657\") " pod="openstack/novacell16336-account-delete-8q58s" Oct 02 11:18:27 crc kubenswrapper[4766]: E1002 11:18:27.948706 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 02 11:18:27 crc kubenswrapper[4766]: E1002 11:18:27.948762 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data podName:1282b506-728d-4c6f-aa9c-3d3c1f826b71 nodeName:}" failed. No retries permitted until 2025-10-02 11:18:28.448741019 +0000 UTC m=+1623.391611963 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data") pod "rabbitmq-server-0" (UID: "1282b506-728d-4c6f-aa9c-3d3c1f826b71") : configmap "rabbitmq-config-data" not found Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.989794 4766 generic.go:334] "Generic (PLEG): container finished" podID="9fdf37c9-9a32-4103-8418-198d45d14415" containerID="a159f561aef681dec7a32ee79dcabb81731c25f79c9eafcbda290d83ab7ba093" exitCode=0 Oct 02 11:18:27 crc kubenswrapper[4766]: I1002 11:18:27.994783 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2chb\" (UniqueName: \"kubernetes.io/projected/3ae83487-5f24-4934-aba5-9ee2ca6ca657-kube-api-access-h2chb\") pod \"novacell16336-account-delete-8q58s\" (UID: \"3ae83487-5f24-4934-aba5-9ee2ca6ca657\") " pod="openstack/novacell16336-account-delete-8q58s" Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.021224 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-tmhfd_95842554-1651-4c34-b934-d4eb21c6c52d/openstack-network-exporter/0.log" Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.021289 4766 generic.go:334] "Generic (PLEG): container finished" podID="95842554-1651-4c34-b934-d4eb21c6c52d" containerID="85d92d1d1bf439fc0a8bfe76963371dba146b935670f36dd525197b9b9036410" exitCode=2 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.103193 4766 generic.go:334] "Generic (PLEG): container finished" podID="7498a37c-33a3-4a3a-9c72-64a0c533282c" containerID="73791752d4b908acb0c2c84f95d0aaba2c3fbd09a79daaa50e9a88de0dd2d032" exitCode=2 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.136786 4766 generic.go:334] "Generic (PLEG): container finished" podID="d4d0079a-03e3-4e5f-81a2-81f5bceb795c" containerID="11c17e3e9a13cd8d34cb7338544e7b564639e6d7cc5158ff63319715efa5d8b5" exitCode=143 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.175492 4766 generic.go:334] "Generic (PLEG): container finished" podID="9860354f-7494-4b02-bca3-adc731683f7f" containerID="ff40e39793793465f75fb22c204001262aaca34bc6d55d604cabba72ae0c9eb7" exitCode=0 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.320409 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-8wzw9" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovs-vswitchd" containerID="cri-o://e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" gracePeriod=29 Oct 02 11:18:28 crc kubenswrapper[4766]: E1002 11:18:28.341596 4766 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Oct 02 11:18:28 crc kubenswrapper[4766]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 02 11:18:28 crc kubenswrapper[4766]: + source /usr/local/bin/container-scripts/functions Oct 02 11:18:28 crc kubenswrapper[4766]: ++ OVNBridge=br-int Oct 02 11:18:28 crc kubenswrapper[4766]: ++ OVNRemote=tcp:localhost:6642 Oct 02 11:18:28 crc kubenswrapper[4766]: ++ OVNEncapType=geneve Oct 02 11:18:28 crc kubenswrapper[4766]: ++ OVNAvailabilityZones= Oct 02 11:18:28 crc kubenswrapper[4766]: ++ EnableChassisAsGateway=true Oct 02 11:18:28 crc kubenswrapper[4766]: ++ PhysicalNetworks= Oct 02 11:18:28 crc kubenswrapper[4766]: ++ OVNHostName= Oct 02 11:18:28 crc kubenswrapper[4766]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 02 11:18:28 crc kubenswrapper[4766]: ++ 
ovs_dir=/var/lib/openvswitch Oct 02 11:18:28 crc kubenswrapper[4766]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 02 11:18:28 crc kubenswrapper[4766]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 02 11:18:28 crc kubenswrapper[4766]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 11:18:28 crc kubenswrapper[4766]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:18:28 crc kubenswrapper[4766]: + sleep 0.5 Oct 02 11:18:28 crc kubenswrapper[4766]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:18:28 crc kubenswrapper[4766]: + sleep 0.5 Oct 02 11:18:28 crc kubenswrapper[4766]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:18:28 crc kubenswrapper[4766]: + cleanup_ovsdb_server_semaphore Oct 02 11:18:28 crc kubenswrapper[4766]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 11:18:28 crc kubenswrapper[4766]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 02 11:18:28 crc kubenswrapper[4766]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-8wzw9" message=< Oct 02 11:18:28 crc kubenswrapper[4766]: Exiting ovsdb-server (5) [ OK ] Oct 02 11:18:28 crc kubenswrapper[4766]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 02 11:18:28 crc kubenswrapper[4766]: + source /usr/local/bin/container-scripts/functions Oct 02 11:18:28 crc kubenswrapper[4766]: ++ OVNBridge=br-int Oct 02 11:18:28 crc kubenswrapper[4766]: ++ OVNRemote=tcp:localhost:6642 Oct 02 11:18:28 crc kubenswrapper[4766]: ++ OVNEncapType=geneve Oct 02 11:18:28 crc kubenswrapper[4766]: ++ OVNAvailabilityZones= Oct 02 11:18:28 crc kubenswrapper[4766]: ++ EnableChassisAsGateway=true Oct 02 11:18:28 crc kubenswrapper[4766]: ++ PhysicalNetworks= Oct 02 11:18:28 crc kubenswrapper[4766]: ++ OVNHostName= Oct 02 11:18:28 crc kubenswrapper[4766]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 02 11:18:28 crc kubenswrapper[4766]: ++ ovs_dir=/var/lib/openvswitch Oct 02 11:18:28 crc kubenswrapper[4766]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 02 11:18:28 crc kubenswrapper[4766]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 02 11:18:28 crc kubenswrapper[4766]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 11:18:28 crc kubenswrapper[4766]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:18:28 crc kubenswrapper[4766]: + sleep 0.5 Oct 02 11:18:28 crc kubenswrapper[4766]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:18:28 crc kubenswrapper[4766]: + sleep 0.5 Oct 02 11:18:28 crc kubenswrapper[4766]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:18:28 crc kubenswrapper[4766]: + cleanup_ovsdb_server_semaphore Oct 02 11:18:28 crc kubenswrapper[4766]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 11:18:28 crc kubenswrapper[4766]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 02 11:18:28 crc kubenswrapper[4766]: > Oct 02 11:18:28 crc kubenswrapper[4766]: E1002 11:18:28.341629 4766 kuberuntime_container.go:691] "PreStop hook failed" err=< Oct 02 11:18:28 crc kubenswrapper[4766]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Oct 02 11:18:28 crc kubenswrapper[4766]: + source /usr/local/bin/container-scripts/functions Oct 02 11:18:28 crc kubenswrapper[4766]: ++ OVNBridge=br-int Oct 02 11:18:28 crc kubenswrapper[4766]: ++ OVNRemote=tcp:localhost:6642 Oct 02 11:18:28 crc kubenswrapper[4766]: ++ OVNEncapType=geneve Oct 02 11:18:28 crc kubenswrapper[4766]: ++ OVNAvailabilityZones= Oct 02 11:18:28 crc kubenswrapper[4766]: ++ EnableChassisAsGateway=true Oct 02 11:18:28 crc kubenswrapper[4766]: ++ PhysicalNetworks= Oct 02 11:18:28 crc kubenswrapper[4766]: ++ OVNHostName= Oct 02 11:18:28 crc kubenswrapper[4766]: ++ DB_FILE=/etc/openvswitch/conf.db Oct 02 11:18:28 crc kubenswrapper[4766]: ++ ovs_dir=/var/lib/openvswitch Oct 02 11:18:28 crc kubenswrapper[4766]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Oct 02 11:18:28 crc kubenswrapper[4766]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Oct 02 11:18:28 crc kubenswrapper[4766]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 11:18:28 crc kubenswrapper[4766]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:18:28 crc kubenswrapper[4766]: + sleep 0.5 Oct 02 11:18:28 crc kubenswrapper[4766]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:18:28 crc kubenswrapper[4766]: + sleep 0.5 Oct 02 11:18:28 crc kubenswrapper[4766]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Oct 02 11:18:28 crc kubenswrapper[4766]: + cleanup_ovsdb_server_semaphore Oct 02 11:18:28 crc kubenswrapper[4766]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Oct 02 11:18:28 crc kubenswrapper[4766]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Oct 02 11:18:28 crc kubenswrapper[4766]: > pod="openstack/ovn-controller-ovs-8wzw9" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovsdb-server" containerID="cri-o://3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.341661 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-8wzw9" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovsdb-server" containerID="cri-o://3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" gracePeriod=29 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.497400 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12786f1e-db55-4668-8e43-afa080dc0fa2" path="/var/lib/kubelet/pods/12786f1e-db55-4668-8e43-afa080dc0fa2/volumes" Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.508091 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a084ae6-94ba-4057-adcf-5d3d9b92c9ae" path="/var/lib/kubelet/pods/1a084ae6-94ba-4057-adcf-5d3d9b92c9ae/volumes" Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.508870 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c61ea8c-4cfc-4f0d-97eb-d33c62117db2" path="/var/lib/kubelet/pods/3c61ea8c-4cfc-4f0d-97eb-d33c62117db2/volumes" Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.510301 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fea98489-bbfa-4490-9e89-40a19bfb594f" path="/var/lib/kubelet/pods/fea98489-bbfa-4490-9e89-40a19bfb594f/volumes" Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.510887 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.510925 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapibccc-account-delete-n7pf6"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513178 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j8gq7" event={"ID":"0be46caa-0351-4f60-b16b-a258b9874a6f","Type":"ContainerDied","Data":"5116c39cd5e1f3749deb2a96cecbb5c060c4b374035872069ea7256e1fb94c1e"} Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513217 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65" event={"ID":"9fdf37c9-9a32-4103-8418-198d45d14415","Type":"ContainerDied","Data":"a159f561aef681dec7a32ee79dcabb81731c25f79c9eafcbda290d83ab7ba093"} Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513237 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapibccc-account-delete-n7pf6"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513251 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-ztpl6"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513264 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-tmhfd" event={"ID":"95842554-1651-4c34-b934-d4eb21c6c52d","Type":"ContainerDied","Data":"85d92d1d1bf439fc0a8bfe76963371dba146b935670f36dd525197b9b9036410"} Oct 02 11:18:28 crc 
kubenswrapper[4766]: I1002 11:18:28.513277 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7498a37c-33a3-4a3a-9c72-64a0c533282c","Type":"ContainerDied","Data":"73791752d4b908acb0c2c84f95d0aaba2c3fbd09a79daaa50e9a88de0dd2d032"} Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513290 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4d0079a-03e3-4e5f-81a2-81f5bceb795c","Type":"ContainerDied","Data":"11c17e3e9a13cd8d34cb7338544e7b564639e6d7cc5158ff63319715efa5d8b5"} Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513303 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dp6x5" event={"ID":"9860354f-7494-4b02-bca3-adc731683f7f","Type":"ContainerDied","Data":"ff40e39793793465f75fb22c204001262aaca34bc6d55d604cabba72ae0c9eb7"} Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513321 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-ztpl6"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513339 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513356 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-rhczm"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513365 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-rhczm"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513375 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-84bf49766d-bbf2p"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513386 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513397 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-nbrf7"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513407 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-nbrf7"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513416 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-55f8b9d7c-hfdcr"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513426 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-qxksg"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513435 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-qxksg"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513444 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-2jv7w"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513454 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-2jv7w"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.513657 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55f8b9d7c-hfdcr" podUID="ec8cdac7-81c9-41e7-a956-41d13e5b91a6" containerName="neutron-api" containerID="cri-o://b88eb59f2ed5fce10e26153dd82767f0f8e62d65f3120c04ecdf968f4b718053" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.514044 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapibccc-account-delete-n7pf6" Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.514420 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-84bf49766d-bbf2p" podUID="7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" containerName="placement-log" containerID="cri-o://3d18a6249adb9873ab43d08de93aedc55c827c33e60c6d8d86226fc15f98872a" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.514865 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="account-server" containerID="cri-o://2e00b32e05f94d6db2ce326a131639f65e69eaa8e091e4ac19058f4bc69304fa" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.515441 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="swift-recon-cron" containerID="cri-o://48f8731a29f544c073845eb8fcd06b0efc46da3e9e5d54fb23e339018591d7f7" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.515649 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55f8b9d7c-hfdcr" podUID="ec8cdac7-81c9-41e7-a956-41d13e5b91a6" containerName="neutron-httpd" containerID="cri-o://f5124828eb5c9f7610af6224039bb25da1c36b7f880181f818d2c8a8eefcf481" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.516003 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="2be5e935-0d64-4fed-a00a-bd0cb5891e75" containerName="openstack-network-exporter" containerID="cri-o://763fc40370b8c637b94e717b3d4022b399a9064357843aa69085ce7652a30954" gracePeriod=300 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.516291 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="rsync" containerID="cri-o://d4d2f3213d0d8347ad4fc0fda1819961e3e48ca84f6a1a08530c3e457a9145ee" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.516338 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-expirer" containerID="cri-o://1b4624fea8b62d613092718192f8a9e9faa6138904a93866528c86735b20b493" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.516370 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-updater" containerID="cri-o://018613b4ace3495529050cc51bbcbc25c762db064d56bf3f3b2fa7c0ac1213cf" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.516401 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-auditor" containerID="cri-o://711bf815d45f9226c3bfc08294214785c60a9b9bd2e213b6e5c41884b2a87710" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.516429 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-replicator" 
containerID="cri-o://2f0ac5d9172eee436579154fd4936b18259605ce7d7deaad27a10a2c5cdbf7f4" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.516457 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-server" containerID="cri-o://37549a28195c14e158d5a907b7e78a1e6d69aa6e6efd13ac3ffc312e52ea12e6" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.516483 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="container-updater" containerID="cri-o://2d24232fdec7040ef6a5964e5689deb80c9416e2b9a5928b6c44a05db6a24a58" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.516542 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="container-auditor" containerID="cri-o://3647b01248aaadded5263e23d36e7d1f488a1f32c6f5c9c8f0363cd9c896464c" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.516591 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="container-replicator" containerID="cri-o://a6e14799401d44ada3dfee676cdeb9c70cb0d90eacc2dca91f8d1079c24c183c" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.516628 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="container-server" containerID="cri-o://9da3dd83501b5003165c1f29947b2365058f97cf1d114c7cef65d3123aa7bf9a" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.516666 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="account-reaper" containerID="cri-o://33cb37528efacc79ce75b3a9c57737f3259f3f0c3402a200d781a3fa066c92aa" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.516697 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="account-auditor" containerID="cri-o://da7cf8c62c28a42c7de4196b70da74f94b7baed423d08b118379806f1e593aa6" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.516726 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="account-replicator" containerID="cri-o://3df0bb832b8ac1eab32d7e7a4d74b6f6630e1a9104832fae36b218c6eafdc058" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.516821 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="6b90dab1-a183-4adc-b415-b67bd0d782f7" containerName="openstack-network-exporter" containerID="cri-o://f7d477d76ec53de2f17dbd54641073519838399c06bbcb97d465a22ac565351a" gracePeriod=300 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.516858 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-84bf49766d-bbf2p" podUID="7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" containerName="placement-api" 
containerID="cri-o://0b8dae08b6fef80dba408e5df15861c9a4ec087115f964d50b2c13d7ce34c9a4" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: E1002 11:18:28.536282 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 02 11:18:28 crc kubenswrapper[4766]: E1002 11:18:28.536674 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data podName:1282b506-728d-4c6f-aa9c-3d3c1f826b71 nodeName:}" failed. No retries permitted until 2025-10-02 11:18:29.536653774 +0000 UTC m=+1624.479524718 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data") pod "rabbitmq-server-0" (UID: "1282b506-728d-4c6f-aa9c-3d3c1f826b71") : configmap "rabbitmq-config-data" not found Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.601728 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-891f-account-create-5stxg"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.615245 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-891f-account-create-5stxg"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.637807 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kx85\" (UniqueName: \"kubernetes.io/projected/6ea7203d-5727-485f-8a6a-5bde96d05078-kube-api-access-6kx85\") pod \"novaapibccc-account-delete-n7pf6\" (UID: \"6ea7203d-5727-485f-8a6a-5bde96d05078\") " pod="openstack/novaapibccc-account-delete-n7pf6" Oct 02 11:18:28 crc kubenswrapper[4766]: E1002 11:18:28.637959 4766 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 02 11:18:28 crc kubenswrapper[4766]: E1002 11:18:28.637999 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data podName:d9339929-4331-4cd9-89bc-8350ef2f55f5 nodeName:}" failed. No retries permitted until 2025-10-02 11:18:30.637986022 +0000 UTC m=+1625.580856966 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data") pod "barbican-keystone-listener-664d98ccd8-hh5xk" (UID: "d9339929-4331-4cd9-89bc-8350ef2f55f5") : secret "barbican-config-data" not found Oct 02 11:18:28 crc kubenswrapper[4766]: E1002 11:18:28.638030 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 02 11:18:28 crc kubenswrapper[4766]: E1002 11:18:28.638048 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-config-data podName:874d062e-d2f8-462c-95b3-8f630b7120af nodeName:}" failed. No retries permitted until 2025-10-02 11:18:30.638042564 +0000 UTC m=+1625.580913508 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-config-data") pod "rabbitmq-cell1-server-0" (UID: "874d062e-d2f8-462c-95b3-8f630b7120af") : configmap "rabbitmq-cell1-config-data" not found Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.638064 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder891f-account-delete-2hm8h"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.668303 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.668835 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c3384223-2ad0-4593-976c-54c2d3cce52e" containerName="glance-httpd" containerID="cri-o://fb1c0454ba668b962552208833c616de4db07019cb885d1cc5bdc0cc294a91b5" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.669027 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c3384223-2ad0-4593-976c-54c2d3cce52e" containerName="glance-log" containerID="cri-o://f56abd7e1794b77ea62792a0bd79f484e63160d0da2a2b9689b20ab708f80f3a" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.686144 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.686411 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" containerName="glance-log" containerID="cri-o://1e54d0211edb5bd37091707d979fb377b49c8ac9d64e30887c84f1c90a8d9682" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.686793 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" containerName="glance-httpd" containerID="cri-o://4a4f512030ce39ec49c91ab00c8b625423cbf01429bd4a06c2d5b12c2bf0ca55" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.739321 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kx85\" (UniqueName: \"kubernetes.io/projected/6ea7203d-5727-485f-8a6a-5bde96d05078-kube-api-access-6kx85\") pod \"novaapibccc-account-delete-n7pf6\" (UID: \"6ea7203d-5727-485f-8a6a-5bde96d05078\") " pod="openstack/novaapibccc-account-delete-n7pf6" Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.739675 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-5rwpr"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.746636 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="2be5e935-0d64-4fed-a00a-bd0cb5891e75" containerName="ovsdbserver-sb" containerID="cri-o://2b993a56ebc1b37c4f1f1ec65f43687a9297e994b331c1c27c7eef4dd4bf4c33" gracePeriod=300 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.789036 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-5rwpr"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.792383 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kx85\" (UniqueName: \"kubernetes.io/projected/6ea7203d-5727-485f-8a6a-5bde96d05078-kube-api-access-6kx85\") 
pod \"novaapibccc-account-delete-n7pf6\" (UID: \"6ea7203d-5727-485f-8a6a-5bde96d05078\") " pod="openstack/novaapibccc-account-delete-n7pf6" Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.814361 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="6b90dab1-a183-4adc-b415-b67bd0d782f7" containerName="ovsdbserver-nb" containerID="cri-o://a138202f6e7dfaa68f96f9c8f7a8329c46aee3fb87a8289a0c8e738c5d41167a" gracePeriod=300 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.821916 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c7d8-account-create-mmkqm"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.836647 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c7d8-account-create-mmkqm"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.840139 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-4lfj7"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.846618 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-4lfj7"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.852252 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d1c9-account-create-xdktv"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.859395 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d1c9-account-create-xdktv"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.865296 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicance01-account-delete-jrx2r"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.878032 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementd1c9-account-delete-j6zwl"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.886660 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-0d4d-account-create-thx8j"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.891487 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-wsj29"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.905360 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance0d4d-account-delete-7cb92"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.917729 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-r8tj5"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.927957 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-ce01-account-create-wr9nd"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.933032 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-wsj29"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.939144 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-0d4d-account-create-thx8j"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.946576 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-r8tj5"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.954138 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-ce01-account-create-wr9nd"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.965868 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.966179 4766 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bbb13294-05c1-4a20-8265-5144efcd91cf" containerName="nova-metadata-log" containerID="cri-o://bd5acfd9f8b6e410882799f1e42f4fb64c5bdc13db56d36a3ca2d32eb57ddec0" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.966373 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bbb13294-05c1-4a20-8265-5144efcd91cf" containerName="nova-metadata-metadata" containerID="cri-o://f6cd0095fa2a61271a1c6b2812af96732964a92ed3e96a78c81919c9ef11e724" gracePeriod=30 Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.978564 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 11:18:28 crc kubenswrapper[4766]: I1002 11:18:28.987027 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.016569 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.016897 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b869942b-07a4-4a08-b312-2b09cee2abf1" containerName="nova-api-log" containerID="cri-o://1aee3ffe3cc0a7ee677e4497600a6cf21df562e9f835edca4df6d4e59c78e1d2" gracePeriod=30 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.017342 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b869942b-07a4-4a08-b312-2b09cee2abf1" containerName="nova-api-api" containerID="cri-o://814f365245b0829017a26fc43722610ee727739807d4d801998edff378def268" gracePeriod=30 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.091415 4766 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell16336-account-delete-8q58s" secret="" err="secret \"galera-openstack-cell1-dockercfg-q2pfv\" not found" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.091491 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell16336-account-delete-8q58s" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.120361 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.129952 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="874d062e-d2f8-462c-95b3-8f630b7120af" containerName="rabbitmq" containerID="cri-o://197d65982becb7d1b3560136e8ef3d13532ea4c97a2f09eb585ab09256067519" gracePeriod=604800 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.144286 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.182916 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-lvlsh"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.186850 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapibccc-account-delete-n7pf6" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.211790 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-lvlsh"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.214941 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-ovsdbserver-nb\") pod \"9fdf37c9-9a32-4103-8418-198d45d14415\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.214991 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-ovsdbserver-sb\") pod \"9fdf37c9-9a32-4103-8418-198d45d14415\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.215168 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-85jtl"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.235631 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-dns-svc\") pod \"9fdf37c9-9a32-4103-8418-198d45d14415\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.235916 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-config\") pod \"9fdf37c9-9a32-4103-8418-198d45d14415\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.235956 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jhsh\" (UniqueName: \"kubernetes.io/projected/9fdf37c9-9a32-4103-8418-198d45d14415-kube-api-access-7jhsh\") pod \"9fdf37c9-9a32-4103-8418-198d45d14415\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.235997 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-dns-swift-storage-0\") pod \"9fdf37c9-9a32-4103-8418-198d45d14415\" (UID: \"9fdf37c9-9a32-4103-8418-198d45d14415\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.236081 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0be46caa-0351-4f60-b16b-a258b9874a6f-catalog-content\") pod \"0be46caa-0351-4f60-b16b-a258b9874a6f\" (UID: \"0be46caa-0351-4f60-b16b-a258b9874a6f\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.236101 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mkdd\" (UniqueName: \"kubernetes.io/projected/0be46caa-0351-4f60-b16b-a258b9874a6f-kube-api-access-8mkdd\") pod \"0be46caa-0351-4f60-b16b-a258b9874a6f\" (UID: \"0be46caa-0351-4f60-b16b-a258b9874a6f\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.236125 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0be46caa-0351-4f60-b16b-a258b9874a6f-utilities\") pod \"0be46caa-0351-4f60-b16b-a258b9874a6f\" (UID: 
\"0be46caa-0351-4f60-b16b-a258b9874a6f\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.244338 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0be46caa-0351-4f60-b16b-a258b9874a6f-utilities" (OuterVolumeSpecName: "utilities") pod "0be46caa-0351-4f60-b16b-a258b9874a6f" (UID: "0be46caa-0351-4f60-b16b-a258b9874a6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.245788 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fdf37c9-9a32-4103-8418-198d45d14415-kube-api-access-7jhsh" (OuterVolumeSpecName: "kube-api-access-7jhsh") pod "9fdf37c9-9a32-4103-8418-198d45d14415" (UID: "9fdf37c9-9a32-4103-8418-198d45d14415"). InnerVolumeSpecName "kube-api-access-7jhsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.248539 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-bccc-account-create-qv4hh"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.251694 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="4b9bc510-a878-4e06-8db9-fd6209039c75" containerName="galera" containerID="cri-o://b13934ad964a9b61c115770887bcefe16bbb075f526931fb0fa8c919bf1f20b6" gracePeriod=30 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.252768 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be46caa-0351-4f60-b16b-a258b9874a6f-kube-api-access-8mkdd" (OuterVolumeSpecName: "kube-api-access-8mkdd") pod "0be46caa-0351-4f60-b16b-a258b9874a6f" (UID: "0be46caa-0351-4f60-b16b-a258b9874a6f"). InnerVolumeSpecName "kube-api-access-8mkdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.253244 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-85jtl"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.260689 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-tmhfd_95842554-1651-4c34-b934-d4eb21c6c52d/openstack-network-exporter/0.log" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.260897 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.277764 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6b90dab1-a183-4adc-b415-b67bd0d782f7/ovsdbserver-nb/0.log" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.277823 4766 generic.go:334] "Generic (PLEG): container finished" podID="6b90dab1-a183-4adc-b415-b67bd0d782f7" containerID="f7d477d76ec53de2f17dbd54641073519838399c06bbcb97d465a22ac565351a" exitCode=2 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.277841 4766 generic.go:334] "Generic (PLEG): container finished" podID="6b90dab1-a183-4adc-b415-b67bd0d782f7" containerID="a138202f6e7dfaa68f96f9c8f7a8329c46aee3fb87a8289a0c8e738c5d41167a" exitCode=143 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.277879 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6b90dab1-a183-4adc-b415-b67bd0d782f7","Type":"ContainerDied","Data":"f7d477d76ec53de2f17dbd54641073519838399c06bbcb97d465a22ac565351a"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.277907 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6b90dab1-a183-4adc-b415-b67bd0d782f7","Type":"ContainerDied","Data":"a138202f6e7dfaa68f96f9c8f7a8329c46aee3fb87a8289a0c8e738c5d41167a"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.281286 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-bccc-account-create-qv4hh"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.282765 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dp6x5" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.294789 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0be46caa-0351-4f60-b16b-a258b9874a6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0be46caa-0351-4f60-b16b-a258b9874a6f" (UID: "0be46caa-0351-4f60-b16b-a258b9874a6f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.294882 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-tmhfd_95842554-1651-4c34-b934-d4eb21c6c52d/openstack-network-exporter/0.log" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.294999 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-tmhfd" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.295006 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-tmhfd" event={"ID":"95842554-1651-4c34-b934-d4eb21c6c52d","Type":"ContainerDied","Data":"a99bc15d6e4c358750c4c8a43063b576afd891ed37e5422e08006106c20d54e3"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.295252 4766 scope.go:117] "RemoveContainer" containerID="85d92d1d1bf439fc0a8bfe76963371dba146b935670f36dd525197b9b9036410" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.297672 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9d41-account-create-gdwhr"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.307589 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapibccc-account-delete-n7pf6"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.311178 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9d41-account-create-gdwhr"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.345015 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-log-ovn\") pod \"9860354f-7494-4b02-bca3-adc731683f7f\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.345064 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95842554-1651-4c34-b934-d4eb21c6c52d-config\") pod \"95842554-1651-4c34-b934-d4eb21c6c52d\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.345118 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9860354f-7494-4b02-bca3-adc731683f7f-scripts\") pod \"9860354f-7494-4b02-bca3-adc731683f7f\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.345133 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/95842554-1651-4c34-b934-d4eb21c6c52d-ovn-rundir\") pod \"95842554-1651-4c34-b934-d4eb21c6c52d\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.345180 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/95842554-1651-4c34-b934-d4eb21c6c52d-ovs-rundir\") pod \"95842554-1651-4c34-b934-d4eb21c6c52d\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.345305 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9860354f-7494-4b02-bca3-adc731683f7f-combined-ca-bundle\") pod \"9860354f-7494-4b02-bca3-adc731683f7f\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.345343 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-run\") pod \"9860354f-7494-4b02-bca3-adc731683f7f\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.345376 
4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-265lf\" (UniqueName: \"kubernetes.io/projected/95842554-1651-4c34-b934-d4eb21c6c52d-kube-api-access-265lf\") pod \"95842554-1651-4c34-b934-d4eb21c6c52d\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.345399 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/95842554-1651-4c34-b934-d4eb21c6c52d-metrics-certs-tls-certs\") pod \"95842554-1651-4c34-b934-d4eb21c6c52d\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.349936 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9860354f-7494-4b02-bca3-adc731683f7f-ovn-controller-tls-certs\") pod \"9860354f-7494-4b02-bca3-adc731683f7f\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.350061 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw5zq\" (UniqueName: \"kubernetes.io/projected/9860354f-7494-4b02-bca3-adc731683f7f-kube-api-access-tw5zq\") pod \"9860354f-7494-4b02-bca3-adc731683f7f\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.350119 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-run-ovn\") pod \"9860354f-7494-4b02-bca3-adc731683f7f\" (UID: \"9860354f-7494-4b02-bca3-adc731683f7f\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.350177 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95842554-1651-4c34-b934-d4eb21c6c52d-combined-ca-bundle\") pod \"95842554-1651-4c34-b934-d4eb21c6c52d\" (UID: \"95842554-1651-4c34-b934-d4eb21c6c52d\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.350858 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jhsh\" (UniqueName: \"kubernetes.io/projected/9fdf37c9-9a32-4103-8418-198d45d14415-kube-api-access-7jhsh\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.350879 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0be46caa-0351-4f60-b16b-a258b9874a6f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.350888 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mkdd\" (UniqueName: \"kubernetes.io/projected/0be46caa-0351-4f60-b16b-a258b9874a6f-kube-api-access-8mkdd\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.350897 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0be46caa-0351-4f60-b16b-a258b9874a6f-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: E1002 11:18:29.354805 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a138202f6e7dfaa68f96f9c8f7a8329c46aee3fb87a8289a0c8e738c5d41167a is running failed: container process not found" 
containerID="a138202f6e7dfaa68f96f9c8f7a8329c46aee3fb87a8289a0c8e738c5d41167a" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.354958 4766 generic.go:334] "Generic (PLEG): container finished" podID="bbb13294-05c1-4a20-8265-5144efcd91cf" containerID="bd5acfd9f8b6e410882799f1e42f4fb64c5bdc13db56d36a3ca2d32eb57ddec0" exitCode=143 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.355004 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bbb13294-05c1-4a20-8265-5144efcd91cf","Type":"ContainerDied","Data":"bd5acfd9f8b6e410882799f1e42f4fb64c5bdc13db56d36a3ca2d32eb57ddec0"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.355920 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-run" (OuterVolumeSpecName: "var-run") pod "9860354f-7494-4b02-bca3-adc731683f7f" (UID: "9860354f-7494-4b02-bca3-adc731683f7f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.359620 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9860354f-7494-4b02-bca3-adc731683f7f-scripts" (OuterVolumeSpecName: "scripts") pod "9860354f-7494-4b02-bca3-adc731683f7f" (UID: "9860354f-7494-4b02-bca3-adc731683f7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.363758 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "9860354f-7494-4b02-bca3-adc731683f7f" (UID: "9860354f-7494-4b02-bca3-adc731683f7f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.364040 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95842554-1651-4c34-b934-d4eb21c6c52d-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "95842554-1651-4c34-b934-d4eb21c6c52d" (UID: "95842554-1651-4c34-b934-d4eb21c6c52d"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.364053 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/95842554-1651-4c34-b934-d4eb21c6c52d-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "95842554-1651-4c34-b934-d4eb21c6c52d" (UID: "95842554-1651-4c34-b934-d4eb21c6c52d"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.364062 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "9860354f-7494-4b02-bca3-adc731683f7f" (UID: "9860354f-7494-4b02-bca3-adc731683f7f"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.364555 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95842554-1651-4c34-b934-d4eb21c6c52d-config" (OuterVolumeSpecName: "config") pod "95842554-1651-4c34-b934-d4eb21c6c52d" (UID: "95842554-1651-4c34-b934-d4eb21c6c52d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: E1002 11:18:29.363906 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a138202f6e7dfaa68f96f9c8f7a8329c46aee3fb87a8289a0c8e738c5d41167a is running failed: container process not found" containerID="a138202f6e7dfaa68f96f9c8f7a8329c46aee3fb87a8289a0c8e738c5d41167a" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.367187 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9fdf37c9-9a32-4103-8418-198d45d14415" (UID: "9fdf37c9-9a32-4103-8418-198d45d14415"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: E1002 11:18:29.373277 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a138202f6e7dfaa68f96f9c8f7a8329c46aee3fb87a8289a0c8e738c5d41167a is running failed: container process not found" containerID="a138202f6e7dfaa68f96f9c8f7a8329c46aee3fb87a8289a0c8e738c5d41167a" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 02 11:18:29 crc kubenswrapper[4766]: E1002 11:18:29.373345 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a138202f6e7dfaa68f96f9c8f7a8329c46aee3fb87a8289a0c8e738c5d41167a is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-nb-0" podUID="6b90dab1-a183-4adc-b415-b67bd0d782f7" containerName="ovsdbserver-nb" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.373650 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-664d98ccd8-hh5xk"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.375815 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" podUID="d9339929-4331-4cd9-89bc-8350ef2f55f5" containerName="barbican-keystone-listener-log" containerID="cri-o://dc1353dac7e3a318b9bf88253e7621b0e0c300fbb3ed030d2c367fb3cffe1ca0" gracePeriod=30 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.376379 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" podUID="d9339929-4331-4cd9-89bc-8350ef2f55f5" containerName="barbican-keystone-listener" containerID="cri-o://973c651619479183947224e4242097f161cf673b8da2935544f44e1700d072b9" gracePeriod=30 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.381797 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95842554-1651-4c34-b934-d4eb21c6c52d-kube-api-access-265lf" (OuterVolumeSpecName: "kube-api-access-265lf") pod "95842554-1651-4c34-b934-d4eb21c6c52d" (UID: 
"95842554-1651-4c34-b934-d4eb21c6c52d"). InnerVolumeSpecName "kube-api-access-265lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.382876 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9860354f-7494-4b02-bca3-adc731683f7f-kube-api-access-tw5zq" (OuterVolumeSpecName: "kube-api-access-tw5zq") pod "9860354f-7494-4b02-bca3-adc731683f7f" (UID: "9860354f-7494-4b02-bca3-adc731683f7f"). InnerVolumeSpecName "kube-api-access-tw5zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.383198 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.384368 4766 generic.go:334] "Generic (PLEG): container finished" podID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" exitCode=0 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.384490 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8wzw9" event={"ID":"d90db976-cd03-4eb7-8e1d-361ef7c5045b","Type":"ContainerDied","Data":"3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.389194 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-75997cdf8b-nnlzj"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.389471 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-75997cdf8b-nnlzj" podUID="94c8c5ed-b069-4112-ae71-d9071bc15ff2" containerName="barbican-api-log" containerID="cri-o://1739e7ba00b0bf9bb771dca9d289d586ed12ce58d682ed8f6ac6cf039138a291" gracePeriod=30 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.389789 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-75997cdf8b-nnlzj" podUID="94c8c5ed-b069-4112-ae71-d9071bc15ff2" containerName="barbican-api" containerID="cri-o://6cd538b1cdf3993f6cd959aebdda72d9055a7730e3cb15b1a555f7da8b9b1353" gracePeriod=30 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.392044 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2be5e935-0d64-4fed-a00a-bd0cb5891e75/ovsdbserver-sb/0.log" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.399420 4766 generic.go:334] "Generic (PLEG): container finished" podID="2be5e935-0d64-4fed-a00a-bd0cb5891e75" containerID="763fc40370b8c637b94e717b3d4022b399a9064357843aa69085ce7652a30954" exitCode=2 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.401473 4766 generic.go:334] "Generic (PLEG): container finished" podID="2be5e935-0d64-4fed-a00a-bd0cb5891e75" containerID="2b993a56ebc1b37c4f1f1ec65f43687a9297e994b331c1c27c7eef4dd4bf4c33" exitCode=143 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.394640 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-f5f68d797-k4qqv"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.401789 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2be5e935-0d64-4fed-a00a-bd0cb5891e75","Type":"ContainerDied","Data":"763fc40370b8c637b94e717b3d4022b399a9064357843aa69085ce7652a30954"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.402664 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-sb-0" event={"ID":"2be5e935-0d64-4fed-a00a-bd0cb5891e75","Type":"ContainerDied","Data":"2b993a56ebc1b37c4f1f1ec65f43687a9297e994b331c1c27c7eef4dd4bf4c33"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.403530 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-f5f68d797-k4qqv" podUID="8d43eab0-4595-42fc-8489-38792e0c6e19" containerName="barbican-worker-log" containerID="cri-o://f1a0b9913341b1fffedbb9296ca3e24f9abbcd44a25a0528cebe5da010d355e1" gracePeriod=30 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.404803 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-f5f68d797-k4qqv" podUID="8d43eab0-4595-42fc-8489-38792e0c6e19" containerName="barbican-worker" containerID="cri-o://433ae393df6f772a4b0964a7a633bfcdca8d7d78296edfa4ca875b807cacbd06" gracePeriod=30 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.424574 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9fdf37c9-9a32-4103-8418-198d45d14415" (UID: "9fdf37c9-9a32-4103-8418-198d45d14415"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.424960 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.426923 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="123b65f7-a8e8-434b-baf1-e9b0d3a985d9" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://805fa4defeec778eb8f810a670bb90a7f802c044adf651111138c5f240d5a4ad" gracePeriod=30 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.453384 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d896308d-0b8a-4cfc-ad92-311521c2e417-openstack-config\") pod \"d896308d-0b8a-4cfc-ad92-311521c2e417\" (UID: \"d896308d-0b8a-4cfc-ad92-311521c2e417\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.453483 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d896308d-0b8a-4cfc-ad92-311521c2e417-openstack-config-secret\") pod \"d896308d-0b8a-4cfc-ad92-311521c2e417\" (UID: \"d896308d-0b8a-4cfc-ad92-311521c2e417\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.453576 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvg8r\" (UniqueName: \"kubernetes.io/projected/d896308d-0b8a-4cfc-ad92-311521c2e417-kube-api-access-pvg8r\") pod \"d896308d-0b8a-4cfc-ad92-311521c2e417\" (UID: \"d896308d-0b8a-4cfc-ad92-311521c2e417\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.453628 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d896308d-0b8a-4cfc-ad92-311521c2e417-combined-ca-bundle\") pod \"d896308d-0b8a-4cfc-ad92-311521c2e417\" (UID: \"d896308d-0b8a-4cfc-ad92-311521c2e417\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.454136 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw5zq\" (UniqueName: 
\"kubernetes.io/projected/9860354f-7494-4b02-bca3-adc731683f7f-kube-api-access-tw5zq\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.454150 4766 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.454158 4766 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.454171 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95842554-1651-4c34-b934-d4eb21c6c52d-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.454178 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9860354f-7494-4b02-bca3-adc731683f7f-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.454186 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/95842554-1651-4c34-b934-d4eb21c6c52d-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.454194 4766 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/95842554-1651-4c34-b934-d4eb21c6c52d-ovs-rundir\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.454201 4766 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9860354f-7494-4b02-bca3-adc731683f7f-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.454210 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-265lf\" (UniqueName: \"kubernetes.io/projected/95842554-1651-4c34-b934-d4eb21c6c52d-kube-api-access-265lf\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.454218 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.454226 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.471306 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9fdf37c9-9a32-4103-8418-198d45d14415" (UID: "9fdf37c9-9a32-4103-8418-198d45d14415"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.478974 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.479423 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d896308d-0b8a-4cfc-ad92-311521c2e417-kube-api-access-pvg8r" (OuterVolumeSpecName: "kube-api-access-pvg8r") pod "d896308d-0b8a-4cfc-ad92-311521c2e417" (UID: "d896308d-0b8a-4cfc-ad92-311521c2e417"). InnerVolumeSpecName "kube-api-access-pvg8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491381 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerID="d4d2f3213d0d8347ad4fc0fda1819961e3e48ca84f6a1a08530c3e457a9145ee" exitCode=0 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491416 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerID="1b4624fea8b62d613092718192f8a9e9faa6138904a93866528c86735b20b493" exitCode=0 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491428 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerID="018613b4ace3495529050cc51bbcbc25c762db064d56bf3f3b2fa7c0ac1213cf" exitCode=0 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491438 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerID="711bf815d45f9226c3bfc08294214785c60a9b9bd2e213b6e5c41884b2a87710" exitCode=0 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491446 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerID="2f0ac5d9172eee436579154fd4936b18259605ce7d7deaad27a10a2c5cdbf7f4" exitCode=0 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491453 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerID="37549a28195c14e158d5a907b7e78a1e6d69aa6e6efd13ac3ffc312e52ea12e6" exitCode=0 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491461 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerID="2d24232fdec7040ef6a5964e5689deb80c9416e2b9a5928b6c44a05db6a24a58" exitCode=0 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491468 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerID="3647b01248aaadded5263e23d36e7d1f488a1f32c6f5c9c8f0363cd9c896464c" exitCode=0 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491475 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerID="a6e14799401d44ada3dfee676cdeb9c70cb0d90eacc2dca91f8d1079c24c183c" exitCode=0 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491483 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerID="9da3dd83501b5003165c1f29947b2365058f97cf1d114c7cef65d3123aa7bf9a" exitCode=0 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491490 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerID="33cb37528efacc79ce75b3a9c57737f3259f3f0c3402a200d781a3fa066c92aa" exitCode=0 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491516 
4766 generic.go:334] "Generic (PLEG): container finished" podID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerID="da7cf8c62c28a42c7de4196b70da74f94b7baed423d08b118379806f1e593aa6" exitCode=0 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491526 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerID="3df0bb832b8ac1eab32d7e7a4d74b6f6630e1a9104832fae36b218c6eafdc058" exitCode=0 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491576 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerDied","Data":"d4d2f3213d0d8347ad4fc0fda1819961e3e48ca84f6a1a08530c3e457a9145ee"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491602 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerDied","Data":"1b4624fea8b62d613092718192f8a9e9faa6138904a93866528c86735b20b493"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491615 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerDied","Data":"018613b4ace3495529050cc51bbcbc25c762db064d56bf3f3b2fa7c0ac1213cf"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491628 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerDied","Data":"711bf815d45f9226c3bfc08294214785c60a9b9bd2e213b6e5c41884b2a87710"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491637 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerDied","Data":"2f0ac5d9172eee436579154fd4936b18259605ce7d7deaad27a10a2c5cdbf7f4"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491647 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerDied","Data":"37549a28195c14e158d5a907b7e78a1e6d69aa6e6efd13ac3ffc312e52ea12e6"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491658 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerDied","Data":"2d24232fdec7040ef6a5964e5689deb80c9416e2b9a5928b6c44a05db6a24a58"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491667 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerDied","Data":"3647b01248aaadded5263e23d36e7d1f488a1f32c6f5c9c8f0363cd9c896464c"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491677 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerDied","Data":"a6e14799401d44ada3dfee676cdeb9c70cb0d90eacc2dca91f8d1079c24c183c"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491686 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerDied","Data":"9da3dd83501b5003165c1f29947b2365058f97cf1d114c7cef65d3123aa7bf9a"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491695 4766 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerDied","Data":"33cb37528efacc79ce75b3a9c57737f3259f3f0c3402a200d781a3fa066c92aa"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491707 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerDied","Data":"da7cf8c62c28a42c7de4196b70da74f94b7baed423d08b118379806f1e593aa6"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.491717 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerDied","Data":"3df0bb832b8ac1eab32d7e7a4d74b6f6630e1a9104832fae36b218c6eafdc058"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.493614 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.493807 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7dec5495-e66b-4e5e-90b6-82ee673ab269" containerName="nova-scheduler-scheduler" containerID="cri-o://07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936" gracePeriod=30 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.507280 4766 generic.go:334] "Generic (PLEG): container finished" podID="7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" containerID="3d18a6249adb9873ab43d08de93aedc55c827c33e60c6d8d86226fc15f98872a" exitCode=143 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.507365 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84bf49766d-bbf2p" event={"ID":"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca","Type":"ContainerDied","Data":"3d18a6249adb9873ab43d08de93aedc55c827c33e60c6d8d86226fc15f98872a"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.539421 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j8gq7" event={"ID":"0be46caa-0351-4f60-b16b-a258b9874a6f","Type":"ContainerDied","Data":"00a0052c7e29165f32eaa5c30c84a415bee37c4d36c746885bcd91e60ff7a3ac"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.539476 4766 scope.go:117] "RemoveContainer" containerID="5116c39cd5e1f3749deb2a96cecbb5c060c4b374035872069ea7256e1fb94c1e" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.539555 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-j8gq7" Oct 02 11:18:29 crc kubenswrapper[4766]: E1002 11:18:29.546283 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b993a56ebc1b37c4f1f1ec65f43687a9297e994b331c1c27c7eef4dd4bf4c33 is running failed: container process not found" containerID="2b993a56ebc1b37c4f1f1ec65f43687a9297e994b331c1c27c7eef4dd4bf4c33" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 02 11:18:29 crc kubenswrapper[4766]: E1002 11:18:29.553913 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b993a56ebc1b37c4f1f1ec65f43687a9297e994b331c1c27c7eef4dd4bf4c33 is running failed: container process not found" containerID="2b993a56ebc1b37c4f1f1ec65f43687a9297e994b331c1c27c7eef4dd4bf4c33" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 02 11:18:29 crc kubenswrapper[4766]: E1002 11:18:29.556072 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 02 11:18:29 crc kubenswrapper[4766]: E1002 11:18:29.556437 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data podName:1282b506-728d-4c6f-aa9c-3d3c1f826b71 nodeName:}" failed. No retries permitted until 2025-10-02 11:18:31.556416642 +0000 UTC m=+1626.499287586 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data") pod "rabbitmq-server-0" (UID: "1282b506-728d-4c6f-aa9c-3d3c1f826b71") : configmap "rabbitmq-config-data" not found Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.556101 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvg8r\" (UniqueName: \"kubernetes.io/projected/d896308d-0b8a-4cfc-ad92-311521c2e417-kube-api-access-pvg8r\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.556474 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: E1002 11:18:29.561004 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b993a56ebc1b37c4f1f1ec65f43687a9297e994b331c1c27c7eef4dd4bf4c33 is running failed: container process not found" containerID="2b993a56ebc1b37c4f1f1ec65f43687a9297e994b331c1c27c7eef4dd4bf4c33" cmd=["/usr/bin/pidof","ovsdb-server"] Oct 02 11:18:29 crc kubenswrapper[4766]: E1002 11:18:29.561295 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b993a56ebc1b37c4f1f1ec65f43687a9297e994b331c1c27c7eef4dd4bf4c33 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="2be5e935-0d64-4fed-a00a-bd0cb5891e75" containerName="ovsdbserver-sb" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.561629 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95842554-1651-4c34-b934-d4eb21c6c52d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95842554-1651-4c34-b934-d4eb21c6c52d" (UID: 
"95842554-1651-4c34-b934-d4eb21c6c52d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.566109 4766 generic.go:334] "Generic (PLEG): container finished" podID="ec8cdac7-81c9-41e7-a956-41d13e5b91a6" containerID="f5124828eb5c9f7610af6224039bb25da1c36b7f880181f818d2c8a8eefcf481" exitCode=0 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.566237 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55f8b9d7c-hfdcr" event={"ID":"ec8cdac7-81c9-41e7-a956-41d13e5b91a6","Type":"ContainerDied","Data":"f5124828eb5c9f7610af6224039bb25da1c36b7f880181f818d2c8a8eefcf481"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.571404 4766 generic.go:334] "Generic (PLEG): container finished" podID="0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" containerID="1e54d0211edb5bd37091707d979fb377b49c8ac9d64e30887c84f1c90a8d9682" exitCode=143 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.572241 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598","Type":"ContainerDied","Data":"1e54d0211edb5bd37091707d979fb377b49c8ac9d64e30887c84f1c90a8d9682"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.576851 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d896308d-0b8a-4cfc-ad92-311521c2e417-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d896308d-0b8a-4cfc-ad92-311521c2e417" (UID: "d896308d-0b8a-4cfc-ad92-311521c2e417"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.579449 4766 generic.go:334] "Generic (PLEG): container finished" podID="c3384223-2ad0-4593-976c-54c2d3cce52e" containerID="f56abd7e1794b77ea62792a0bd79f484e63160d0da2a2b9689b20ab708f80f3a" exitCode=143 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.579558 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3384223-2ad0-4593-976c-54c2d3cce52e","Type":"ContainerDied","Data":"f56abd7e1794b77ea62792a0bd79f484e63160d0da2a2b9689b20ab708f80f3a"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.587952 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65" event={"ID":"9fdf37c9-9a32-4103-8418-198d45d14415","Type":"ContainerDied","Data":"ce489e596e9ddd769df973bc1233a9dc847025a10294fe124d70398d646e919e"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.588042 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-gbg65" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.629984 4766 generic.go:334] "Generic (PLEG): container finished" podID="6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" containerID="4fa69755c8413dc8a6865886029bf8ec092fa254162c708d148185b7305c545c" exitCode=0 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.630094 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6","Type":"ContainerDied","Data":"4fa69755c8413dc8a6865886029bf8ec092fa254162c708d148185b7305c545c"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.633716 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d896308d-0b8a-4cfc-ad92-311521c2e417-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d896308d-0b8a-4cfc-ad92-311521c2e417" (UID: "d896308d-0b8a-4cfc-ad92-311521c2e417"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.642182 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9fdf37c9-9a32-4103-8418-198d45d14415" (UID: "9fdf37c9-9a32-4103-8418-198d45d14415"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.653871 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="1282b506-728d-4c6f-aa9c-3d3c1f826b71" containerName="rabbitmq" containerID="cri-o://366bc0e5d2fdf94dff0819d1232ca88a875b2fd3ae879cab99a8d75f0ceae62c" gracePeriod=604800 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.660002 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d896308d-0b8a-4cfc-ad92-311521c2e417-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.660022 4766 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.660031 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95842554-1651-4c34-b934-d4eb21c6c52d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.660039 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d896308d-0b8a-4cfc-ad92-311521c2e417-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.678741 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9860354f-7494-4b02-bca3-adc731683f7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9860354f-7494-4b02-bca3-adc731683f7f" (UID: "9860354f-7494-4b02-bca3-adc731683f7f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.688576 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dp6x5" event={"ID":"9860354f-7494-4b02-bca3-adc731683f7f","Type":"ContainerDied","Data":"c4d3c0190f6dd53c11762d08376b0bf12ee82e4e1a6d3719a4a765474ff672fe"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.691953 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dp6x5" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.694399 4766 scope.go:117] "RemoveContainer" containerID="ff40e39793793465f75fb22c204001262aaca34bc6d55d604cabba72ae0c9eb7" Oct 02 11:18:29 crc kubenswrapper[4766]: W1002 11:18:29.697677 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52bdcfda_75b3_450b_9db4_1a443be18fa3.slice/crio-27516caab2b7be54b72e47aeca7afacf37503a3cafd4099db0546c54520ba38e WatchSource:0}: Error finding container 27516caab2b7be54b72e47aeca7afacf37503a3cafd4099db0546c54520ba38e: Status 404 returned error can't find the container with id 27516caab2b7be54b72e47aeca7afacf37503a3cafd4099db0546c54520ba38e Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.707073 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementd1c9-account-delete-j6zwl"] Oct 02 11:18:29 crc kubenswrapper[4766]: W1002 11:18:29.712378 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fac2bf1_fb0f_4031_bfc8_34090cc90c8a.slice/crio-0b6b15f103a052f81b9c293f86b9646a141dc83affc6a945de5fce8dbcb53ba8 WatchSource:0}: Error finding container 0b6b15f103a052f81b9c293f86b9646a141dc83affc6a945de5fce8dbcb53ba8: Status 404 returned error can't find the container with id 0b6b15f103a052f81b9c293f86b9646a141dc83affc6a945de5fce8dbcb53ba8 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.713383 4766 generic.go:334] "Generic (PLEG): container finished" podID="b869942b-07a4-4a08-b312-2b09cee2abf1" containerID="1aee3ffe3cc0a7ee677e4497600a6cf21df562e9f835edca4df6d4e59c78e1d2" exitCode=143 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.713448 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b869942b-07a4-4a08-b312-2b09cee2abf1","Type":"ContainerDied","Data":"1aee3ffe3cc0a7ee677e4497600a6cf21df562e9f835edca4df6d4e59c78e1d2"} Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.714461 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder891f-account-delete-2hm8h"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.722731 4766 generic.go:334] "Generic (PLEG): container finished" podID="d896308d-0b8a-4cfc-ad92-311521c2e417" containerID="3f59b786bb34a976dbad318bdbfed8db8ed47518b0fa788de2b6cb253622cf0d" exitCode=137 Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.722823 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.738430 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-config" (OuterVolumeSpecName: "config") pod "9fdf37c9-9a32-4103-8418-198d45d14415" (UID: "9fdf37c9-9a32-4103-8418-198d45d14415"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.753165 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicance01-account-delete-jrx2r"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.762121 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9860354f-7494-4b02-bca3-adc731683f7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.762144 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fdf37c9-9a32-4103-8418-198d45d14415-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.773577 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance0d4d-account-delete-7cb92"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.787836 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell16336-account-delete-8q58s"] Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.791747 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d896308d-0b8a-4cfc-ad92-311521c2e417-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d896308d-0b8a-4cfc-ad92-311521c2e417" (UID: "d896308d-0b8a-4cfc-ad92-311521c2e417"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.800849 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95842554-1651-4c34-b934-d4eb21c6c52d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "95842554-1651-4c34-b934-d4eb21c6c52d" (UID: "95842554-1651-4c34-b934-d4eb21c6c52d"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.809973 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9860354f-7494-4b02-bca3-adc731683f7f-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "9860354f-7494-4b02-bca3-adc731683f7f" (UID: "9860354f-7494-4b02-bca3-adc731683f7f"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: E1002 11:18:29.813568 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:18:29 crc kubenswrapper[4766]: E1002 11:18:29.821641 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:18:29 crc kubenswrapper[4766]: E1002 11:18:29.823611 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:18:29 crc kubenswrapper[4766]: E1002 11:18:29.823685 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="7dec5495-e66b-4e5e-90b6-82ee673ab269" containerName="nova-scheduler-scheduler" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.863820 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/95842554-1651-4c34-b934-d4eb21c6c52d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.863858 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/9860354f-7494-4b02-bca3-adc731683f7f-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.863870 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d896308d-0b8a-4cfc-ad92-311521c2e417-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.881580 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:18:29 crc kubenswrapper[4766]: E1002 11:18:29.881818 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.904236 4766 scope.go:117] "RemoveContainer" containerID="38264158d4d2ca7725c3821356e1e5e055e67de3a4b5aef6daa927a1cadd9932" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.947566 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ecd07b2-f47c-44c2-8c54-943c8c91ef0f" 
path="/var/lib/kubelet/pods/0ecd07b2-f47c-44c2-8c54-943c8c91ef0f/volumes" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.949030 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_6b90dab1-a183-4adc-b415-b67bd0d782f7/ovsdbserver-nb/0.log" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.949106 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.950072 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a0d201d-b9b8-49e0-b51a-9e187d4b4441" path="/var/lib/kubelet/pods/1a0d201d-b9b8-49e0-b51a-9e187d4b4441/volumes" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.952204 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e6935d6-839b-4f02-86f4-b79ad98cf891" path="/var/lib/kubelet/pods/2e6935d6-839b-4f02-86f4-b79ad98cf891/volumes" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.965151 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6b90dab1-a183-4adc-b415-b67bd0d782f7-ovsdb-rundir\") pod \"6b90dab1-a183-4adc-b415-b67bd0d782f7\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.965210 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"6b90dab1-a183-4adc-b415-b67bd0d782f7\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.965258 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdkkt\" (UniqueName: \"kubernetes.io/projected/6b90dab1-a183-4adc-b415-b67bd0d782f7-kube-api-access-zdkkt\") pod \"6b90dab1-a183-4adc-b415-b67bd0d782f7\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.965314 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-metrics-certs-tls-certs\") pod \"6b90dab1-a183-4adc-b415-b67bd0d782f7\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.965354 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b90dab1-a183-4adc-b415-b67bd0d782f7-scripts\") pod \"6b90dab1-a183-4adc-b415-b67bd0d782f7\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.965404 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-ovsdbserver-nb-tls-certs\") pod \"6b90dab1-a183-4adc-b415-b67bd0d782f7\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.965439 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-combined-ca-bundle\") pod \"6b90dab1-a183-4adc-b415-b67bd0d782f7\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.965486 4766 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b90dab1-a183-4adc-b415-b67bd0d782f7-config\") pod \"6b90dab1-a183-4adc-b415-b67bd0d782f7\" (UID: \"6b90dab1-a183-4adc-b415-b67bd0d782f7\") " Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.966471 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b90dab1-a183-4adc-b415-b67bd0d782f7-scripts" (OuterVolumeSpecName: "scripts") pod "6b90dab1-a183-4adc-b415-b67bd0d782f7" (UID: "6b90dab1-a183-4adc-b415-b67bd0d782f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.966788 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3061df7e-4dd6-4340-88af-67b6d9b3a6b7" path="/var/lib/kubelet/pods/3061df7e-4dd6-4340-88af-67b6d9b3a6b7/volumes" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.967912 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58a480c5-a9e3-46da-b3df-4d73473d4b12" path="/var/lib/kubelet/pods/58a480c5-a9e3-46da-b3df-4d73473d4b12/volumes" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.971096 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b90dab1-a183-4adc-b415-b67bd0d782f7-kube-api-access-zdkkt" (OuterVolumeSpecName: "kube-api-access-zdkkt") pod "6b90dab1-a183-4adc-b415-b67bd0d782f7" (UID: "6b90dab1-a183-4adc-b415-b67bd0d782f7"). InnerVolumeSpecName "kube-api-access-zdkkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.971173 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d1ba928-5e2f-45dc-a660-f1c9fc375829" path="/var/lib/kubelet/pods/5d1ba928-5e2f-45dc-a660-f1c9fc375829/volumes" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.971698 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "6b90dab1-a183-4adc-b415-b67bd0d782f7" (UID: "6b90dab1-a183-4adc-b415-b67bd0d782f7"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.972366 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60439b73-f45d-40e6-a492-8135da66a9a4" path="/var/lib/kubelet/pods/60439b73-f45d-40e6-a492-8135da66a9a4/volumes" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.974002 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe" path="/var/lib/kubelet/pods/7f9aceb9-6475-4e50-bcd6-eb9c897b2dbe/volumes" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.975010 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="858ff098-bb65-418c-832e-9cd9d8cd75d6" path="/var/lib/kubelet/pods/858ff098-bb65-418c-832e-9cd9d8cd75d6/volumes" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.976611 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b90dab1-a183-4adc-b415-b67bd0d782f7-config" (OuterVolumeSpecName: "config") pod "6b90dab1-a183-4adc-b415-b67bd0d782f7" (UID: "6b90dab1-a183-4adc-b415-b67bd0d782f7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.976912 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b90dab1-a183-4adc-b415-b67bd0d782f7-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "6b90dab1-a183-4adc-b415-b67bd0d782f7" (UID: "6b90dab1-a183-4adc-b415-b67bd0d782f7"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.977019 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b6e3b29-de5d-4b1e-a14f-b942f0653bbb" path="/var/lib/kubelet/pods/8b6e3b29-de5d-4b1e-a14f-b942f0653bbb/volumes" Oct 02 11:18:29 crc kubenswrapper[4766]: I1002 11:18:29.977687 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98d83c0b-9e98-49c8-9764-1c4a5792d1b5" path="/var/lib/kubelet/pods/98d83c0b-9e98-49c8-9764-1c4a5792d1b5/volumes" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.011107 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc2e60b1-7e85-49d4-8efb-836891b9acba" path="/var/lib/kubelet/pods/bc2e60b1-7e85-49d4-8efb-836891b9acba/volumes" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.017977 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdeac64a-3b04-4d56-af88-35d4cdddeaac" path="/var/lib/kubelet/pods/bdeac64a-3b04-4d56-af88-35d4cdddeaac/volumes" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.025969 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7037a9b-8f0f-4595-892e-3106080371d6" path="/var/lib/kubelet/pods/d7037a9b-8f0f-4595-892e-3106080371d6/volumes" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.026730 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d896308d-0b8a-4cfc-ad92-311521c2e417" path="/var/lib/kubelet/pods/d896308d-0b8a-4cfc-ad92-311521c2e417/volumes" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.034782 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da4d9512-afe1-45d8-ba7c-77c398a7955d" path="/var/lib/kubelet/pods/da4d9512-afe1-45d8-ba7c-77c398a7955d/volumes" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.038632 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf4b594-4776-4a98-acc2-75de39597d5e" path="/var/lib/kubelet/pods/dbf4b594-4776-4a98-acc2-75de39597d5e/volumes" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.049325 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e870bd08-c21d-4537-b2bd-f19eff7e3877" path="/var/lib/kubelet/pods/e870bd08-c21d-4537-b2bd-f19eff7e3877/volumes" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.062907 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed4fc73c-1d8b-4209-871c-2971f74c963a" path="/var/lib/kubelet/pods/ed4fc73c-1d8b-4209-871c-2971f74c963a/volumes" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.052736 4766 scope.go:117] "RemoveContainer" containerID="f7d477d76ec53de2f17dbd54641073519838399c06bbcb97d465a22ac565351a" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.072472 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapibccc-account-delete-n7pf6"] Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.072514 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j8gq7"] Oct 02 11:18:30 crc kubenswrapper[4766]: 
I1002 11:18:30.072529 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j8gq7"] Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.072543 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gbg65"] Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.072553 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-gbg65"] Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.072754 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6b90dab1-a183-4adc-b415-b67bd0d782f7-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.072777 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.072787 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdkkt\" (UniqueName: \"kubernetes.io/projected/6b90dab1-a183-4adc-b415-b67bd0d782f7-kube-api-access-zdkkt\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.072796 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b90dab1-a183-4adc-b415-b67bd0d782f7-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.072805 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b90dab1-a183-4adc-b415-b67bd0d782f7-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:30 crc kubenswrapper[4766]: W1002 11:18:30.085602 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ea7203d_5727_485f_8a6a_5bde96d05078.slice/crio-46bf389b5fc88b6dbfa295f51f35ca0b5b4f0b242188fa16176efa6b752f69c6 WatchSource:0}: Error finding container 46bf389b5fc88b6dbfa295f51f35ca0b5b4f0b242188fa16176efa6b752f69c6: Status 404 returned error can't find the container with id 46bf389b5fc88b6dbfa295f51f35ca0b5b4f0b242188fa16176efa6b752f69c6 Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.130542 4766 scope.go:117] "RemoveContainer" containerID="8b6a6f87bc8fecfe651496c29a84fcb9e8cce1f6c6be0ce4f876d4b4d799b5de" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.134542 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.146458 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-tmhfd"] Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.156850 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-tmhfd"] Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.174683 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.214542 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"6b90dab1-a183-4adc-b415-b67bd0d782f7" (UID: "6b90dab1-a183-4adc-b415-b67bd0d782f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.226520 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dp6x5"] Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.228603 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2be5e935-0d64-4fed-a00a-bd0cb5891e75/ovsdbserver-sb/0.log" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.228686 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.229141 4766 scope.go:117] "RemoveContainer" containerID="c89f5c84d0409c16045c6c291ae1e61451bea1df55e8d8660824f0b613db853e" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.247649 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dp6x5"] Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.256601 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-67bd9fd99f-qbp28"] Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.257305 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-67bd9fd99f-qbp28" podUID="44893df1-77c5-494c-bae0-253447abc8f4" containerName="proxy-httpd" containerID="cri-o://ff59b3f5c87557d99a67228679782023c256d54647b29c27032c95dfbc2bd77b" gracePeriod=30 Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.257822 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-67bd9fd99f-qbp28" podUID="44893df1-77c5-494c-bae0-253447abc8f4" containerName="proxy-server" containerID="cri-o://15c74c8e7a8896b2165a6a68dd5ee3e1b21f3fcb860feba3289044c93dbf1f19" gracePeriod=30 Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.278085 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.319688 4766 scope.go:117] "RemoveContainer" containerID="a159f561aef681dec7a32ee79dcabb81731c25f79c9eafcbda290d83ab7ba093" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.356947 4766 scope.go:117] "RemoveContainer" containerID="2b993a56ebc1b37c4f1f1ec65f43687a9297e994b331c1c27c7eef4dd4bf4c33" Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.373019 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b13934ad964a9b61c115770887bcefe16bbb075f526931fb0fa8c919bf1f20b6" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.375432 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b13934ad964a9b61c115770887bcefe16bbb075f526931fb0fa8c919bf1f20b6" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.376576 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b13934ad964a9b61c115770887bcefe16bbb075f526931fb0fa8c919bf1f20b6" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.376616 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="4b9bc510-a878-4e06-8db9-fd6209039c75" containerName="galera" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.381008 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be5e935-0d64-4fed-a00a-bd0cb5891e75-config\") pod \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.381075 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2be5e935-0d64-4fed-a00a-bd0cb5891e75-scripts\") pod \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.381127 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-combined-ca-bundle\") pod \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.381224 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-ovsdbserver-sb-tls-certs\") pod \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.381449 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2be5e935-0d64-4fed-a00a-bd0cb5891e75-ovsdb-rundir\") pod \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.381486 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-metrics-certs-tls-certs\") pod \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.381614 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64cj7\" (UniqueName: \"kubernetes.io/projected/2be5e935-0d64-4fed-a00a-bd0cb5891e75-kube-api-access-64cj7\") pod \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.381747 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\" (UID: \"2be5e935-0d64-4fed-a00a-bd0cb5891e75\") " Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.384582 4766 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2be5e935-0d64-4fed-a00a-bd0cb5891e75-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "2be5e935-0d64-4fed-a00a-bd0cb5891e75" (UID: "2be5e935-0d64-4fed-a00a-bd0cb5891e75"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.384949 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be5e935-0d64-4fed-a00a-bd0cb5891e75-config" (OuterVolumeSpecName: "config") pod "2be5e935-0d64-4fed-a00a-bd0cb5891e75" (UID: "2be5e935-0d64-4fed-a00a-bd0cb5891e75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.384670 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be5e935-0d64-4fed-a00a-bd0cb5891e75-scripts" (OuterVolumeSpecName: "scripts") pod "2be5e935-0d64-4fed-a00a-bd0cb5891e75" (UID: "2be5e935-0d64-4fed-a00a-bd0cb5891e75"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.409667 4766 scope.go:117] "RemoveContainer" containerID="2facd81316d0bb433c5134538dde848cf53f8502a7244939d8aa6ec3bc9bf2db" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.414228 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "6b90dab1-a183-4adc-b415-b67bd0d782f7" (UID: "6b90dab1-a183-4adc-b415-b67bd0d782f7"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.415169 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "2be5e935-0d64-4fed-a00a-bd0cb5891e75" (UID: "2be5e935-0d64-4fed-a00a-bd0cb5891e75"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.415727 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2be5e935-0d64-4fed-a00a-bd0cb5891e75-kube-api-access-64cj7" (OuterVolumeSpecName: "kube-api-access-64cj7") pod "2be5e935-0d64-4fed-a00a-bd0cb5891e75" (UID: "2be5e935-0d64-4fed-a00a-bd0cb5891e75"). InnerVolumeSpecName "kube-api-access-64cj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.456575 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "6b90dab1-a183-4adc-b415-b67bd0d782f7" (UID: "6b90dab1-a183-4adc-b415-b67bd0d782f7"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.469795 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2be5e935-0d64-4fed-a00a-bd0cb5891e75" (UID: "2be5e935-0d64-4fed-a00a-bd0cb5891e75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.485108 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64cj7\" (UniqueName: \"kubernetes.io/projected/2be5e935-0d64-4fed-a00a-bd0cb5891e75-kube-api-access-64cj7\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.485131 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.485140 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b90dab1-a183-4adc-b415-b67bd0d782f7-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.485165 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.485174 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be5e935-0d64-4fed-a00a-bd0cb5891e75-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.495745 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2be5e935-0d64-4fed-a00a-bd0cb5891e75-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.495772 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.495782 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2be5e935-0d64-4fed-a00a-bd0cb5891e75-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.524578 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.549691 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "2be5e935-0d64-4fed-a00a-bd0cb5891e75" (UID: "2be5e935-0d64-4fed-a00a-bd0cb5891e75"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.586041 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "2be5e935-0d64-4fed-a00a-bd0cb5891e75" (UID: "2be5e935-0d64-4fed-a00a-bd0cb5891e75"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.597635 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.597669 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.597683 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2be5e935-0d64-4fed-a00a-bd0cb5891e75-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.653893 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.655571 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.655608 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.656790 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.656866 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8wzw9" 
podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovsdb-server" Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.659094 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.666553 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.666642 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8wzw9" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovs-vswitchd" Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.700793 4766 secret.go:188] Couldn't get secret openstack/barbican-config-data: secret "barbican-config-data" not found Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.700883 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data podName:d9339929-4331-4cd9-89bc-8350ef2f55f5 nodeName:}" failed. No retries permitted until 2025-10-02 11:18:34.700858557 +0000 UTC m=+1629.643729551 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data") pod "barbican-keystone-listener-664d98ccd8-hh5xk" (UID: "d9339929-4331-4cd9-89bc-8350ef2f55f5") : secret "barbican-config-data" not found Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.700800 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.701703 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-config-data podName:874d062e-d2f8-462c-95b3-8f630b7120af nodeName:}" failed. No retries permitted until 2025-10-02 11:18:34.701680953 +0000 UTC m=+1629.644551897 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-config-data") pod "rabbitmq-cell1-server-0" (UID: "874d062e-d2f8-462c-95b3-8f630b7120af") : configmap "rabbitmq-cell1-config-data" not found Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.750623 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicance01-account-delete-jrx2r" event={"ID":"1a739206-d877-4212-9242-47a59c440b40","Type":"ContainerStarted","Data":"94e39b689fdb16a6f178e626a2e36dbe55d34173f5cc36767671775dca5198f8"} Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.793460 4766 scope.go:117] "RemoveContainer" containerID="a138202f6e7dfaa68f96f9c8f7a8329c46aee3fb87a8289a0c8e738c5d41167a" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.794294 4766 generic.go:334] "Generic (PLEG): container finished" podID="d9339929-4331-4cd9-89bc-8350ef2f55f5" containerID="dc1353dac7e3a318b9bf88253e7621b0e0c300fbb3ed030d2c367fb3cffe1ca0" exitCode=143 Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.794388 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" event={"ID":"d9339929-4331-4cd9-89bc-8350ef2f55f5","Type":"ContainerDied","Data":"dc1353dac7e3a318b9bf88253e7621b0e0c300fbb3ed030d2c367fb3cffe1ca0"} Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.801737 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell16336-account-delete-8q58s" event={"ID":"3ae83487-5f24-4934-aba5-9ee2ca6ca657","Type":"ContainerStarted","Data":"15363bcff6154d63ab66c0ff475bf73a8f2f486bb90162a1c1f06e57540daa32"} Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.817793 4766 scope.go:117] "RemoveContainer" containerID="ff40e39793793465f75fb22c204001262aaca34bc6d55d604cabba72ae0c9eb7" Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.828889 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff40e39793793465f75fb22c204001262aaca34bc6d55d604cabba72ae0c9eb7\": container with ID starting with ff40e39793793465f75fb22c204001262aaca34bc6d55d604cabba72ae0c9eb7 not found: ID does not exist" containerID="ff40e39793793465f75fb22c204001262aaca34bc6d55d604cabba72ae0c9eb7" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.828937 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff40e39793793465f75fb22c204001262aaca34bc6d55d604cabba72ae0c9eb7"} err="failed to get container status \"ff40e39793793465f75fb22c204001262aaca34bc6d55d604cabba72ae0c9eb7\": rpc error: code = NotFound desc = could not find container \"ff40e39793793465f75fb22c204001262aaca34bc6d55d604cabba72ae0c9eb7\": container with ID starting with ff40e39793793465f75fb22c204001262aaca34bc6d55d604cabba72ae0c9eb7 not found: ID does not exist" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.828968 4766 scope.go:117] "RemoveContainer" containerID="3f59b786bb34a976dbad318bdbfed8db8ed47518b0fa788de2b6cb253622cf0d" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.840288 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2be5e935-0d64-4fed-a00a-bd0cb5891e75","Type":"ContainerDied","Data":"74e9d0a9036f8f32f58b39c5d0babef8cb96a43502965fde9b8fa4d0219cd980"} Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.840366 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.858569 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance0d4d-account-delete-7cb92" event={"ID":"3fd6760c-af87-4e2a-adcd-5fe3ca636fef","Type":"ContainerStarted","Data":"88f9df4a8e65a4cc84c381dac9166d9d593a856fad49af2d2c681017092969b2"} Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.883338 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.890203 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.897332 4766 scope.go:117] "RemoveContainer" containerID="bf6025053b21cd7f8ba7a1e2074432dfda875d41206fe08b26d3183f068fdeaf" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.922730 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerID="2e00b32e05f94d6db2ce326a131639f65e69eaa8e091e4ac19058f4bc69304fa" exitCode=0 Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.922965 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerDied","Data":"2e00b32e05f94d6db2ce326a131639f65e69eaa8e091e4ac19058f4bc69304fa"} Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.923209 4766 scope.go:117] "RemoveContainer" containerID="763fc40370b8c637b94e717b3d4022b399a9064357843aa69085ce7652a30954" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.926625 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"6b90dab1-a183-4adc-b415-b67bd0d782f7","Type":"ContainerDied","Data":"9616b5ba569769404f649650902f1ec701ec8ad0ab367da6642ad863f86a8592"} Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.926656 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.931378 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder891f-account-delete-2hm8h" event={"ID":"3fac2bf1-fb0f-4031-bfc8-34090cc90c8a","Type":"ContainerStarted","Data":"0b6b15f103a052f81b9c293f86b9646a141dc83affc6a945de5fce8dbcb53ba8"} Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.933395 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapibccc-account-delete-n7pf6" event={"ID":"6ea7203d-5727-485f-8a6a-5bde96d05078","Type":"ContainerStarted","Data":"46bf389b5fc88b6dbfa295f51f35ca0b5b4f0b242188fa16176efa6b752f69c6"} Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.939880 4766 generic.go:334] "Generic (PLEG): container finished" podID="123b65f7-a8e8-434b-baf1-e9b0d3a985d9" containerID="805fa4defeec778eb8f810a670bb90a7f802c044adf651111138c5f240d5a4ad" exitCode=0 Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.939960 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"123b65f7-a8e8-434b-baf1-e9b0d3a985d9","Type":"ContainerDied","Data":"805fa4defeec778eb8f810a670bb90a7f802c044adf651111138c5f240d5a4ad"} Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.944734 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementd1c9-account-delete-j6zwl" event={"ID":"52bdcfda-75b3-450b-9db4-1a443be18fa3","Type":"ContainerStarted","Data":"5ce14e70f72aa571a01db0570e56eaaa769ab4855145e42bd80cfba66de691f5"} Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.944779 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementd1c9-account-delete-j6zwl" event={"ID":"52bdcfda-75b3-450b-9db4-1a443be18fa3","Type":"ContainerStarted","Data":"27516caab2b7be54b72e47aeca7afacf37503a3cafd4099db0546c54520ba38e"} Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.944910 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placementd1c9-account-delete-j6zwl" podUID="52bdcfda-75b3-450b-9db4-1a443be18fa3" containerName="mariadb-account-delete" containerID="cri-o://5ce14e70f72aa571a01db0570e56eaaa769ab4855145e42bd80cfba66de691f5" gracePeriod=30 Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.958342 4766 generic.go:334] "Generic (PLEG): container finished" podID="94c8c5ed-b069-4112-ae71-d9071bc15ff2" containerID="1739e7ba00b0bf9bb771dca9d289d586ed12ce58d682ed8f6ac6cf039138a291" exitCode=143 Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.958394 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75997cdf8b-nnlzj" event={"ID":"94c8c5ed-b069-4112-ae71-d9071bc15ff2","Type":"ContainerDied","Data":"1739e7ba00b0bf9bb771dca9d289d586ed12ce58d682ed8f6ac6cf039138a291"} Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.971488 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placementd1c9-account-delete-j6zwl" podStartSLOduration=4.971463601 podStartE2EDuration="4.971463601s" podCreationTimestamp="2025-10-02 11:18:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 11:18:30.961187342 +0000 UTC m=+1625.904058306" watchObservedRunningTime="2025-10-02 11:18:30.971463601 +0000 UTC m=+1625.914334545" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.979999 4766 scope.go:117] "RemoveContainer" 
containerID="3f59b786bb34a976dbad318bdbfed8db8ed47518b0fa788de2b6cb253622cf0d" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.980110 4766 generic.go:334] "Generic (PLEG): container finished" podID="8d43eab0-4595-42fc-8489-38792e0c6e19" containerID="f1a0b9913341b1fffedbb9296ca3e24f9abbcd44a25a0528cebe5da010d355e1" exitCode=143 Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.980158 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f5f68d797-k4qqv" event={"ID":"8d43eab0-4595-42fc-8489-38792e0c6e19","Type":"ContainerDied","Data":"f1a0b9913341b1fffedbb9296ca3e24f9abbcd44a25a0528cebe5da010d355e1"} Oct 02 11:18:30 crc kubenswrapper[4766]: E1002 11:18:30.981135 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f59b786bb34a976dbad318bdbfed8db8ed47518b0fa788de2b6cb253622cf0d\": container with ID starting with 3f59b786bb34a976dbad318bdbfed8db8ed47518b0fa788de2b6cb253622cf0d not found: ID does not exist" containerID="3f59b786bb34a976dbad318bdbfed8db8ed47518b0fa788de2b6cb253622cf0d" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.981192 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f59b786bb34a976dbad318bdbfed8db8ed47518b0fa788de2b6cb253622cf0d"} err="failed to get container status \"3f59b786bb34a976dbad318bdbfed8db8ed47518b0fa788de2b6cb253622cf0d\": rpc error: code = NotFound desc = could not find container \"3f59b786bb34a976dbad318bdbfed8db8ed47518b0fa788de2b6cb253622cf0d\": container with ID starting with 3f59b786bb34a976dbad318bdbfed8db8ed47518b0fa788de2b6cb253622cf0d not found: ID does not exist" Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.988983 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.996036 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.996094 4766 generic.go:334] "Generic (PLEG): container finished" podID="44893df1-77c5-494c-bae0-253447abc8f4" containerID="ff59b3f5c87557d99a67228679782023c256d54647b29c27032c95dfbc2bd77b" exitCode=0 Oct 02 11:18:30 crc kubenswrapper[4766]: I1002 11:18:30.996105 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67bd9fd99f-qbp28" event={"ID":"44893df1-77c5-494c-bae0-253447abc8f4","Type":"ContainerDied","Data":"ff59b3f5c87557d99a67228679782023c256d54647b29c27032c95dfbc2bd77b"} Oct 02 11:18:31 crc kubenswrapper[4766]: E1002 11:18:31.626485 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 02 11:18:31 crc kubenswrapper[4766]: E1002 11:18:31.626800 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data podName:1282b506-728d-4c6f-aa9c-3d3c1f826b71 nodeName:}" failed. No retries permitted until 2025-10-02 11:18:35.626782577 +0000 UTC m=+1630.569653521 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data") pod "rabbitmq-server-0" (UID: "1282b506-728d-4c6f-aa9c-3d3c1f826b71") : configmap "rabbitmq-config-data" not found Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.650839 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.727390 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-config-data\") pod \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.727559 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-vencrypt-tls-certs\") pod \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.728312 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-combined-ca-bundle\") pod \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.728403 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8rzm\" (UniqueName: \"kubernetes.io/projected/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-kube-api-access-w8rzm\") pod \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.728446 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-nova-novncproxy-tls-certs\") pod \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\" (UID: \"123b65f7-a8e8-434b-baf1-e9b0d3a985d9\") " Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.738709 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-kube-api-access-w8rzm" (OuterVolumeSpecName: "kube-api-access-w8rzm") pod "123b65f7-a8e8-434b-baf1-e9b0d3a985d9" (UID: "123b65f7-a8e8-434b-baf1-e9b0d3a985d9"). InnerVolumeSpecName "kube-api-access-w8rzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.774428 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.804005 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "123b65f7-a8e8-434b-baf1-e9b0d3a985d9" (UID: "123b65f7-a8e8-434b-baf1-e9b0d3a985d9"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.820089 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementd1c9-account-delete-j6zwl" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.829610 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44893df1-77c5-494c-bae0-253447abc8f4-run-httpd\") pod \"44893df1-77c5-494c-bae0-253447abc8f4\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.829755 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-config-data\") pod \"44893df1-77c5-494c-bae0-253447abc8f4\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.829840 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44893df1-77c5-494c-bae0-253447abc8f4-log-httpd\") pod \"44893df1-77c5-494c-bae0-253447abc8f4\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.829870 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-internal-tls-certs\") pod \"44893df1-77c5-494c-bae0-253447abc8f4\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.829888 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-public-tls-certs\") pod \"44893df1-77c5-494c-bae0-253447abc8f4\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.829920 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/44893df1-77c5-494c-bae0-253447abc8f4-etc-swift\") pod \"44893df1-77c5-494c-bae0-253447abc8f4\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.829958 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-combined-ca-bundle\") pod \"44893df1-77c5-494c-bae0-253447abc8f4\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.829988 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stv6j\" (UniqueName: \"kubernetes.io/projected/44893df1-77c5-494c-bae0-253447abc8f4-kube-api-access-stv6j\") pod \"44893df1-77c5-494c-bae0-253447abc8f4\" (UID: \"44893df1-77c5-494c-bae0-253447abc8f4\") " Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.829991 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44893df1-77c5-494c-bae0-253447abc8f4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "44893df1-77c5-494c-bae0-253447abc8f4" (UID: "44893df1-77c5-494c-bae0-253447abc8f4"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.830325 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44893df1-77c5-494c-bae0-253447abc8f4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "44893df1-77c5-494c-bae0-253447abc8f4" (UID: "44893df1-77c5-494c-bae0-253447abc8f4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.830747 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44893df1-77c5-494c-bae0-253447abc8f4-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.830800 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8rzm\" (UniqueName: \"kubernetes.io/projected/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-kube-api-access-w8rzm\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.830817 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/44893df1-77c5-494c-bae0-253447abc8f4-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.831331 4766 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.858376 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.858753 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b5ac374-df46-4a36-947d-de07af25426c" containerName="ceilometer-central-agent" containerID="cri-o://e7e7c6bcc90173226bb27fb9e8539546111c6ddeb92e1200c4fa61b8a055e508" gracePeriod=30 Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.858867 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b5ac374-df46-4a36-947d-de07af25426c" containerName="proxy-httpd" containerID="cri-o://56bd28360a526e730981b495f3d32920b04b2ee9b375daef8362b798d8067eaf" gracePeriod=30 Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.858901 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b5ac374-df46-4a36-947d-de07af25426c" containerName="sg-core" containerID="cri-o://76a65e0801ebae9343bc82e73244baaf4bf39324be0c967cb15033840367ed33" gracePeriod=30 Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.858930 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7b5ac374-df46-4a36-947d-de07af25426c" containerName="ceilometer-notification-agent" containerID="cri-o://6384876206ceb62e698ffec3b4534e9b368119d727684623f371cd380375cf5c" gracePeriod=30 Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.885967 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44893df1-77c5-494c-bae0-253447abc8f4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "44893df1-77c5-494c-bae0-253447abc8f4" (UID: "44893df1-77c5-494c-bae0-253447abc8f4"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.896144 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44893df1-77c5-494c-bae0-253447abc8f4-kube-api-access-stv6j" (OuterVolumeSpecName: "kube-api-access-stv6j") pod "44893df1-77c5-494c-bae0-253447abc8f4" (UID: "44893df1-77c5-494c-bae0-253447abc8f4"). InnerVolumeSpecName "kube-api-access-stv6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.904040 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-config-data" (OuterVolumeSpecName: "config-data") pod "123b65f7-a8e8-434b-baf1-e9b0d3a985d9" (UID: "123b65f7-a8e8-434b-baf1-e9b0d3a985d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.904102 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "123b65f7-a8e8-434b-baf1-e9b0d3a985d9" (UID: "123b65f7-a8e8-434b-baf1-e9b0d3a985d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.933790 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g86jj\" (UniqueName: \"kubernetes.io/projected/52bdcfda-75b3-450b-9db4-1a443be18fa3-kube-api-access-g86jj\") pod \"52bdcfda-75b3-450b-9db4-1a443be18fa3\" (UID: \"52bdcfda-75b3-450b-9db4-1a443be18fa3\") " Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.939615 4766 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/44893df1-77c5-494c-bae0-253447abc8f4-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.939649 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stv6j\" (UniqueName: \"kubernetes.io/projected/44893df1-77c5-494c-bae0-253447abc8f4-kube-api-access-stv6j\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.939666 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.939677 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.951390 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52bdcfda-75b3-450b-9db4-1a443be18fa3-kube-api-access-g86jj" (OuterVolumeSpecName: "kube-api-access-g86jj") pod "52bdcfda-75b3-450b-9db4-1a443be18fa3" (UID: "52bdcfda-75b3-450b-9db4-1a443be18fa3"). InnerVolumeSpecName "kube-api-access-g86jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.955490 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.956029 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0be46caa-0351-4f60-b16b-a258b9874a6f" path="/var/lib/kubelet/pods/0be46caa-0351-4f60-b16b-a258b9874a6f/volumes" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.958057 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2be5e935-0d64-4fed-a00a-bd0cb5891e75" path="/var/lib/kubelet/pods/2be5e935-0d64-4fed-a00a-bd0cb5891e75/volumes" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.966835 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b90dab1-a183-4adc-b415-b67bd0d782f7" path="/var/lib/kubelet/pods/6b90dab1-a183-4adc-b415-b67bd0d782f7/volumes" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.967863 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95842554-1651-4c34-b934-d4eb21c6c52d" path="/var/lib/kubelet/pods/95842554-1651-4c34-b934-d4eb21c6c52d/volumes" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.971465 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.971892 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9860354f-7494-4b02-bca3-adc731683f7f" path="/var/lib/kubelet/pods/9860354f-7494-4b02-bca3-adc731683f7f/volumes" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.972683 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fdf37c9-9a32-4103-8418-198d45d14415" path="/var/lib/kubelet/pods/9fdf37c9-9a32-4103-8418-198d45d14415/volumes" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.978438 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "44893df1-77c5-494c-bae0-253447abc8f4" (UID: "44893df1-77c5-494c-bae0-253447abc8f4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:31 crc kubenswrapper[4766]: I1002 11:18:31.979884 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "123b65f7-a8e8-434b-baf1-e9b0d3a985d9" (UID: "123b65f7-a8e8-434b-baf1-e9b0d3a985d9"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.006286 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.006782 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5a8ba140-6dc8-4023-9789-7f288b85159b" containerName="kube-state-metrics" containerID="cri-o://24dcbf81a6048e4223093aaf313d135dc7e342e1aad2567595ccd81590fd91ce" gracePeriod=30 Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.035767 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "44893df1-77c5-494c-bae0-253447abc8f4" (UID: "44893df1-77c5-494c-bae0-253447abc8f4"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.042734 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-config-data" (OuterVolumeSpecName: "config-data") pod "44893df1-77c5-494c-bae0-253447abc8f4" (UID: "44893df1-77c5-494c-bae0-253447abc8f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043203 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-config-data-custom\") pod \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043246 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4b9bc510-a878-4e06-8db9-fd6209039c75-config-data-generated\") pod \"4b9bc510-a878-4e06-8db9-fd6209039c75\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043304 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-logs\") pod \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043351 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bglnx\" (UniqueName: \"kubernetes.io/projected/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-kube-api-access-bglnx\") pod \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043374 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-public-tls-certs\") pod \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043405 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"4b9bc510-a878-4e06-8db9-fd6209039c75\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043430 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-combined-ca-bundle\") pod \"4b9bc510-a878-4e06-8db9-fd6209039c75\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043446 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-combined-ca-bundle\") pod \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043496 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-config-data-default\") pod \"4b9bc510-a878-4e06-8db9-fd6209039c75\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043529 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-galera-tls-certs\") pod \"4b9bc510-a878-4e06-8db9-fd6209039c75\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043570 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4cbv\" (UniqueName: \"kubernetes.io/projected/4b9bc510-a878-4e06-8db9-fd6209039c75-kube-api-access-f4cbv\") pod \"4b9bc510-a878-4e06-8db9-fd6209039c75\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043595 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-etc-machine-id\") pod \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043635 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-internal-tls-certs\") pod \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043663 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-kolla-config\") pod \"4b9bc510-a878-4e06-8db9-fd6209039c75\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043693 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-secrets\") pod \"4b9bc510-a878-4e06-8db9-fd6209039c75\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043717 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-operator-scripts\") pod \"4b9bc510-a878-4e06-8db9-fd6209039c75\" (UID: \"4b9bc510-a878-4e06-8db9-fd6209039c75\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043745 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-scripts\") pod \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.043762 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-config-data\") pod \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\" (UID: \"d4d0079a-03e3-4e5f-81a2-81f5bceb795c\") " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.048730 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-logs" 
(OuterVolumeSpecName: "logs") pod "d4d0079a-03e3-4e5f-81a2-81f5bceb795c" (UID: "d4d0079a-03e3-4e5f-81a2-81f5bceb795c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.049158 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.049175 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.049185 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g86jj\" (UniqueName: \"kubernetes.io/projected/52bdcfda-75b3-450b-9db4-1a443be18fa3-kube-api-access-g86jj\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.049195 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.049204 4766 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/123b65f7-a8e8-434b-baf1-e9b0d3a985d9-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.049213 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.050555 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "4b9bc510-a878-4e06-8db9-fd6209039c75" (UID: "4b9bc510-a878-4e06-8db9-fd6209039c75"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.054821 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d4d0079a-03e3-4e5f-81a2-81f5bceb795c" (UID: "d4d0079a-03e3-4e5f-81a2-81f5bceb795c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.055169 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "4b9bc510-a878-4e06-8db9-fd6209039c75" (UID: "4b9bc510-a878-4e06-8db9-fd6209039c75"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.055383 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4b9bc510-a878-4e06-8db9-fd6209039c75" (UID: "4b9bc510-a878-4e06-8db9-fd6209039c75"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.057842 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b9bc510-a878-4e06-8db9-fd6209039c75-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "4b9bc510-a878-4e06-8db9-fd6209039c75" (UID: "4b9bc510-a878-4e06-8db9-fd6209039c75"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.059537 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-secrets" (OuterVolumeSpecName: "secrets") pod "4b9bc510-a878-4e06-8db9-fd6209039c75" (UID: "4b9bc510-a878-4e06-8db9-fd6209039c75"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.059678 4766 generic.go:334] "Generic (PLEG): container finished" podID="1a739206-d877-4212-9242-47a59c440b40" containerID="c160368cae86319957eb982a337f43c9e94cfdb7fb7fcd90fcf28831821e07b1" exitCode=0 Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.059806 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicance01-account-delete-jrx2r" event={"ID":"1a739206-d877-4212-9242-47a59c440b40","Type":"ContainerDied","Data":"c160368cae86319957eb982a337f43c9e94cfdb7fb7fcd90fcf28831821e07b1"} Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.065983 4766 generic.go:334] "Generic (PLEG): container finished" podID="4b9bc510-a878-4e06-8db9-fd6209039c75" containerID="b13934ad964a9b61c115770887bcefe16bbb075f526931fb0fa8c919bf1f20b6" exitCode=0 Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.066110 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4b9bc510-a878-4e06-8db9-fd6209039c75","Type":"ContainerDied","Data":"b13934ad964a9b61c115770887bcefe16bbb075f526931fb0fa8c919bf1f20b6"} Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.066149 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4b9bc510-a878-4e06-8db9-fd6209039c75","Type":"ContainerDied","Data":"6b77f3a526b5379881fc70c788250bd6ccaa071afb0137dd9dd28685b0ef78b0"} Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.066172 4766 scope.go:117] "RemoveContainer" containerID="b13934ad964a9b61c115770887bcefe16bbb075f526931fb0fa8c919bf1f20b6" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.066175 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.088898 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d4d0079a-03e3-4e5f-81a2-81f5bceb795c" (UID: "d4d0079a-03e3-4e5f-81a2-81f5bceb795c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.088967 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-kube-api-access-bglnx" (OuterVolumeSpecName: "kube-api-access-bglnx") pod "d4d0079a-03e3-4e5f-81a2-81f5bceb795c" (UID: "d4d0079a-03e3-4e5f-81a2-81f5bceb795c"). 
InnerVolumeSpecName "kube-api-access-bglnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.100194 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-scripts" (OuterVolumeSpecName: "scripts") pod "d4d0079a-03e3-4e5f-81a2-81f5bceb795c" (UID: "d4d0079a-03e3-4e5f-81a2-81f5bceb795c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.100647 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b9bc510-a878-4e06-8db9-fd6209039c75-kube-api-access-f4cbv" (OuterVolumeSpecName: "kube-api-access-f4cbv") pod "4b9bc510-a878-4e06-8db9-fd6209039c75" (UID: "4b9bc510-a878-4e06-8db9-fd6209039c75"). InnerVolumeSpecName "kube-api-access-f4cbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.122166 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "mysql-db") pod "4b9bc510-a878-4e06-8db9-fd6209039c75" (UID: "4b9bc510-a878-4e06-8db9-fd6209039c75"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.131658 4766 generic.go:334] "Generic (PLEG): container finished" podID="7b5ac374-df46-4a36-947d-de07af25426c" containerID="76a65e0801ebae9343bc82e73244baaf4bf39324be0c967cb15033840367ed33" exitCode=2 Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.131742 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b5ac374-df46-4a36-947d-de07af25426c","Type":"ContainerDied","Data":"76a65e0801ebae9343bc82e73244baaf4bf39324be0c967cb15033840367ed33"} Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.157166 4766 generic.go:334] "Generic (PLEG): container finished" podID="d4d0079a-03e3-4e5f-81a2-81f5bceb795c" containerID="a12103d025e03ad958180a996161105d2c14a35f5ae1e7d1aabe48aab8387391" exitCode=0 Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.157263 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4d0079a-03e3-4e5f-81a2-81f5bceb795c","Type":"ContainerDied","Data":"a12103d025e03ad958180a996161105d2c14a35f5ae1e7d1aabe48aab8387391"} Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.157292 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4d0079a-03e3-4e5f-81a2-81f5bceb795c","Type":"ContainerDied","Data":"60cef1d4fbe94825d19e68a8010ef54485ba97004cddf6cefcf03ef0eea9192b"} Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.157380 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.158206 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.171:9292/healthcheck\": read tcp 10.217.0.2:47390->10.217.0.171:9292: read: connection reset by peer" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.158572 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.171:9292/healthcheck\": read tcp 10.217.0.2:47406->10.217.0.171:9292: read: connection reset by peer" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.159058 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="c3384223-2ad0-4593-976c-54c2d3cce52e" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.169:9292/healthcheck\": read tcp 10.217.0.2:48114->10.217.0.169:9292: read: connection reset by peer" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.159179 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="c3384223-2ad0-4593-976c-54c2d3cce52e" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.169:9292/healthcheck\": read tcp 10.217.0.2:48100->10.217.0.169:9292: read: connection reset by peer" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.164095 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.164131 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.164140 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.164149 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4b9bc510-a878-4e06-8db9-fd6209039c75-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.164159 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bglnx\" (UniqueName: \"kubernetes.io/projected/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-kube-api-access-bglnx\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.164179 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.164190 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc 
kubenswrapper[4766]: I1002 11:18:32.164198 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4cbv\" (UniqueName: \"kubernetes.io/projected/4b9bc510-a878-4e06-8db9-fd6209039c75-kube-api-access-f4cbv\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.164206 4766 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.164213 4766 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4b9bc510-a878-4e06-8db9-fd6209039c75-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.164223 4766 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.212735 4766 scope.go:117] "RemoveContainer" containerID="3b287664b7e8befe1a7a58d7e649bb1c340794784b406077aa1021e801ae5c34" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.213153 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"123b65f7-a8e8-434b-baf1-e9b0d3a985d9","Type":"ContainerDied","Data":"0f940b03ab75c092d0d62f5f74a4f6f7d16c0c06588333446c8859d2097afec1"} Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.213301 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.287778 4766 generic.go:334] "Generic (PLEG): container finished" podID="3fd6760c-af87-4e2a-adcd-5fe3ca636fef" containerID="0e31d1e220ac761acdc5da9aa0f9af96e22aabf5bfdb6457f1154c8742a0295b" exitCode=0 Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.287878 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance0d4d-account-delete-7cb92" event={"ID":"3fd6760c-af87-4e2a-adcd-5fe3ca636fef","Type":"ContainerDied","Data":"0e31d1e220ac761acdc5da9aa0f9af96e22aabf5bfdb6457f1154c8742a0295b"} Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.318536 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-6xbv2"] Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.340229 4766 generic.go:334] "Generic (PLEG): container finished" podID="6ea7203d-5727-485f-8a6a-5bde96d05078" containerID="6135436a83b835e03c70b30a3e8fc15e4387e379664e3d6e6ac0c954d565a595" exitCode=0 Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.340382 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapibccc-account-delete-n7pf6" event={"ID":"6ea7203d-5727-485f-8a6a-5bde96d05078","Type":"ContainerDied","Data":"6135436a83b835e03c70b30a3e8fc15e4387e379664e3d6e6ac0c954d565a595"} Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.350211 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d4d0079a-03e3-4e5f-81a2-81f5bceb795c" (UID: "d4d0079a-03e3-4e5f-81a2-81f5bceb795c"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.353943 4766 generic.go:334] "Generic (PLEG): container finished" podID="3fac2bf1-fb0f-4031-bfc8-34090cc90c8a" containerID="cc2390f57c0079f23233bafd281f8a6e5358940b44e887d48c5d270690334bf0" exitCode=0 Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.354079 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder891f-account-delete-2hm8h" event={"ID":"3fac2bf1-fb0f-4031-bfc8-34090cc90c8a","Type":"ContainerDied","Data":"cc2390f57c0079f23233bafd281f8a6e5358940b44e887d48c5d270690334bf0"} Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.359129 4766 generic.go:334] "Generic (PLEG): container finished" podID="3ae83487-5f24-4934-aba5-9ee2ca6ca657" containerID="c49ddbab603ee44d8b491bab6b3b0f05fc4a499c55527957a337c027a9b1320d" exitCode=1 Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.359289 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell16336-account-delete-8q58s" event={"ID":"3ae83487-5f24-4934-aba5-9ee2ca6ca657","Type":"ContainerDied","Data":"c49ddbab603ee44d8b491bab6b3b0f05fc4a499c55527957a337c027a9b1320d"} Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.372131 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44893df1-77c5-494c-bae0-253447abc8f4" (UID: "44893df1-77c5-494c-bae0-253447abc8f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.386120 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.386150 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44893df1-77c5-494c-bae0-253447abc8f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.387924 4766 generic.go:334] "Generic (PLEG): container finished" podID="52bdcfda-75b3-450b-9db4-1a443be18fa3" containerID="5ce14e70f72aa571a01db0570e56eaaa769ab4855145e42bd80cfba66de691f5" exitCode=0 Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.388043 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementd1c9-account-delete-j6zwl" event={"ID":"52bdcfda-75b3-450b-9db4-1a443be18fa3","Type":"ContainerDied","Data":"5ce14e70f72aa571a01db0570e56eaaa769ab4855145e42bd80cfba66de691f5"} Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.388087 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placementd1c9-account-delete-j6zwl" event={"ID":"52bdcfda-75b3-450b-9db4-1a443be18fa3","Type":"ContainerDied","Data":"27516caab2b7be54b72e47aeca7afacf37503a3cafd4099db0546c54520ba38e"} Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.388158 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placementd1c9-account-delete-j6zwl" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.388987 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.389156 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-6xbv2"] Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.393150 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="bbb13294-05c1-4a20-8265-5144efcd91cf" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:60546->10.217.0.203:8775: read: connection reset by peer" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.393329 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="bbb13294-05c1-4a20-8265-5144efcd91cf" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": read tcp 10.217.0.2:60552->10.217.0.203:8775: read: connection reset by peer" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.396203 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-config-data" (OuterVolumeSpecName: "config-data") pod "d4d0079a-03e3-4e5f-81a2-81f5bceb795c" (UID: "d4d0079a-03e3-4e5f-81a2-81f5bceb795c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.399064 4766 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/novacell16336-account-delete-8q58s" secret="" err="secret \"galera-openstack-cell1-dockercfg-q2pfv\" not found" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.399147 4766 scope.go:117] "RemoveContainer" containerID="c49ddbab603ee44d8b491bab6b3b0f05fc4a499c55527957a337c027a9b1320d" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.399824 4766 generic.go:334] "Generic (PLEG): container finished" podID="44893df1-77c5-494c-bae0-253447abc8f4" containerID="15c74c8e7a8896b2165a6a68dd5ee3e1b21f3fcb860feba3289044c93dbf1f19" exitCode=0 Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.399862 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67bd9fd99f-qbp28" event={"ID":"44893df1-77c5-494c-bae0-253447abc8f4","Type":"ContainerDied","Data":"15c74c8e7a8896b2165a6a68dd5ee3e1b21f3fcb860feba3289044c93dbf1f19"} Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.399888 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-67bd9fd99f-qbp28" event={"ID":"44893df1-77c5-494c-bae0-253447abc8f4","Type":"ContainerDied","Data":"a6e259f88b512ed89d5740fea0631a92320aab896a4a089d15e2abdb5846627e"} Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.399950 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-67bd9fd99f-qbp28" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.407371 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jzlb4"] Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.428042 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jzlb4"] Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.431592 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b9bc510-a878-4e06-8db9-fd6209039c75" (UID: "4b9bc510-a878-4e06-8db9-fd6209039c75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.448456 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-576797c867-n7r4b"] Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.448867 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-576797c867-n7r4b" podUID="07e3dfd6-c718-4304-9770-edbbfaca9cf4" containerName="keystone-api" containerID="cri-o://e6c3d2e041b5f5a0a93635dedbb2e7ad90fbe97c81c3d583742c3fd4c3beb5a3" gracePeriod=30 Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.467649 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.468945 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4d0079a-03e3-4e5f-81a2-81f5bceb795c" (UID: "d4d0079a-03e3-4e5f-81a2-81f5bceb795c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.479852 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "4b9bc510-a878-4e06-8db9-fd6209039c75" (UID: "4b9bc510-a878-4e06-8db9-fd6209039c75"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.488362 4766 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.488389 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.488398 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.488407 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b9bc510-a878-4e06-8db9-fd6209039c75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.488416 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.509575 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-6hj8b"] Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.533127 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-6hj8b"] Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.533962 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d4d0079a-03e3-4e5f-81a2-81f5bceb795c" (UID: "d4d0079a-03e3-4e5f-81a2-81f5bceb795c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.542740 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c257-account-create-k72jx"] Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.559576 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c257-account-create-k72jx"] Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.588148 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-qhzpn"] Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.591557 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-qhzpn"] Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.595929 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d0079a-03e3-4e5f-81a2-81f5bceb795c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.614561 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-6336-account-create-vcx9z"] Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.623596 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell16336-account-delete-8q58s"] Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.628104 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-6336-account-create-vcx9z"] Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.634000 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-75997cdf8b-nnlzj" podUID="94c8c5ed-b069-4112-ae71-d9071bc15ff2" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.148:9311/healthcheck\": read tcp 10.217.0.2:55380->10.217.0.148:9311: read: connection reset by peer" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.634048 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-75997cdf8b-nnlzj" podUID="94c8c5ed-b069-4112-ae71-d9071bc15ff2" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.148:9311/healthcheck\": read tcp 10.217.0.2:55382->10.217.0.148:9311: read: connection reset by peer" Oct 02 11:18:32 crc kubenswrapper[4766]: E1002 11:18:32.803074 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 02 11:18:32 crc kubenswrapper[4766]: E1002 11:18:32.808658 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 02 11:18:32 crc kubenswrapper[4766]: E1002 11:18:32.825660 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Oct 02 11:18:32 crc kubenswrapper[4766]: E1002 11:18:32.825742 4766 
prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="7498a37c-33a3-4a3a-9c72-64a0c533282c" containerName="ovn-northd" Oct 02 11:18:32 crc kubenswrapper[4766]: I1002 11:18:32.876106 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="e63ea453-c8bd-4128-a47e-7b0d740a6066" containerName="galera" containerID="cri-o://059c180c1ce2eff2818e38ff90b0da551c9f6423a5517609968f6e8b7d8825e4" gracePeriod=30 Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.395664 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicance01-account-delete-jrx2r" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.404782 4766 scope.go:117] "RemoveContainer" containerID="b13934ad964a9b61c115770887bcefe16bbb075f526931fb0fa8c919bf1f20b6" Oct 02 11:18:33 crc kubenswrapper[4766]: E1002 11:18:33.406958 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b13934ad964a9b61c115770887bcefe16bbb075f526931fb0fa8c919bf1f20b6\": container with ID starting with b13934ad964a9b61c115770887bcefe16bbb075f526931fb0fa8c919bf1f20b6 not found: ID does not exist" containerID="b13934ad964a9b61c115770887bcefe16bbb075f526931fb0fa8c919bf1f20b6" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.407003 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b13934ad964a9b61c115770887bcefe16bbb075f526931fb0fa8c919bf1f20b6"} err="failed to get container status \"b13934ad964a9b61c115770887bcefe16bbb075f526931fb0fa8c919bf1f20b6\": rpc error: code = NotFound desc = could not find container \"b13934ad964a9b61c115770887bcefe16bbb075f526931fb0fa8c919bf1f20b6\": container with ID starting with b13934ad964a9b61c115770887bcefe16bbb075f526931fb0fa8c919bf1f20b6 not found: ID does not exist" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.407029 4766 scope.go:117] "RemoveContainer" containerID="3b287664b7e8befe1a7a58d7e649bb1c340794784b406077aa1021e801ae5c34" Oct 02 11:18:33 crc kubenswrapper[4766]: E1002 11:18:33.407787 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b287664b7e8befe1a7a58d7e649bb1c340794784b406077aa1021e801ae5c34\": container with ID starting with 3b287664b7e8befe1a7a58d7e649bb1c340794784b406077aa1021e801ae5c34 not found: ID does not exist" containerID="3b287664b7e8befe1a7a58d7e649bb1c340794784b406077aa1021e801ae5c34" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.407837 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b287664b7e8befe1a7a58d7e649bb1c340794784b406077aa1021e801ae5c34"} err="failed to get container status \"3b287664b7e8befe1a7a58d7e649bb1c340794784b406077aa1021e801ae5c34\": rpc error: code = NotFound desc = could not find container \"3b287664b7e8befe1a7a58d7e649bb1c340794784b406077aa1021e801ae5c34\": container with ID starting with 3b287664b7e8befe1a7a58d7e649bb1c340794784b406077aa1021e801ae5c34 not found: ID does not exist" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.407866 4766 scope.go:117] "RemoveContainer" containerID="a12103d025e03ad958180a996161105d2c14a35f5ae1e7d1aabe48aab8387391" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.429236 4766 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmmgg\" (UniqueName: \"kubernetes.io/projected/1a739206-d877-4212-9242-47a59c440b40-kube-api-access-xmmgg\") pod \"1a739206-d877-4212-9242-47a59c440b40\" (UID: \"1a739206-d877-4212-9242-47a59c440b40\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.440281 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder891f-account-delete-2hm8h" event={"ID":"3fac2bf1-fb0f-4031-bfc8-34090cc90c8a","Type":"ContainerDied","Data":"0b6b15f103a052f81b9c293f86b9646a141dc83affc6a945de5fce8dbcb53ba8"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.440322 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b6b15f103a052f81b9c293f86b9646a141dc83affc6a945de5fce8dbcb53ba8" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.451634 4766 generic.go:334] "Generic (PLEG): container finished" podID="c3384223-2ad0-4593-976c-54c2d3cce52e" containerID="fb1c0454ba668b962552208833c616de4db07019cb885d1cc5bdc0cc294a91b5" exitCode=0 Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.451726 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3384223-2ad0-4593-976c-54c2d3cce52e","Type":"ContainerDied","Data":"fb1c0454ba668b962552208833c616de4db07019cb885d1cc5bdc0cc294a91b5"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.451757 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c3384223-2ad0-4593-976c-54c2d3cce52e","Type":"ContainerDied","Data":"2678f39c8d212c3ab839bf56d4dd3f9aa1f4b268f5e1cbe92ca7cb046f6d3082"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.451772 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2678f39c8d212c3ab839bf56d4dd3f9aa1f4b268f5e1cbe92ca7cb046f6d3082" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.453087 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a739206-d877-4212-9242-47a59c440b40-kube-api-access-xmmgg" (OuterVolumeSpecName: "kube-api-access-xmmgg") pod "1a739206-d877-4212-9242-47a59c440b40" (UID: "1a739206-d877-4212-9242-47a59c440b40"). InnerVolumeSpecName "kube-api-access-xmmgg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.456633 4766 generic.go:334] "Generic (PLEG): container finished" podID="7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" containerID="0b8dae08b6fef80dba408e5df15861c9a4ec087115f964d50b2c13d7ce34c9a4" exitCode=0 Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.456695 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84bf49766d-bbf2p" event={"ID":"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca","Type":"ContainerDied","Data":"0b8dae08b6fef80dba408e5df15861c9a4ec087115f964d50b2c13d7ce34c9a4"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.456722 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-84bf49766d-bbf2p" event={"ID":"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca","Type":"ContainerDied","Data":"fc14bb35873b38c4cb42c7f4f961d971819f9c86d26a74108f67b78c72373197"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.456733 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc14bb35873b38c4cb42c7f4f961d971819f9c86d26a74108f67b78c72373197" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.461689 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapibccc-account-delete-n7pf6" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.465460 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbicance01-account-delete-jrx2r" event={"ID":"1a739206-d877-4212-9242-47a59c440b40","Type":"ContainerDied","Data":"94e39b689fdb16a6f178e626a2e36dbe55d34173f5cc36767671775dca5198f8"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.465545 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbicance01-account-delete-jrx2r" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.466281 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placementd1c9-account-delete-j6zwl"] Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.476611 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance0d4d-account-delete-7cb92" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.476996 4766 generic.go:334] "Generic (PLEG): container finished" podID="d9339929-4331-4cd9-89bc-8350ef2f55f5" containerID="973c651619479183947224e4242097f161cf673b8da2935544f44e1700d072b9" exitCode=0 Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.477288 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" event={"ID":"d9339929-4331-4cd9-89bc-8350ef2f55f5","Type":"ContainerDied","Data":"973c651619479183947224e4242097f161cf673b8da2935544f44e1700d072b9"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.490874 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placementd1c9-account-delete-j6zwl"] Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.492133 4766 generic.go:334] "Generic (PLEG): container finished" podID="7b5ac374-df46-4a36-947d-de07af25426c" containerID="56bd28360a526e730981b495f3d32920b04b2ee9b375daef8362b798d8067eaf" exitCode=0 Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.492155 4766 generic.go:334] "Generic (PLEG): container finished" podID="7b5ac374-df46-4a36-947d-de07af25426c" containerID="e7e7c6bcc90173226bb27fb9e8539546111c6ddeb92e1200c4fa61b8a055e508" exitCode=0 Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.492197 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b5ac374-df46-4a36-947d-de07af25426c","Type":"ContainerDied","Data":"56bd28360a526e730981b495f3d32920b04b2ee9b375daef8362b798d8067eaf"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.492217 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b5ac374-df46-4a36-947d-de07af25426c","Type":"ContainerDied","Data":"e7e7c6bcc90173226bb27fb9e8539546111c6ddeb92e1200c4fa61b8a055e508"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.495470 4766 generic.go:334] "Generic (PLEG): container finished" podID="5a8ba140-6dc8-4023-9789-7f288b85159b" containerID="24dcbf81a6048e4223093aaf313d135dc7e342e1aad2567595ccd81590fd91ce" exitCode=2 Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.495586 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5a8ba140-6dc8-4023-9789-7f288b85159b","Type":"ContainerDied","Data":"24dcbf81a6048e4223093aaf313d135dc7e342e1aad2567595ccd81590fd91ce"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.495618 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5a8ba140-6dc8-4023-9789-7f288b85159b","Type":"ContainerDied","Data":"bae717950829c29120d95be44d719c6349367f0c2fd854a2a1a7e62843f94457"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.495632 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bae717950829c29120d95be44d719c6349367f0c2fd854a2a1a7e62843f94457" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.498652 4766 scope.go:117] "RemoveContainer" containerID="11c17e3e9a13cd8d34cb7338544e7b564639e6d7cc5158ff63319715efa5d8b5" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.501817 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.503492 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.510767 4766 generic.go:334] "Generic (PLEG): container finished" podID="bbb13294-05c1-4a20-8265-5144efcd91cf" containerID="f6cd0095fa2a61271a1c6b2812af96732964a92ed3e96a78c81919c9ef11e724" exitCode=0 Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.510867 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bbb13294-05c1-4a20-8265-5144efcd91cf","Type":"ContainerDied","Data":"f6cd0095fa2a61271a1c6b2812af96732964a92ed3e96a78c81919c9ef11e724"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.510902 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bbb13294-05c1-4a20-8265-5144efcd91cf","Type":"ContainerDied","Data":"fcfd252f8c9de1f953452d449be9ba00a0d24514484b0499918450f28b305d8f"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.510913 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcfd252f8c9de1f953452d449be9ba00a0d24514484b0499918450f28b305d8f" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.512895 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder891f-account-delete-2hm8h" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.513034 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.519734 4766 generic.go:334] "Generic (PLEG): container finished" podID="94c8c5ed-b069-4112-ae71-d9071bc15ff2" containerID="6cd538b1cdf3993f6cd959aebdda72d9055a7730e3cb15b1a555f7da8b9b1353" exitCode=0 Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.519816 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75997cdf8b-nnlzj" event={"ID":"94c8c5ed-b069-4112-ae71-d9071bc15ff2","Type":"ContainerDied","Data":"6cd538b1cdf3993f6cd959aebdda72d9055a7730e3cb15b1a555f7da8b9b1353"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.519843 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-75997cdf8b-nnlzj" event={"ID":"94c8c5ed-b069-4112-ae71-d9071bc15ff2","Type":"ContainerDied","Data":"bce8cff20a12b2c91f70e55f9d1b924210a91e2f479f037c37aad32da97c53ba"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.519852 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bce8cff20a12b2c91f70e55f9d1b924210a91e2f479f037c37aad32da97c53ba" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.523596 4766 generic.go:334] "Generic (PLEG): container finished" podID="8d43eab0-4595-42fc-8489-38792e0c6e19" containerID="433ae393df6f772a4b0964a7a633bfcdca8d7d78296edfa4ca875b807cacbd06" exitCode=0 Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.523650 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f5f68d797-k4qqv" event={"ID":"8d43eab0-4595-42fc-8489-38792e0c6e19","Type":"ContainerDied","Data":"433ae393df6f772a4b0964a7a633bfcdca8d7d78296edfa4ca875b807cacbd06"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.527771 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.528208 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.529678 4766 generic.go:334] "Generic (PLEG): container finished" podID="0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" containerID="4a4f512030ce39ec49c91ab00c8b625423cbf01429bd4a06c2d5b12c2bf0ca55" exitCode=0 Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.529760 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598","Type":"ContainerDied","Data":"4a4f512030ce39ec49c91ab00c8b625423cbf01429bd4a06c2d5b12c2bf0ca55"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.530928 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kx85\" (UniqueName: \"kubernetes.io/projected/6ea7203d-5727-485f-8a6a-5bde96d05078-kube-api-access-6kx85\") pod \"6ea7203d-5727-485f-8a6a-5bde96d05078\" (UID: \"6ea7203d-5727-485f-8a6a-5bde96d05078\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.531049 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhfrf\" (UniqueName: \"kubernetes.io/projected/3fd6760c-af87-4e2a-adcd-5fe3ca636fef-kube-api-access-mhfrf\") pod \"3fd6760c-af87-4e2a-adcd-5fe3ca636fef\" (UID: \"3fd6760c-af87-4e2a-adcd-5fe3ca636fef\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.531422 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmmgg\" (UniqueName: \"kubernetes.io/projected/1a739206-d877-4212-9242-47a59c440b40-kube-api-access-xmmgg\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.531456 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598","Type":"ContainerDied","Data":"999fb30da080219cb77954c8dd4c088483abadab4f8b242483e557bbf7b94ab3"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.535165 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapibccc-account-delete-n7pf6" event={"ID":"6ea7203d-5727-485f-8a6a-5bde96d05078","Type":"ContainerDied","Data":"46bf389b5fc88b6dbfa295f51f35ca0b5b4f0b242188fa16176efa6b752f69c6"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.535315 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapibccc-account-delete-n7pf6" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.536326 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea7203d-5727-485f-8a6a-5bde96d05078-kube-api-access-6kx85" (OuterVolumeSpecName: "kube-api-access-6kx85") pod "6ea7203d-5727-485f-8a6a-5bde96d05078" (UID: "6ea7203d-5727-485f-8a6a-5bde96d05078"). InnerVolumeSpecName "kube-api-access-6kx85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.537708 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.537965 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd6760c-af87-4e2a-adcd-5fe3ca636fef-kube-api-access-mhfrf" (OuterVolumeSpecName: "kube-api-access-mhfrf") pod "3fd6760c-af87-4e2a-adcd-5fe3ca636fef" (UID: "3fd6760c-af87-4e2a-adcd-5fe3ca636fef"). InnerVolumeSpecName "kube-api-access-mhfrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.538224 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.539972 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance0d4d-account-delete-7cb92" event={"ID":"3fd6760c-af87-4e2a-adcd-5fe3ca636fef","Type":"ContainerDied","Data":"88f9df4a8e65a4cc84c381dac9166d9d593a856fad49af2d2c681017092969b2"} Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.540085 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance0d4d-account-delete-7cb92" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.553428 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.559253 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.564925 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-67bd9fd99f-qbp28"] Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.570602 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-67bd9fd99f-qbp28"] Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.578099 4766 scope.go:117] "RemoveContainer" containerID="a12103d025e03ad958180a996161105d2c14a35f5ae1e7d1aabe48aab8387391" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.578364 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:18:33 crc kubenswrapper[4766]: E1002 11:18:33.578897 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a12103d025e03ad958180a996161105d2c14a35f5ae1e7d1aabe48aab8387391\": container with ID starting with a12103d025e03ad958180a996161105d2c14a35f5ae1e7d1aabe48aab8387391 not found: ID does not exist" containerID="a12103d025e03ad958180a996161105d2c14a35f5ae1e7d1aabe48aab8387391" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.578936 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a12103d025e03ad958180a996161105d2c14a35f5ae1e7d1aabe48aab8387391"} err="failed to get container status \"a12103d025e03ad958180a996161105d2c14a35f5ae1e7d1aabe48aab8387391\": rpc error: code = NotFound desc = could not find container \"a12103d025e03ad958180a996161105d2c14a35f5ae1e7d1aabe48aab8387391\": container with ID starting with a12103d025e03ad958180a996161105d2c14a35f5ae1e7d1aabe48aab8387391 not found: ID does not exist" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.578965 4766 scope.go:117] "RemoveContainer" containerID="11c17e3e9a13cd8d34cb7338544e7b564639e6d7cc5158ff63319715efa5d8b5" Oct 02 11:18:33 crc kubenswrapper[4766]: E1002 11:18:33.579377 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11c17e3e9a13cd8d34cb7338544e7b564639e6d7cc5158ff63319715efa5d8b5\": container with ID starting with 11c17e3e9a13cd8d34cb7338544e7b564639e6d7cc5158ff63319715efa5d8b5 not found: ID does not exist" containerID="11c17e3e9a13cd8d34cb7338544e7b564639e6d7cc5158ff63319715efa5d8b5" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.579395 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11c17e3e9a13cd8d34cb7338544e7b564639e6d7cc5158ff63319715efa5d8b5"} err="failed to get container status \"11c17e3e9a13cd8d34cb7338544e7b564639e6d7cc5158ff63319715efa5d8b5\": rpc error: code = NotFound desc = could not find container \"11c17e3e9a13cd8d34cb7338544e7b564639e6d7cc5158ff63319715efa5d8b5\": container with ID starting with 11c17e3e9a13cd8d34cb7338544e7b564639e6d7cc5158ff63319715efa5d8b5 not found: ID does not exist" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.579409 4766 scope.go:117] "RemoveContainer" containerID="805fa4defeec778eb8f810a670bb90a7f802c044adf651111138c5f240d5a4ad" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.632211 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q84ss\" (UniqueName: \"kubernetes.io/projected/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-kube-api-access-q84ss\") pod \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.632624 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-combined-ca-bundle\") pod \"5a8ba140-6dc8-4023-9789-7f288b85159b\" (UID: \"5a8ba140-6dc8-4023-9789-7f288b85159b\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.632657 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-config-data\") pod 
\"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.632678 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-combined-ca-bundle\") pod \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.632703 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxsf9\" (UniqueName: \"kubernetes.io/projected/94c8c5ed-b069-4112-ae71-d9071bc15ff2-kube-api-access-sxsf9\") pod \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.632740 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xd7k\" (UniqueName: \"kubernetes.io/projected/3fac2bf1-fb0f-4031-bfc8-34090cc90c8a-kube-api-access-2xd7k\") pod \"3fac2bf1-fb0f-4031-bfc8-34090cc90c8a\" (UID: \"3fac2bf1-fb0f-4031-bfc8-34090cc90c8a\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633152 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxq8r\" (UniqueName: \"kubernetes.io/projected/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-api-access-qxq8r\") pod \"5a8ba140-6dc8-4023-9789-7f288b85159b\" (UID: \"5a8ba140-6dc8-4023-9789-7f288b85159b\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633226 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633265 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-internal-tls-certs\") pod \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633289 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-config-data\") pod \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633311 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7x5z\" (UniqueName: \"kubernetes.io/projected/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-kube-api-access-t7x5z\") pod \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633348 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-logs\") pod \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633368 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-combined-ca-bundle\") pod 
\"94c8c5ed-b069-4112-ae71-d9071bc15ff2\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633391 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-scripts\") pod \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633413 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-config-data-custom\") pod \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633442 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-internal-tls-certs\") pod \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633483 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-combined-ca-bundle\") pod \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633531 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-httpd-run\") pod \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633565 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-public-tls-certs\") pod \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633598 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-internal-tls-certs\") pod \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633623 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-scripts\") pod \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\" (UID: \"0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633653 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-state-metrics-tls-certs\") pod \"5a8ba140-6dc8-4023-9789-7f288b85159b\" (UID: \"5a8ba140-6dc8-4023-9789-7f288b85159b\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633693 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94c8c5ed-b069-4112-ae71-d9071bc15ff2-logs\") pod 
\"94c8c5ed-b069-4112-ae71-d9071bc15ff2\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633723 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-state-metrics-tls-config\") pod \"5a8ba140-6dc8-4023-9789-7f288b85159b\" (UID: \"5a8ba140-6dc8-4023-9789-7f288b85159b\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633778 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-config-data\") pod \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633815 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-logs\") pod \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\" (UID: \"7eb84667-7ff3-441c-ab7c-ccc4fc9233ca\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.633846 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-public-tls-certs\") pod \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\" (UID: \"94c8c5ed-b069-4112-ae71-d9071bc15ff2\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.634406 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kx85\" (UniqueName: \"kubernetes.io/projected/6ea7203d-5727-485f-8a6a-5bde96d05078-kube-api-access-6kx85\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.634432 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhfrf\" (UniqueName: \"kubernetes.io/projected/3fd6760c-af87-4e2a-adcd-5fe3ca636fef-kube-api-access-mhfrf\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.635371 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94c8c5ed-b069-4112-ae71-d9071bc15ff2-logs" (OuterVolumeSpecName: "logs") pod "94c8c5ed-b069-4112-ae71-d9071bc15ff2" (UID: "94c8c5ed-b069-4112-ae71-d9071bc15ff2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.648257 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94c8c5ed-b069-4112-ae71-d9071bc15ff2-kube-api-access-sxsf9" (OuterVolumeSpecName: "kube-api-access-sxsf9") pod "94c8c5ed-b069-4112-ae71-d9071bc15ff2" (UID: "94c8c5ed-b069-4112-ae71-d9071bc15ff2"). InnerVolumeSpecName "kube-api-access-sxsf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.652042 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" (UID: "0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.654149 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.658145 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.659901 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-logs" (OuterVolumeSpecName: "logs") pod "0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" (UID: "0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.667260 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-logs" (OuterVolumeSpecName: "logs") pod "7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" (UID: "7eb84667-7ff3-441c-ab7c-ccc4fc9233ca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.672326 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-kube-api-access-q84ss" (OuterVolumeSpecName: "kube-api-access-q84ss") pod "7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" (UID: "7eb84667-7ff3-441c-ab7c-ccc4fc9233ca"). InnerVolumeSpecName "kube-api-access-q84ss". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.673923 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fac2bf1-fb0f-4031-bfc8-34090cc90c8a-kube-api-access-2xd7k" (OuterVolumeSpecName: "kube-api-access-2xd7k") pod "3fac2bf1-fb0f-4031-bfc8-34090cc90c8a" (UID: "3fac2bf1-fb0f-4031-bfc8-34090cc90c8a"). InnerVolumeSpecName "kube-api-access-2xd7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.677244 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-scripts" (OuterVolumeSpecName: "scripts") pod "0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" (UID: "0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.678764 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-api-access-qxq8r" (OuterVolumeSpecName: "kube-api-access-qxq8r") pod "5a8ba140-6dc8-4023-9789-7f288b85159b" (UID: "5a8ba140-6dc8-4023-9789-7f288b85159b"). InnerVolumeSpecName "kube-api-access-qxq8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.679055 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-scripts" (OuterVolumeSpecName: "scripts") pod "7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" (UID: "7eb84667-7ff3-441c-ab7c-ccc4fc9233ca"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.680727 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" (UID: "0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.681874 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.682057 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-kube-api-access-t7x5z" (OuterVolumeSpecName: "kube-api-access-t7x5z") pod "0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" (UID: "0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598"). InnerVolumeSpecName "kube-api-access-t7x5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.684532 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbicance01-account-delete-jrx2r"] Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.687761 4766 scope.go:117] "RemoveContainer" containerID="5ce14e70f72aa571a01db0570e56eaaa769ab4855145e42bd80cfba66de691f5" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.688788 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "94c8c5ed-b069-4112-ae71-d9071bc15ff2" (UID: "94c8c5ed-b069-4112-ae71-d9071bc15ff2"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.700235 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbicance01-account-delete-jrx2r"] Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.732562 4766 scope.go:117] "RemoveContainer" containerID="5ce14e70f72aa571a01db0570e56eaaa769ab4855145e42bd80cfba66de691f5" Oct 02 11:18:33 crc kubenswrapper[4766]: E1002 11:18:33.735299 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce14e70f72aa571a01db0570e56eaaa769ab4855145e42bd80cfba66de691f5\": container with ID starting with 5ce14e70f72aa571a01db0570e56eaaa769ab4855145e42bd80cfba66de691f5 not found: ID does not exist" containerID="5ce14e70f72aa571a01db0570e56eaaa769ab4855145e42bd80cfba66de691f5" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.735380 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce14e70f72aa571a01db0570e56eaaa769ab4855145e42bd80cfba66de691f5"} err="failed to get container status \"5ce14e70f72aa571a01db0570e56eaaa769ab4855145e42bd80cfba66de691f5\": rpc error: code = NotFound desc = could not find container \"5ce14e70f72aa571a01db0570e56eaaa769ab4855145e42bd80cfba66de691f5\": container with ID starting with 5ce14e70f72aa571a01db0570e56eaaa769ab4855145e42bd80cfba66de691f5 not found: ID does not exist" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.735434 4766 scope.go:117] "RemoveContainer" containerID="15c74c8e7a8896b2165a6a68dd5ee3e1b21f3fcb860feba3289044c93dbf1f19" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.735348 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-combined-ca-bundle\") pod \"bbb13294-05c1-4a20-8265-5144efcd91cf\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.738158 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"c3384223-2ad0-4593-976c-54c2d3cce52e\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.738221 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-config-data\") pod \"bbb13294-05c1-4a20-8265-5144efcd91cf\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.738279 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flv52\" (UniqueName: \"kubernetes.io/projected/c3384223-2ad0-4593-976c-54c2d3cce52e-kube-api-access-flv52\") pod \"c3384223-2ad0-4593-976c-54c2d3cce52e\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.738317 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-config-data-custom\") pod \"8d43eab0-4595-42fc-8489-38792e0c6e19\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.738412 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-config-data\") pod \"c3384223-2ad0-4593-976c-54c2d3cce52e\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.738442 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-nova-metadata-tls-certs\") pod \"bbb13294-05c1-4a20-8265-5144efcd91cf\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.738543 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmkdr\" (UniqueName: \"kubernetes.io/projected/bbb13294-05c1-4a20-8265-5144efcd91cf-kube-api-access-rmkdr\") pod \"bbb13294-05c1-4a20-8265-5144efcd91cf\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.738579 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d43eab0-4595-42fc-8489-38792e0c6e19-logs\") pod \"8d43eab0-4595-42fc-8489-38792e0c6e19\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.738607 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-combined-ca-bundle\") pod \"8d43eab0-4595-42fc-8489-38792e0c6e19\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.738650 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcgz9\" (UniqueName: \"kubernetes.io/projected/8d43eab0-4595-42fc-8489-38792e0c6e19-kube-api-access-wcgz9\") pod \"8d43eab0-4595-42fc-8489-38792e0c6e19\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.738696 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-public-tls-certs\") pod \"c3384223-2ad0-4593-976c-54c2d3cce52e\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.738797 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-scripts\") pod \"c3384223-2ad0-4593-976c-54c2d3cce52e\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.738855 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-combined-ca-bundle\") pod \"c3384223-2ad0-4593-976c-54c2d3cce52e\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.738913 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-config-data\") pod \"8d43eab0-4595-42fc-8489-38792e0c6e19\" (UID: \"8d43eab0-4595-42fc-8489-38792e0c6e19\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.738949 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c3384223-2ad0-4593-976c-54c2d3cce52e-httpd-run\") pod \"c3384223-2ad0-4593-976c-54c2d3cce52e\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.738998 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbb13294-05c1-4a20-8265-5144efcd91cf-logs\") pod \"bbb13294-05c1-4a20-8265-5144efcd91cf\" (UID: \"bbb13294-05c1-4a20-8265-5144efcd91cf\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.739062 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3384223-2ad0-4593-976c-54c2d3cce52e-logs\") pod \"c3384223-2ad0-4593-976c-54c2d3cce52e\" (UID: \"c3384223-2ad0-4593-976c-54c2d3cce52e\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.739460 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d43eab0-4595-42fc-8489-38792e0c6e19-logs" (OuterVolumeSpecName: "logs") pod "8d43eab0-4595-42fc-8489-38792e0c6e19" (UID: "8d43eab0-4595-42fc-8489-38792e0c6e19"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.744804 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q84ss\" (UniqueName: \"kubernetes.io/projected/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-kube-api-access-q84ss\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.744849 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxsf9\" (UniqueName: \"kubernetes.io/projected/94c8c5ed-b069-4112-ae71-d9071bc15ff2-kube-api-access-sxsf9\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.744859 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xd7k\" (UniqueName: \"kubernetes.io/projected/3fac2bf1-fb0f-4031-bfc8-34090cc90c8a-kube-api-access-2xd7k\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.744869 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxq8r\" (UniqueName: \"kubernetes.io/projected/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-api-access-qxq8r\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.744919 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.745008 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d43eab0-4595-42fc-8489-38792e0c6e19-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.745021 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7x5z\" (UniqueName: \"kubernetes.io/projected/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-kube-api-access-t7x5z\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.745050 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.745060 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.745753 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.745775 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.745792 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.746183 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94c8c5ed-b069-4112-ae71-d9071bc15ff2-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.746194 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.762153 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3384223-2ad0-4593-976c-54c2d3cce52e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c3384223-2ad0-4593-976c-54c2d3cce52e" (UID: "c3384223-2ad0-4593-976c-54c2d3cce52e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.765363 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbb13294-05c1-4a20-8265-5144efcd91cf-logs" (OuterVolumeSpecName: "logs") pod "bbb13294-05c1-4a20-8265-5144efcd91cf" (UID: "bbb13294-05c1-4a20-8265-5144efcd91cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.765467 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8d43eab0-4595-42fc-8489-38792e0c6e19" (UID: "8d43eab0-4595-42fc-8489-38792e0c6e19"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.773777 4766 scope.go:117] "RemoveContainer" containerID="ff59b3f5c87557d99a67228679782023c256d54647b29c27032c95dfbc2bd77b" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.773783 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbb13294-05c1-4a20-8265-5144efcd91cf" (UID: "bbb13294-05c1-4a20-8265-5144efcd91cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.775301 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3384223-2ad0-4593-976c-54c2d3cce52e-logs" (OuterVolumeSpecName: "logs") pod "c3384223-2ad0-4593-976c-54c2d3cce52e" (UID: "c3384223-2ad0-4593-976c-54c2d3cce52e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.779750 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.781578 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbb13294-05c1-4a20-8265-5144efcd91cf-kube-api-access-rmkdr" (OuterVolumeSpecName: "kube-api-access-rmkdr") pod "bbb13294-05c1-4a20-8265-5144efcd91cf" (UID: "bbb13294-05c1-4a20-8265-5144efcd91cf"). InnerVolumeSpecName "kube-api-access-rmkdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.790681 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-scripts" (OuterVolumeSpecName: "scripts") pod "c3384223-2ad0-4593-976c-54c2d3cce52e" (UID: "c3384223-2ad0-4593-976c-54c2d3cce52e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.790727 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d43eab0-4595-42fc-8489-38792e0c6e19-kube-api-access-wcgz9" (OuterVolumeSpecName: "kube-api-access-wcgz9") pod "8d43eab0-4595-42fc-8489-38792e0c6e19" (UID: "8d43eab0-4595-42fc-8489-38792e0c6e19"). InnerVolumeSpecName "kube-api-access-wcgz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.791090 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "c3384223-2ad0-4593-976c-54c2d3cce52e" (UID: "c3384223-2ad0-4593-976c-54c2d3cce52e"). InnerVolumeSpecName "local-storage06-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.792158 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance0d4d-account-delete-7cb92"] Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.803969 4766 scope.go:117] "RemoveContainer" containerID="15c74c8e7a8896b2165a6a68dd5ee3e1b21f3fcb860feba3289044c93dbf1f19" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.804096 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance0d4d-account-delete-7cb92"] Oct 02 11:18:33 crc kubenswrapper[4766]: E1002 11:18:33.804483 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15c74c8e7a8896b2165a6a68dd5ee3e1b21f3fcb860feba3289044c93dbf1f19\": container with ID starting with 15c74c8e7a8896b2165a6a68dd5ee3e1b21f3fcb860feba3289044c93dbf1f19 not found: ID does not exist" containerID="15c74c8e7a8896b2165a6a68dd5ee3e1b21f3fcb860feba3289044c93dbf1f19" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.804530 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15c74c8e7a8896b2165a6a68dd5ee3e1b21f3fcb860feba3289044c93dbf1f19"} err="failed to get container status \"15c74c8e7a8896b2165a6a68dd5ee3e1b21f3fcb860feba3289044c93dbf1f19\": rpc error: code = NotFound desc = could not find container \"15c74c8e7a8896b2165a6a68dd5ee3e1b21f3fcb860feba3289044c93dbf1f19\": container with ID starting with 15c74c8e7a8896b2165a6a68dd5ee3e1b21f3fcb860feba3289044c93dbf1f19 not found: ID does not exist" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.804552 4766 scope.go:117] "RemoveContainer" containerID="ff59b3f5c87557d99a67228679782023c256d54647b29c27032c95dfbc2bd77b" Oct 02 11:18:33 crc kubenswrapper[4766]: E1002 11:18:33.804976 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff59b3f5c87557d99a67228679782023c256d54647b29c27032c95dfbc2bd77b\": container with ID starting with ff59b3f5c87557d99a67228679782023c256d54647b29c27032c95dfbc2bd77b not found: ID does not exist" containerID="ff59b3f5c87557d99a67228679782023c256d54647b29c27032c95dfbc2bd77b" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.805002 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff59b3f5c87557d99a67228679782023c256d54647b29c27032c95dfbc2bd77b"} err="failed to get container status \"ff59b3f5c87557d99a67228679782023c256d54647b29c27032c95dfbc2bd77b\": rpc error: code = NotFound desc = could not find container \"ff59b3f5c87557d99a67228679782023c256d54647b29c27032c95dfbc2bd77b\": container with ID starting with ff59b3f5c87557d99a67228679782023c256d54647b29c27032c95dfbc2bd77b not found: ID does not exist" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.805016 4766 scope.go:117] "RemoveContainer" containerID="c160368cae86319957eb982a337f43c9e94cfdb7fb7fcd90fcf28831821e07b1" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.829690 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3384223-2ad0-4593-976c-54c2d3cce52e-kube-api-access-flv52" (OuterVolumeSpecName: "kube-api-access-flv52") pod "c3384223-2ad0-4593-976c-54c2d3cce52e" (UID: "c3384223-2ad0-4593-976c-54c2d3cce52e"). InnerVolumeSpecName "kube-api-access-flv52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.847688 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-combined-ca-bundle\") pod \"d9339929-4331-4cd9-89bc-8350ef2f55f5\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.847786 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d2gp\" (UniqueName: \"kubernetes.io/projected/d9339929-4331-4cd9-89bc-8350ef2f55f5-kube-api-access-6d2gp\") pod \"d9339929-4331-4cd9-89bc-8350ef2f55f5\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.847965 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data\") pod \"d9339929-4331-4cd9-89bc-8350ef2f55f5\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.847987 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data-custom\") pod \"d9339929-4331-4cd9-89bc-8350ef2f55f5\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.848049 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9339929-4331-4cd9-89bc-8350ef2f55f5-logs\") pod \"d9339929-4331-4cd9-89bc-8350ef2f55f5\" (UID: \"d9339929-4331-4cd9-89bc-8350ef2f55f5\") " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.848394 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.848414 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3384223-2ad0-4593-976c-54c2d3cce52e-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.848425 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbb13294-05c1-4a20-8265-5144efcd91cf-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.848435 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3384223-2ad0-4593-976c-54c2d3cce52e-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.848446 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.848478 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.848490 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flv52\" (UniqueName: 
\"kubernetes.io/projected/c3384223-2ad0-4593-976c-54c2d3cce52e-kube-api-access-flv52\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.848516 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.848530 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmkdr\" (UniqueName: \"kubernetes.io/projected/bbb13294-05c1-4a20-8265-5144efcd91cf-kube-api-access-rmkdr\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.848542 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcgz9\" (UniqueName: \"kubernetes.io/projected/8d43eab0-4595-42fc-8489-38792e0c6e19-kube-api-access-wcgz9\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.848974 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9339929-4331-4cd9-89bc-8350ef2f55f5-logs" (OuterVolumeSpecName: "logs") pod "d9339929-4331-4cd9-89bc-8350ef2f55f5" (UID: "d9339929-4331-4cd9-89bc-8350ef2f55f5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.849716 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "5a8ba140-6dc8-4023-9789-7f288b85159b" (UID: "5a8ba140-6dc8-4023-9789-7f288b85159b"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.853684 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d43eab0-4595-42fc-8489-38792e0c6e19" (UID: "8d43eab0-4595-42fc-8489-38792e0c6e19"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.856735 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9339929-4331-4cd9-89bc-8350ef2f55f5-kube-api-access-6d2gp" (OuterVolumeSpecName: "kube-api-access-6d2gp") pod "d9339929-4331-4cd9-89bc-8350ef2f55f5" (UID: "d9339929-4331-4cd9-89bc-8350ef2f55f5"). InnerVolumeSpecName "kube-api-access-6d2gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.859108 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d9339929-4331-4cd9-89bc-8350ef2f55f5" (UID: "d9339929-4331-4cd9-89bc-8350ef2f55f5"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.862615 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.870044 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" (UID: "0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.910050 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08bb38b9-010e-4371-970f-bfe8e7310011" path="/var/lib/kubelet/pods/08bb38b9-010e-4371-970f-bfe8e7310011/volumes" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.910581 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94c8c5ed-b069-4112-ae71-d9071bc15ff2" (UID: "94c8c5ed-b069-4112-ae71-d9071bc15ff2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.917907 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="123b65f7-a8e8-434b-baf1-e9b0d3a985d9" path="/var/lib/kubelet/pods/123b65f7-a8e8-434b-baf1-e9b0d3a985d9/volumes" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.918897 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a739206-d877-4212-9242-47a59c440b40" path="/var/lib/kubelet/pods/1a739206-d877-4212-9242-47a59c440b40/volumes" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.919437 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3089b483-f9df-4165-a28e-181e6134f8dc" path="/var/lib/kubelet/pods/3089b483-f9df-4165-a28e-181e6134f8dc/volumes" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.919934 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9" path="/var/lib/kubelet/pods/33a636ed-9ba5-4082-8f4f-cfcdbd0bd5c9/volumes" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.921199 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a65d329-5efc-4762-aafa-e2a1a3e7b378" path="/var/lib/kubelet/pods/3a65d329-5efc-4762-aafa-e2a1a3e7b378/volumes" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.921862 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd6760c-af87-4e2a-adcd-5fe3ca636fef" path="/var/lib/kubelet/pods/3fd6760c-af87-4e2a-adcd-5fe3ca636fef/volumes" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.922408 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44893df1-77c5-494c-bae0-253447abc8f4" path="/var/lib/kubelet/pods/44893df1-77c5-494c-bae0-253447abc8f4/volumes" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.923565 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b9bc510-a878-4e06-8db9-fd6209039c75" path="/var/lib/kubelet/pods/4b9bc510-a878-4e06-8db9-fd6209039c75/volumes" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.924144 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="52bdcfda-75b3-450b-9db4-1a443be18fa3" path="/var/lib/kubelet/pods/52bdcfda-75b3-450b-9db4-1a443be18fa3/volumes" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.924739 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf52703-5083-4a00-a732-864efe21269f" path="/var/lib/kubelet/pods/bbf52703-5083-4a00-a732-864efe21269f/volumes" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.926819 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d0079a-03e3-4e5f-81a2-81f5bceb795c" path="/var/lib/kubelet/pods/d4d0079a-03e3-4e5f-81a2-81f5bceb795c/volumes" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.927440 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6918fd8-4c73-477e-bacd-ed09a36838e6" path="/var/lib/kubelet/pods/d6918fd8-4c73-477e-bacd-ed09a36838e6/volumes" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.927663 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "5a8ba140-6dc8-4023-9789-7f288b85159b" (UID: "5a8ba140-6dc8-4023-9789-7f288b85159b"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.937645 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-config-data" (OuterVolumeSpecName: "config-data") pod "0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" (UID: "0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.962279 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.962307 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.962319 4766 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.962329 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.962338 4766 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.962346 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9339929-4331-4cd9-89bc-8350ef2f55f5-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.962359 4766 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.962367 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.962377 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d2gp\" (UniqueName: \"kubernetes.io/projected/d9339929-4331-4cd9-89bc-8350ef2f55f5-kube-api-access-6d2gp\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.962385 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.965998 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.968041 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a8ba140-6dc8-4023-9789-7f288b85159b" (UID: "5a8ba140-6dc8-4023-9789-7f288b85159b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.970622 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" (UID: "0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.981665 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-config-data" (OuterVolumeSpecName: "config-data") pod "94c8c5ed-b069-4112-ae71-d9071bc15ff2" (UID: "94c8c5ed-b069-4112-ae71-d9071bc15ff2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.981738 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "94c8c5ed-b069-4112-ae71-d9071bc15ff2" (UID: "94c8c5ed-b069-4112-ae71-d9071bc15ff2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:33 crc kubenswrapper[4766]: I1002 11:18:33.992598 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "94c8c5ed-b069-4112-ae71-d9071bc15ff2" (UID: "94c8c5ed-b069-4112-ae71-d9071bc15ff2"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.010772 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" (UID: "7eb84667-7ff3-441c-ab7c-ccc4fc9233ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.020757 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9339929-4331-4cd9-89bc-8350ef2f55f5" (UID: "d9339929-4331-4cd9-89bc-8350ef2f55f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.031572 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3384223-2ad0-4593-976c-54c2d3cce52e" (UID: "c3384223-2ad0-4593-976c-54c2d3cce52e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.035592 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-config-data" (OuterVolumeSpecName: "config-data") pod "bbb13294-05c1-4a20-8265-5144efcd91cf" (UID: "bbb13294-05c1-4a20-8265-5144efcd91cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.051594 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-config-data" (OuterVolumeSpecName: "config-data") pod "c3384223-2ad0-4593-976c-54c2d3cce52e" (UID: "c3384223-2ad0-4593-976c-54c2d3cce52e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.052880 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-config-data" (OuterVolumeSpecName: "config-data") pod "8d43eab0-4595-42fc-8489-38792e0c6e19" (UID: "8d43eab0-4595-42fc-8489-38792e0c6e19"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.053822 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-config-data" (OuterVolumeSpecName: "config-data") pod "7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" (UID: "7eb84667-7ff3-441c-ab7c-ccc4fc9233ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.059330 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "bbb13294-05c1-4a20-8265-5144efcd91cf" (UID: "bbb13294-05c1-4a20-8265-5144efcd91cf"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.063977 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.064015 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.064028 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.064040 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.064054 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a8ba140-6dc8-4023-9789-7f288b85159b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.064065 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.064075 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.064087 4766 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb13294-05c1-4a20-8265-5144efcd91cf-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.064097 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.064109 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.064120 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.064133 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.064145 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94c8c5ed-b069-4112-ae71-d9071bc15ff2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc 
kubenswrapper[4766]: I1002 11:18:34.064157 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d43eab0-4595-42fc-8489-38792e0c6e19-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.064673 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" (UID: "7eb84667-7ff3-441c-ab7c-ccc4fc9233ca"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.077799 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c3384223-2ad0-4593-976c-54c2d3cce52e" (UID: "c3384223-2ad0-4593-976c-54c2d3cce52e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.079122 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data" (OuterVolumeSpecName: "config-data") pod "d9339929-4331-4cd9-89bc-8350ef2f55f5" (UID: "d9339929-4331-4cd9-89bc-8350ef2f55f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.097649 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" (UID: "7eb84667-7ff3-441c-ab7c-ccc4fc9233ca"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.172231 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.172275 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9339929-4331-4cd9-89bc-8350ef2f55f5-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.172290 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.172303 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c3384223-2ad0-4593-976c-54c2d3cce52e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.182049 4766 scope.go:117] "RemoveContainer" containerID="4a4f512030ce39ec49c91ab00c8b625423cbf01429bd4a06c2d5b12c2bf0ca55" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.219962 4766 scope.go:117] "RemoveContainer" containerID="1e54d0211edb5bd37091707d979fb377b49c8ac9d64e30887c84f1c90a8d9682" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.259922 4766 scope.go:117] "RemoveContainer" containerID="4a4f512030ce39ec49c91ab00c8b625423cbf01429bd4a06c2d5b12c2bf0ca55" Oct 02 11:18:34 crc kubenswrapper[4766]: E1002 11:18:34.260785 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a4f512030ce39ec49c91ab00c8b625423cbf01429bd4a06c2d5b12c2bf0ca55\": container with ID starting with 4a4f512030ce39ec49c91ab00c8b625423cbf01429bd4a06c2d5b12c2bf0ca55 not found: ID does not exist" containerID="4a4f512030ce39ec49c91ab00c8b625423cbf01429bd4a06c2d5b12c2bf0ca55" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.260828 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a4f512030ce39ec49c91ab00c8b625423cbf01429bd4a06c2d5b12c2bf0ca55"} err="failed to get container status \"4a4f512030ce39ec49c91ab00c8b625423cbf01429bd4a06c2d5b12c2bf0ca55\": rpc error: code = NotFound desc = could not find container \"4a4f512030ce39ec49c91ab00c8b625423cbf01429bd4a06c2d5b12c2bf0ca55\": container with ID starting with 4a4f512030ce39ec49c91ab00c8b625423cbf01429bd4a06c2d5b12c2bf0ca55 not found: ID does not exist" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.260855 4766 scope.go:117] "RemoveContainer" containerID="1e54d0211edb5bd37091707d979fb377b49c8ac9d64e30887c84f1c90a8d9682" Oct 02 11:18:34 crc kubenswrapper[4766]: E1002 11:18:34.261114 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e54d0211edb5bd37091707d979fb377b49c8ac9d64e30887c84f1c90a8d9682\": container with ID starting with 1e54d0211edb5bd37091707d979fb377b49c8ac9d64e30887c84f1c90a8d9682 not found: ID does not exist" containerID="1e54d0211edb5bd37091707d979fb377b49c8ac9d64e30887c84f1c90a8d9682" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.261143 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1e54d0211edb5bd37091707d979fb377b49c8ac9d64e30887c84f1c90a8d9682"} err="failed to get container status \"1e54d0211edb5bd37091707d979fb377b49c8ac9d64e30887c84f1c90a8d9682\": rpc error: code = NotFound desc = could not find container \"1e54d0211edb5bd37091707d979fb377b49c8ac9d64e30887c84f1c90a8d9682\": container with ID starting with 1e54d0211edb5bd37091707d979fb377b49c8ac9d64e30887c84f1c90a8d9682 not found: ID does not exist" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.261161 4766 scope.go:117] "RemoveContainer" containerID="6135436a83b835e03c70b30a3e8fc15e4387e379664e3d6e6ac0c954d565a595" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.317973 4766 scope.go:117] "RemoveContainer" containerID="0e31d1e220ac761acdc5da9aa0f9af96e22aabf5bfdb6457f1154c8742a0295b" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.554064 4766 generic.go:334] "Generic (PLEG): container finished" podID="b869942b-07a4-4a08-b312-2b09cee2abf1" containerID="814f365245b0829017a26fc43722610ee727739807d4d801998edff378def268" exitCode=0 Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.554408 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b869942b-07a4-4a08-b312-2b09cee2abf1","Type":"ContainerDied","Data":"814f365245b0829017a26fc43722610ee727739807d4d801998edff378def268"} Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.557878 4766 generic.go:334] "Generic (PLEG): container finished" podID="3ae83487-5f24-4934-aba5-9ee2ca6ca657" containerID="4a11f30bcfa9373da75b15639350e1913eb380463cb8cd414bb358e208ee9607" exitCode=1 Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.557964 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell16336-account-delete-8q58s" event={"ID":"3ae83487-5f24-4934-aba5-9ee2ca6ca657","Type":"ContainerDied","Data":"4a11f30bcfa9373da75b15639350e1913eb380463cb8cd414bb358e208ee9607"} Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.558014 4766 scope.go:117] "RemoveContainer" containerID="c49ddbab603ee44d8b491bab6b3b0f05fc4a499c55527957a337c027a9b1320d" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.577577 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.592668 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f5f68d797-k4qqv" event={"ID":"8d43eab0-4595-42fc-8489-38792e0c6e19","Type":"ContainerDied","Data":"fbb7736ca2d64df541ebcaaaf6a45b4ba4512d16015fd11481ec9d053bfea1cc"} Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.593673 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-f5f68d797-k4qqv" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.624366 4766 scope.go:117] "RemoveContainer" containerID="433ae393df6f772a4b0964a7a633bfcdca8d7d78296edfa4ca875b807cacbd06" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.628075 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.635554 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.650908 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-f5f68d797-k4qqv"] Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.654341 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" event={"ID":"d9339929-4331-4cd9-89bc-8350ef2f55f5","Type":"ContainerDied","Data":"428cfb135d81844a70fbfa047eb311a462a87b6b4af97a2fdf7aac2011337530"} Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.654363 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-664d98ccd8-hh5xk" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.655882 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-f5f68d797-k4qqv"] Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.657353 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.657363 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-75997cdf8b-nnlzj" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.657389 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-84bf49766d-bbf2p" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.657445 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.657356 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.660225 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder891f-account-delete-2hm8h" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.685738 4766 scope.go:117] "RemoveContainer" containerID="f1a0b9913341b1fffedbb9296ca3e24f9abbcd44a25a0528cebe5da010d355e1" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.692005 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-664d98ccd8-hh5xk"] Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.702625 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-664d98ccd8-hh5xk"] Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.718022 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder891f-account-delete-2hm8h"] Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.732772 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder891f-account-delete-2hm8h"] Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.744887 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.752348 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.759492 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-84bf49766d-bbf2p"] Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.765524 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-84bf49766d-bbf2p"] Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.767256 4766 scope.go:117] "RemoveContainer" containerID="973c651619479183947224e4242097f161cf673b8da2935544f44e1700d072b9" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.773735 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.779444 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 11:18:34 crc kubenswrapper[4766]: E1002 11:18:34.782882 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Oct 02 11:18:34 crc kubenswrapper[4766]: E1002 11:18:34.782950 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-config-data podName:874d062e-d2f8-462c-95b3-8f630b7120af nodeName:}" failed. No retries permitted until 2025-10-02 11:18:42.782935716 +0000 UTC m=+1637.725806660 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-config-data") pod "rabbitmq-cell1-server-0" (UID: "874d062e-d2f8-462c-95b3-8f630b7120af") : configmap "rabbitmq-cell1-config-data" not found Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.784490 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.791863 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.805652 4766 scope.go:117] "RemoveContainer" containerID="dc1353dac7e3a318b9bf88253e7621b0e0c300fbb3ed030d2c367fb3cffe1ca0" Oct 02 11:18:34 crc kubenswrapper[4766]: E1002 11:18:34.805665 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:18:34 crc kubenswrapper[4766]: E1002 11:18:34.807807 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.815612 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-75997cdf8b-nnlzj"] Oct 02 11:18:34 crc kubenswrapper[4766]: E1002 11:18:34.817476 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 11:18:34 crc kubenswrapper[4766]: E1002 11:18:34.817548 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="7dec5495-e66b-4e5e-90b6-82ee673ab269" containerName="nova-scheduler-scheduler" Oct 02 11:18:34 crc kubenswrapper[4766]: I1002 11:18:34.820637 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-75997cdf8b-nnlzj"] Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.079399 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.087862 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell16336-account-delete-8q58s" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.148678 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_7498a37c-33a3-4a3a-9c72-64a0c533282c/ovn-northd/0.log" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.148751 4766 util.go:48] "No ready sandbox for pod can be found. 
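The nestedpendingoperations.go:348 entries are the volume manager's per-operation backoff: each failed MountVolume.SetUp doubles the wait before the next attempt (8s by this point) and refuses retries until the deadline passes, so a missing ConfigMap is re-checked progressively less often instead of in a hot loop. A sketch of the doubling, assuming the upstream defaults of a 500ms initial delay and a cap of just over two minutes (neither value is visible in this log):

    package main

    import (
        "fmt"
        "time"
    )

    // backoff mimics the per-operation retry gate implied by
    // "No retries permitted until ... (durationBeforeRetry 8s)".
    type backoff struct {
        delay    time.Duration
        deadline time.Time
    }

    func (b *backoff) fail(now time.Time) {
        const (
            initial = 500 * time.Millisecond        // assumed default
            ceiling = 2*time.Minute + 2*time.Second // assumed default cap
        )
        switch {
        case b.delay == 0:
            b.delay = initial
        case b.delay*2 > ceiling:
            b.delay = ceiling
        default:
            b.delay *= 2
        }
        b.deadline = now.Add(b.delay)
    }

    func (b *backoff) retryAllowed(now time.Time) bool {
        return !now.Before(b.deadline)
    }

    func main() {
        var b backoff
        now := time.Now()
        for i := 0; i < 5; i++ { // 500ms, 1s, 2s, 4s, 8s -- the log shows 8s
            b.fail(now)
            fmt.Println("no retries permitted for", b.delay)
        }
        fmt.Println("retry allowed immediately?", b.retryAllowed(now))
    }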
Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.187393 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-config-data\") pod \"b869942b-07a4-4a08-b312-2b09cee2abf1\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.187447 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-ovn-northd-tls-certs\") pod \"7498a37c-33a3-4a3a-9c72-64a0c533282c\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.187519 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7498a37c-33a3-4a3a-9c72-64a0c533282c-ovn-rundir\") pod \"7498a37c-33a3-4a3a-9c72-64a0c533282c\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.187558 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-combined-ca-bundle\") pod \"b869942b-07a4-4a08-b312-2b09cee2abf1\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.187590 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2chb\" (UniqueName: \"kubernetes.io/projected/3ae83487-5f24-4934-aba5-9ee2ca6ca657-kube-api-access-h2chb\") pod \"3ae83487-5f24-4934-aba5-9ee2ca6ca657\" (UID: \"3ae83487-5f24-4934-aba5-9ee2ca6ca657\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.187626 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7498a37c-33a3-4a3a-9c72-64a0c533282c-scripts\") pod \"7498a37c-33a3-4a3a-9c72-64a0c533282c\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.187657 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9fz2\" (UniqueName: \"kubernetes.io/projected/b869942b-07a4-4a08-b312-2b09cee2abf1-kube-api-access-m9fz2\") pod \"b869942b-07a4-4a08-b312-2b09cee2abf1\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.188463 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7498a37c-33a3-4a3a-9c72-64a0c533282c-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "7498a37c-33a3-4a3a-9c72-64a0c533282c" (UID: "7498a37c-33a3-4a3a-9c72-64a0c533282c"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.189177 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7498a37c-33a3-4a3a-9c72-64a0c533282c-scripts" (OuterVolumeSpecName: "scripts") pod "7498a37c-33a3-4a3a-9c72-64a0c533282c" (UID: "7498a37c-33a3-4a3a-9c72-64a0c533282c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.187738 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d77mk\" (UniqueName: \"kubernetes.io/projected/7498a37c-33a3-4a3a-9c72-64a0c533282c-kube-api-access-d77mk\") pod \"7498a37c-33a3-4a3a-9c72-64a0c533282c\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.189381 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-metrics-certs-tls-certs\") pod \"7498a37c-33a3-4a3a-9c72-64a0c533282c\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.189408 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-internal-tls-certs\") pod \"b869942b-07a4-4a08-b312-2b09cee2abf1\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.189445 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-public-tls-certs\") pod \"b869942b-07a4-4a08-b312-2b09cee2abf1\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.189473 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7498a37c-33a3-4a3a-9c72-64a0c533282c-config\") pod \"7498a37c-33a3-4a3a-9c72-64a0c533282c\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.189490 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-combined-ca-bundle\") pod \"7498a37c-33a3-4a3a-9c72-64a0c533282c\" (UID: \"7498a37c-33a3-4a3a-9c72-64a0c533282c\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.189549 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b869942b-07a4-4a08-b312-2b09cee2abf1-logs\") pod \"b869942b-07a4-4a08-b312-2b09cee2abf1\" (UID: \"b869942b-07a4-4a08-b312-2b09cee2abf1\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.190397 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7498a37c-33a3-4a3a-9c72-64a0c533282c-ovn-rundir\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.190424 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7498a37c-33a3-4a3a-9c72-64a0c533282c-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.190534 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7498a37c-33a3-4a3a-9c72-64a0c533282c-config" (OuterVolumeSpecName: "config") pod "7498a37c-33a3-4a3a-9c72-64a0c533282c" (UID: "7498a37c-33a3-4a3a-9c72-64a0c533282c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.190764 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b869942b-07a4-4a08-b312-2b09cee2abf1-logs" (OuterVolumeSpecName: "logs") pod "b869942b-07a4-4a08-b312-2b09cee2abf1" (UID: "b869942b-07a4-4a08-b312-2b09cee2abf1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.192396 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b869942b-07a4-4a08-b312-2b09cee2abf1-kube-api-access-m9fz2" (OuterVolumeSpecName: "kube-api-access-m9fz2") pod "b869942b-07a4-4a08-b312-2b09cee2abf1" (UID: "b869942b-07a4-4a08-b312-2b09cee2abf1"). InnerVolumeSpecName "kube-api-access-m9fz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.193965 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae83487-5f24-4934-aba5-9ee2ca6ca657-kube-api-access-h2chb" (OuterVolumeSpecName: "kube-api-access-h2chb") pod "3ae83487-5f24-4934-aba5-9ee2ca6ca657" (UID: "3ae83487-5f24-4934-aba5-9ee2ca6ca657"). InnerVolumeSpecName "kube-api-access-h2chb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.218716 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7498a37c-33a3-4a3a-9c72-64a0c533282c-kube-api-access-d77mk" (OuterVolumeSpecName: "kube-api-access-d77mk") pod "7498a37c-33a3-4a3a-9c72-64a0c533282c" (UID: "7498a37c-33a3-4a3a-9c72-64a0c533282c"). InnerVolumeSpecName "kube-api-access-d77mk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.236424 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7498a37c-33a3-4a3a-9c72-64a0c533282c" (UID: "7498a37c-33a3-4a3a-9c72-64a0c533282c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.258658 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b869942b-07a4-4a08-b312-2b09cee2abf1" (UID: "b869942b-07a4-4a08-b312-2b09cee2abf1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.262852 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-config-data" (OuterVolumeSpecName: "config-data") pod "b869942b-07a4-4a08-b312-2b09cee2abf1" (UID: "b869942b-07a4-4a08-b312-2b09cee2abf1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.291772 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d77mk\" (UniqueName: \"kubernetes.io/projected/7498a37c-33a3-4a3a-9c72-64a0c533282c-kube-api-access-d77mk\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.291808 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7498a37c-33a3-4a3a-9c72-64a0c533282c-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.291817 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.291826 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b869942b-07a4-4a08-b312-2b09cee2abf1-logs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.291833 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.291842 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.291862 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2chb\" (UniqueName: \"kubernetes.io/projected/3ae83487-5f24-4934-aba5-9ee2ca6ca657-kube-api-access-h2chb\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.291871 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9fz2\" (UniqueName: \"kubernetes.io/projected/b869942b-07a4-4a08-b312-2b09cee2abf1-kube-api-access-m9fz2\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.300265 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b869942b-07a4-4a08-b312-2b09cee2abf1" (UID: "b869942b-07a4-4a08-b312-2b09cee2abf1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.307068 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b869942b-07a4-4a08-b312-2b09cee2abf1" (UID: "b869942b-07a4-4a08-b312-2b09cee2abf1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.307805 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7498a37c-33a3-4a3a-9c72-64a0c533282c" (UID: "7498a37c-33a3-4a3a-9c72-64a0c533282c"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.324124 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "7498a37c-33a3-4a3a-9c72-64a0c533282c" (UID: "7498a37c-33a3-4a3a-9c72-64a0c533282c"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.351291 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.392625 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e63ea453-c8bd-4128-a47e-7b0d740a6066-config-data-generated\") pod \"e63ea453-c8bd-4128-a47e-7b0d740a6066\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.393015 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"e63ea453-c8bd-4128-a47e-7b0d740a6066\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.393103 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6w7d\" (UniqueName: \"kubernetes.io/projected/e63ea453-c8bd-4128-a47e-7b0d740a6066-kube-api-access-d6w7d\") pod \"e63ea453-c8bd-4128-a47e-7b0d740a6066\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.393454 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-operator-scripts\") pod \"e63ea453-c8bd-4128-a47e-7b0d740a6066\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.393584 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-combined-ca-bundle\") pod \"e63ea453-c8bd-4128-a47e-7b0d740a6066\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.393607 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e63ea453-c8bd-4128-a47e-7b0d740a6066-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "e63ea453-c8bd-4128-a47e-7b0d740a6066" (UID: "e63ea453-c8bd-4128-a47e-7b0d740a6066"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.393749 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-kolla-config\") pod \"e63ea453-c8bd-4128-a47e-7b0d740a6066\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.393836 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-galera-tls-certs\") pod \"e63ea453-c8bd-4128-a47e-7b0d740a6066\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.393950 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-secrets\") pod \"e63ea453-c8bd-4128-a47e-7b0d740a6066\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.394036 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-config-data-default\") pod \"e63ea453-c8bd-4128-a47e-7b0d740a6066\" (UID: \"e63ea453-c8bd-4128-a47e-7b0d740a6066\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.394416 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.394479 4766 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7498a37c-33a3-4a3a-9c72-64a0c533282c-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.394551 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e63ea453-c8bd-4128-a47e-7b0d740a6066-config-data-generated\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.394617 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.394671 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b869942b-07a4-4a08-b312-2b09cee2abf1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.394569 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e63ea453-c8bd-4128-a47e-7b0d740a6066" (UID: "e63ea453-c8bd-4128-a47e-7b0d740a6066"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.395354 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e63ea453-c8bd-4128-a47e-7b0d740a6066" (UID: "e63ea453-c8bd-4128-a47e-7b0d740a6066"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.396973 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "e63ea453-c8bd-4128-a47e-7b0d740a6066" (UID: "e63ea453-c8bd-4128-a47e-7b0d740a6066"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.398254 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e63ea453-c8bd-4128-a47e-7b0d740a6066-kube-api-access-d6w7d" (OuterVolumeSpecName: "kube-api-access-d6w7d") pod "e63ea453-c8bd-4128-a47e-7b0d740a6066" (UID: "e63ea453-c8bd-4128-a47e-7b0d740a6066"). InnerVolumeSpecName "kube-api-access-d6w7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.400631 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-secrets" (OuterVolumeSpecName: "secrets") pod "e63ea453-c8bd-4128-a47e-7b0d740a6066" (UID: "e63ea453-c8bd-4128-a47e-7b0d740a6066"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.406798 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "e63ea453-c8bd-4128-a47e-7b0d740a6066" (UID: "e63ea453-c8bd-4128-a47e-7b0d740a6066"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.419288 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e63ea453-c8bd-4128-a47e-7b0d740a6066" (UID: "e63ea453-c8bd-4128-a47e-7b0d740a6066"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.434736 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "e63ea453-c8bd-4128-a47e-7b0d740a6066" (UID: "e63ea453-c8bd-4128-a47e-7b0d740a6066"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.496872 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6w7d\" (UniqueName: \"kubernetes.io/projected/e63ea453-c8bd-4128-a47e-7b0d740a6066-kube-api-access-d6w7d\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.496914 4766 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-operator-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.496926 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.496945 4766 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.496958 4766 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.496969 4766 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e63ea453-c8bd-4128-a47e-7b0d740a6066-secrets\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.496980 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e63ea453-c8bd-4128-a47e-7b0d740a6066-config-data-default\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.497015 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.517526 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.598625 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: E1002 11:18:35.652644 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:35 crc kubenswrapper[4766]: E1002 11:18:35.653160 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:35 crc kubenswrapper[4766]: E1002 11:18:35.653722 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:35 crc kubenswrapper[4766]: E1002 11:18:35.653787 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8wzw9" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovsdb-server" Oct 02 11:18:35 crc kubenswrapper[4766]: E1002 11:18:35.654804 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.656162 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:35 crc kubenswrapper[4766]: E1002 11:18:35.656745 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:35 crc kubenswrapper[4766]: E1002 11:18:35.658141 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:35 crc kubenswrapper[4766]: E1002 11:18:35.658178 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8wzw9" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovs-vswitchd" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.668492 4766 generic.go:334] "Generic (PLEG): container finished" podID="874d062e-d2f8-462c-95b3-8f630b7120af" containerID="197d65982becb7d1b3560136e8ef3d13532ea4c97a2f09eb585ab09256067519" exitCode=0 Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.668565 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"874d062e-d2f8-462c-95b3-8f630b7120af","Type":"ContainerDied","Data":"197d65982becb7d1b3560136e8ef3d13532ea4c97a2f09eb585ab09256067519"} Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.668592 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"874d062e-d2f8-462c-95b3-8f630b7120af","Type":"ContainerDied","Data":"58b8de417037e2e2d79b7476ad8bb2f7a43aaaf1c7bf0d8a32b4387a620f6f1b"} Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.668610 4766 scope.go:117] "RemoveContainer" containerID="197d65982becb7d1b3560136e8ef3d13532ea4c97a2f09eb585ab09256067519" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.668721 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.671769 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_7498a37c-33a3-4a3a-9c72-64a0c533282c/ovn-northd/0.log" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.671809 4766 generic.go:334] "Generic (PLEG): container finished" podID="7498a37c-33a3-4a3a-9c72-64a0c533282c" containerID="84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca" exitCode=139 Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.671846 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7498a37c-33a3-4a3a-9c72-64a0c533282c","Type":"ContainerDied","Data":"84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca"} Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.671863 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"7498a37c-33a3-4a3a-9c72-64a0c533282c","Type":"ContainerDied","Data":"a67be1c3772782750932201c89ba7971944be78138b8c2ae7a803c36b9559b86"} Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.671916 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.685032 4766 generic.go:334] "Generic (PLEG): container finished" podID="e63ea453-c8bd-4128-a47e-7b0d740a6066" containerID="059c180c1ce2eff2818e38ff90b0da551c9f6423a5517609968f6e8b7d8825e4" exitCode=0 Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.685087 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e63ea453-c8bd-4128-a47e-7b0d740a6066","Type":"ContainerDied","Data":"059c180c1ce2eff2818e38ff90b0da551c9f6423a5517609968f6e8b7d8825e4"} Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.685108 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e63ea453-c8bd-4128-a47e-7b0d740a6066","Type":"ContainerDied","Data":"4292b0c03ec8562f4ede1d43538e612eb70c0fb04b1536d93666a80e1d999558"} Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.685165 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.688955 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell16336-account-delete-8q58s" event={"ID":"3ae83487-5f24-4934-aba5-9ee2ca6ca657","Type":"ContainerDied","Data":"15363bcff6154d63ab66c0ff475bf73a8f2f486bb90162a1c1f06e57540daa32"} Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.689135 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell16336-account-delete-8q58s" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.699422 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-plugins\") pod \"874d062e-d2f8-462c-95b3-8f630b7120af\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.699604 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-tls\") pod \"874d062e-d2f8-462c-95b3-8f630b7120af\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.699695 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"874d062e-d2f8-462c-95b3-8f630b7120af\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.699766 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnt6k\" (UniqueName: \"kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-kube-api-access-bnt6k\") pod \"874d062e-d2f8-462c-95b3-8f630b7120af\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.699834 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-config-data\") pod \"874d062e-d2f8-462c-95b3-8f630b7120af\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.699902 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "874d062e-d2f8-462c-95b3-8f630b7120af" (UID: "874d062e-d2f8-462c-95b3-8f630b7120af"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.699974 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-confd\") pod \"874d062e-d2f8-462c-95b3-8f630b7120af\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.700062 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/874d062e-d2f8-462c-95b3-8f630b7120af-pod-info\") pod \"874d062e-d2f8-462c-95b3-8f630b7120af\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.700147 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-erlang-cookie\") pod \"874d062e-d2f8-462c-95b3-8f630b7120af\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.700245 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-plugins-conf\") pod \"874d062e-d2f8-462c-95b3-8f630b7120af\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.700311 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-server-conf\") pod \"874d062e-d2f8-462c-95b3-8f630b7120af\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.700375 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/874d062e-d2f8-462c-95b3-8f630b7120af-erlang-cookie-secret\") pod \"874d062e-d2f8-462c-95b3-8f630b7120af\" (UID: \"874d062e-d2f8-462c-95b3-8f630b7120af\") " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.700744 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: E1002 11:18:35.700864 4766 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Oct 02 11:18:35 crc kubenswrapper[4766]: E1002 11:18:35.700960 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data podName:1282b506-728d-4c6f-aa9c-3d3c1f826b71 nodeName:}" failed. No retries permitted until 2025-10-02 11:18:43.700945053 +0000 UTC m=+1638.643815997 (durationBeforeRetry 8s). 
Oct 02 11:18:35 crc kubenswrapper[4766]: E1002 11:18:35.700960 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data podName:1282b506-728d-4c6f-aa9c-3d3c1f826b71 nodeName:}" failed. No retries permitted until 2025-10-02 11:18:43.700945053 +0000 UTC m=+1638.643815997 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data") pod "rabbitmq-server-0" (UID: "1282b506-728d-4c6f-aa9c-3d3c1f826b71") : configmap "rabbitmq-config-data" not found
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.703417 4766 scope.go:117] "RemoveContainer" containerID="8c379e630eaafe0740e76b9174157026bf0829370b8090bf639367d0c55aed1a"
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.708016 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "874d062e-d2f8-462c-95b3-8f630b7120af" (UID: "874d062e-d2f8-462c-95b3-8f630b7120af"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.709059 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "874d062e-d2f8-462c-95b3-8f630b7120af" (UID: "874d062e-d2f8-462c-95b3-8f630b7120af"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.709479 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "874d062e-d2f8-462c-95b3-8f630b7120af" (UID: "874d062e-d2f8-462c-95b3-8f630b7120af"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.721292 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b869942b-07a4-4a08-b312-2b09cee2abf1","Type":"ContainerDied","Data":"91dd9bbd80d43b9ac805fd99dc0edc144c30f8299ab2c8710b8b373644193427"}
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.721383 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.723529 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "874d062e-d2f8-462c-95b3-8f630b7120af" (UID: "874d062e-d2f8-462c-95b3-8f630b7120af"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.728750 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/874d062e-d2f8-462c-95b3-8f630b7120af-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "874d062e-d2f8-462c-95b3-8f630b7120af" (UID: "874d062e-d2f8-462c-95b3-8f630b7120af"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.730540 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"]
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.749695 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/874d062e-d2f8-462c-95b3-8f630b7120af-pod-info" (OuterVolumeSpecName: "pod-info") pod "874d062e-d2f8-462c-95b3-8f630b7120af" (UID: "874d062e-d2f8-462c-95b3-8f630b7120af"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.749774 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-kube-api-access-bnt6k" (OuterVolumeSpecName: "kube-api-access-bnt6k") pod "874d062e-d2f8-462c-95b3-8f630b7120af" (UID: "874d062e-d2f8-462c-95b3-8f630b7120af"). InnerVolumeSpecName "kube-api-access-bnt6k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.749975 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"]
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.755767 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell16336-account-delete-8q58s"]
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.763132 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell16336-account-delete-8q58s"]
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.764577 4766 scope.go:117] "RemoveContainer" containerID="197d65982becb7d1b3560136e8ef3d13532ea4c97a2f09eb585ab09256067519"
Oct 02 11:18:35 crc kubenswrapper[4766]: E1002 11:18:35.765070 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"197d65982becb7d1b3560136e8ef3d13532ea4c97a2f09eb585ab09256067519\": container with ID starting with 197d65982becb7d1b3560136e8ef3d13532ea4c97a2f09eb585ab09256067519 not found: ID does not exist" containerID="197d65982becb7d1b3560136e8ef3d13532ea4c97a2f09eb585ab09256067519"
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.765105 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"197d65982becb7d1b3560136e8ef3d13532ea4c97a2f09eb585ab09256067519"} err="failed to get container status \"197d65982becb7d1b3560136e8ef3d13532ea4c97a2f09eb585ab09256067519\": rpc error: code = NotFound desc = could not find container \"197d65982becb7d1b3560136e8ef3d13532ea4c97a2f09eb585ab09256067519\": container with ID starting with 197d65982becb7d1b3560136e8ef3d13532ea4c97a2f09eb585ab09256067519 not found: ID does not exist"
Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.765131 4766 scope.go:117] "RemoveContainer" containerID="8c379e630eaafe0740e76b9174157026bf0829370b8090bf639367d0c55aed1a"
Oct 02 11:18:35 crc kubenswrapper[4766]: E1002 11:18:35.765563 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c379e630eaafe0740e76b9174157026bf0829370b8090bf639367d0c55aed1a\": container with ID starting with 8c379e630eaafe0740e76b9174157026bf0829370b8090bf639367d0c55aed1a not found: ID does not exist" containerID="8c379e630eaafe0740e76b9174157026bf0829370b8090bf639367d0c55aed1a"
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c379e630eaafe0740e76b9174157026bf0829370b8090bf639367d0c55aed1a"} err="failed to get container status \"8c379e630eaafe0740e76b9174157026bf0829370b8090bf639367d0c55aed1a\": rpc error: code = NotFound desc = could not find container \"8c379e630eaafe0740e76b9174157026bf0829370b8090bf639367d0c55aed1a\": container with ID starting with 8c379e630eaafe0740e76b9174157026bf0829370b8090bf639367d0c55aed1a not found: ID does not exist" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.765631 4766 scope.go:117] "RemoveContainer" containerID="73791752d4b908acb0c2c84f95d0aaba2c3fbd09a79daaa50e9a88de0dd2d032" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.782655 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.784560 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-config-data" (OuterVolumeSpecName: "config-data") pod "874d062e-d2f8-462c-95b3-8f630b7120af" (UID: "874d062e-d2f8-462c-95b3-8f630b7120af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.788466 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.800196 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-server-conf" (OuterVolumeSpecName: "server-conf") pod "874d062e-d2f8-462c-95b3-8f630b7120af" (UID: "874d062e-d2f8-462c-95b3-8f630b7120af"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.802819 4766 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/874d062e-d2f8-462c-95b3-8f630b7120af-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.802846 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.802860 4766 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-server-conf\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.802872 4766 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.802883 4766 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/874d062e-d2f8-462c-95b3-8f630b7120af-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.802894 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.802930 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.802941 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnt6k\" (UniqueName: \"kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-kube-api-access-bnt6k\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.802952 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/874d062e-d2f8-462c-95b3-8f630b7120af-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.818485 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.833024 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "874d062e-d2f8-462c-95b3-8f630b7120af" (UID: "874d062e-d2f8-462c-95b3-8f630b7120af"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.903741 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" path="/var/lib/kubelet/pods/0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598/volumes" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.904632 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae83487-5f24-4934-aba5-9ee2ca6ca657" path="/var/lib/kubelet/pods/3ae83487-5f24-4934-aba5-9ee2ca6ca657/volumes" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.904638 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/874d062e-d2f8-462c-95b3-8f630b7120af-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.904785 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.905194 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fac2bf1-fb0f-4031-bfc8-34090cc90c8a" path="/var/lib/kubelet/pods/3fac2bf1-fb0f-4031-bfc8-34090cc90c8a/volumes" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.906375 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a8ba140-6dc8-4023-9789-7f288b85159b" path="/var/lib/kubelet/pods/5a8ba140-6dc8-4023-9789-7f288b85159b/volumes" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.907365 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7498a37c-33a3-4a3a-9c72-64a0c533282c" path="/var/lib/kubelet/pods/7498a37c-33a3-4a3a-9c72-64a0c533282c/volumes" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.908140 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" path="/var/lib/kubelet/pods/7eb84667-7ff3-441c-ab7c-ccc4fc9233ca/volumes" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.909214 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d43eab0-4595-42fc-8489-38792e0c6e19" path="/var/lib/kubelet/pods/8d43eab0-4595-42fc-8489-38792e0c6e19/volumes" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.909839 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94c8c5ed-b069-4112-ae71-d9071bc15ff2" path="/var/lib/kubelet/pods/94c8c5ed-b069-4112-ae71-d9071bc15ff2/volumes" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.910489 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbb13294-05c1-4a20-8265-5144efcd91cf" path="/var/lib/kubelet/pods/bbb13294-05c1-4a20-8265-5144efcd91cf/volumes" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.913335 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3384223-2ad0-4593-976c-54c2d3cce52e" path="/var/lib/kubelet/pods/c3384223-2ad0-4593-976c-54c2d3cce52e/volumes" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.914139 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9339929-4331-4cd9-89bc-8350ef2f55f5" path="/var/lib/kubelet/pods/d9339929-4331-4cd9-89bc-8350ef2f55f5/volumes" Oct 02 11:18:35 crc kubenswrapper[4766]: I1002 11:18:35.915798 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e63ea453-c8bd-4128-a47e-7b0d740a6066" 
path="/var/lib/kubelet/pods/e63ea453-c8bd-4128-a47e-7b0d740a6066/volumes" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.089757 4766 scope.go:117] "RemoveContainer" containerID="84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.102555 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.136406 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.233056 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.260638 4766 scope.go:117] "RemoveContainer" containerID="73791752d4b908acb0c2c84f95d0aaba2c3fbd09a79daaa50e9a88de0dd2d032" Oct 02 11:18:36 crc kubenswrapper[4766]: E1002 11:18:36.266634 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73791752d4b908acb0c2c84f95d0aaba2c3fbd09a79daaa50e9a88de0dd2d032\": container with ID starting with 73791752d4b908acb0c2c84f95d0aaba2c3fbd09a79daaa50e9a88de0dd2d032 not found: ID does not exist" containerID="73791752d4b908acb0c2c84f95d0aaba2c3fbd09a79daaa50e9a88de0dd2d032" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.266678 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73791752d4b908acb0c2c84f95d0aaba2c3fbd09a79daaa50e9a88de0dd2d032"} err="failed to get container status \"73791752d4b908acb0c2c84f95d0aaba2c3fbd09a79daaa50e9a88de0dd2d032\": rpc error: code = NotFound desc = could not find container \"73791752d4b908acb0c2c84f95d0aaba2c3fbd09a79daaa50e9a88de0dd2d032\": container with ID starting with 73791752d4b908acb0c2c84f95d0aaba2c3fbd09a79daaa50e9a88de0dd2d032 not found: ID does not exist" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.266704 4766 scope.go:117] "RemoveContainer" containerID="84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca" Oct 02 11:18:36 crc kubenswrapper[4766]: E1002 11:18:36.273645 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca\": container with ID starting with 84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca not found: ID does not exist" containerID="84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.273703 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca"} err="failed to get container status \"84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca\": rpc error: code = NotFound desc = could not find container \"84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca\": container with ID starting with 84f4fb4d72a0afd9e9f7422deb756c54744234b6ab4f9301604db2538dba62ca not found: ID does not exist" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.273742 4766 scope.go:117] "RemoveContainer" containerID="059c180c1ce2eff2818e38ff90b0da551c9f6423a5517609968f6e8b7d8825e4" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.313073 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-dbqmz\" (UniqueName: \"kubernetes.io/projected/07e3dfd6-c718-4304-9770-edbbfaca9cf4-kube-api-access-dbqmz\") pod \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.313417 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-public-tls-certs\") pod \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.313454 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-internal-tls-certs\") pod \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.313583 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-credential-keys\") pod \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.313621 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-config-data\") pod \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.313700 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-scripts\") pod \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.313736 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-combined-ca-bundle\") pod \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.313797 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-fernet-keys\") pod \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\" (UID: \"07e3dfd6-c718-4304-9770-edbbfaca9cf4\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.318922 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07e3dfd6-c718-4304-9770-edbbfaca9cf4-kube-api-access-dbqmz" (OuterVolumeSpecName: "kube-api-access-dbqmz") pod "07e3dfd6-c718-4304-9770-edbbfaca9cf4" (UID: "07e3dfd6-c718-4304-9770-edbbfaca9cf4"). InnerVolumeSpecName "kube-api-access-dbqmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.321935 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "07e3dfd6-c718-4304-9770-edbbfaca9cf4" (UID: "07e3dfd6-c718-4304-9770-edbbfaca9cf4"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.321986 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-scripts" (OuterVolumeSpecName: "scripts") pod "07e3dfd6-c718-4304-9770-edbbfaca9cf4" (UID: "07e3dfd6-c718-4304-9770-edbbfaca9cf4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.322134 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "07e3dfd6-c718-4304-9770-edbbfaca9cf4" (UID: "07e3dfd6-c718-4304-9770-edbbfaca9cf4"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.326706 4766 scope.go:117] "RemoveContainer" containerID="b4a6774b602583404b313774393432a551d673663fd177292369cd290ea65f62" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.342288 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07e3dfd6-c718-4304-9770-edbbfaca9cf4" (UID: "07e3dfd6-c718-4304-9770-edbbfaca9cf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.356881 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-config-data" (OuterVolumeSpecName: "config-data") pod "07e3dfd6-c718-4304-9770-edbbfaca9cf4" (UID: "07e3dfd6-c718-4304-9770-edbbfaca9cf4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.368729 4766 scope.go:117] "RemoveContainer" containerID="059c180c1ce2eff2818e38ff90b0da551c9f6423a5517609968f6e8b7d8825e4" Oct 02 11:18:36 crc kubenswrapper[4766]: E1002 11:18:36.369157 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"059c180c1ce2eff2818e38ff90b0da551c9f6423a5517609968f6e8b7d8825e4\": container with ID starting with 059c180c1ce2eff2818e38ff90b0da551c9f6423a5517609968f6e8b7d8825e4 not found: ID does not exist" containerID="059c180c1ce2eff2818e38ff90b0da551c9f6423a5517609968f6e8b7d8825e4" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.369207 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"059c180c1ce2eff2818e38ff90b0da551c9f6423a5517609968f6e8b7d8825e4"} err="failed to get container status \"059c180c1ce2eff2818e38ff90b0da551c9f6423a5517609968f6e8b7d8825e4\": rpc error: code = NotFound desc = could not find container \"059c180c1ce2eff2818e38ff90b0da551c9f6423a5517609968f6e8b7d8825e4\": container with ID starting with 059c180c1ce2eff2818e38ff90b0da551c9f6423a5517609968f6e8b7d8825e4 not found: ID does not exist" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.369236 4766 scope.go:117] "RemoveContainer" containerID="b4a6774b602583404b313774393432a551d673663fd177292369cd290ea65f62" Oct 02 11:18:36 crc kubenswrapper[4766]: E1002 11:18:36.369613 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4a6774b602583404b313774393432a551d673663fd177292369cd290ea65f62\": container with ID starting with b4a6774b602583404b313774393432a551d673663fd177292369cd290ea65f62 not found: ID does not exist" containerID="b4a6774b602583404b313774393432a551d673663fd177292369cd290ea65f62" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.369690 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4a6774b602583404b313774393432a551d673663fd177292369cd290ea65f62"} err="failed to get container status \"b4a6774b602583404b313774393432a551d673663fd177292369cd290ea65f62\": rpc error: code = NotFound desc = could not find container \"b4a6774b602583404b313774393432a551d673663fd177292369cd290ea65f62\": container with ID starting with b4a6774b602583404b313774393432a551d673663fd177292369cd290ea65f62 not found: ID does not exist" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.369732 4766 scope.go:117] "RemoveContainer" containerID="4a11f30bcfa9373da75b15639350e1913eb380463cb8cd414bb358e208ee9607" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.369801 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "07e3dfd6-c718-4304-9770-edbbfaca9cf4" (UID: "07e3dfd6-c718-4304-9770-edbbfaca9cf4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.376018 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "07e3dfd6-c718-4304-9770-edbbfaca9cf4" (UID: "07e3dfd6-c718-4304-9770-edbbfaca9cf4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.394544 4766 scope.go:117] "RemoveContainer" containerID="814f365245b0829017a26fc43722610ee727739807d4d801998edff378def268" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.412500 4766 scope.go:117] "RemoveContainer" containerID="1aee3ffe3cc0a7ee677e4497600a6cf21df562e9f835edca4df6d4e59c78e1d2" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.415817 4766 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.415863 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbqmz\" (UniqueName: \"kubernetes.io/projected/07e3dfd6-c718-4304-9770-edbbfaca9cf4-kube-api-access-dbqmz\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.415872 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.415880 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.415888 4766 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.415895 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.415923 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.415932 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07e3dfd6-c718-4304-9770-edbbfaca9cf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.426764 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.510355 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-67bd9fd99f-qbp28" podUID="44893df1-77c5-494c-bae0-253447abc8f4" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.152:8080/healthcheck\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.510600 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-67bd9fd99f-qbp28" podUID="44893df1-77c5-494c-bae0-253447abc8f4" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.152:8080/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.517231 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1282b506-728d-4c6f-aa9c-3d3c1f826b71-pod-info\") pod \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.517306 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.517349 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data\") pod \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.517386 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-confd\") pod \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.517409 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-erlang-cookie\") pod \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.517427 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-tls\") pod \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.517445 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xpn4\" (UniqueName: \"kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-kube-api-access-6xpn4\") pod \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.517462 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/1282b506-728d-4c6f-aa9c-3d3c1f826b71-erlang-cookie-secret\") pod \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.517483 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-server-conf\") pod \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.517516 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-plugins-conf\") pod \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.517548 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-plugins\") pod \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\" (UID: \"1282b506-728d-4c6f-aa9c-3d3c1f826b71\") " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.518060 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1282b506-728d-4c6f-aa9c-3d3c1f826b71" (UID: "1282b506-728d-4c6f-aa9c-3d3c1f826b71"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.518925 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1282b506-728d-4c6f-aa9c-3d3c1f826b71" (UID: "1282b506-728d-4c6f-aa9c-3d3c1f826b71"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.519617 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1282b506-728d-4c6f-aa9c-3d3c1f826b71" (UID: "1282b506-728d-4c6f-aa9c-3d3c1f826b71"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.520632 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1282b506-728d-4c6f-aa9c-3d3c1f826b71-pod-info" (OuterVolumeSpecName: "pod-info") pod "1282b506-728d-4c6f-aa9c-3d3c1f826b71" (UID: "1282b506-728d-4c6f-aa9c-3d3c1f826b71"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.521306 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1282b506-728d-4c6f-aa9c-3d3c1f826b71" (UID: "1282b506-728d-4c6f-aa9c-3d3c1f826b71"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.521372 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-kube-api-access-6xpn4" (OuterVolumeSpecName: "kube-api-access-6xpn4") pod "1282b506-728d-4c6f-aa9c-3d3c1f826b71" (UID: "1282b506-728d-4c6f-aa9c-3d3c1f826b71"). InnerVolumeSpecName "kube-api-access-6xpn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.521463 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1282b506-728d-4c6f-aa9c-3d3c1f826b71-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1282b506-728d-4c6f-aa9c-3d3c1f826b71" (UID: "1282b506-728d-4c6f-aa9c-3d3c1f826b71"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.522762 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "1282b506-728d-4c6f-aa9c-3d3c1f826b71" (UID: "1282b506-728d-4c6f-aa9c-3d3c1f826b71"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.539973 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data" (OuterVolumeSpecName: "config-data") pod "1282b506-728d-4c6f-aa9c-3d3c1f826b71" (UID: "1282b506-728d-4c6f-aa9c-3d3c1f826b71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.559274 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-server-conf" (OuterVolumeSpecName: "server-conf") pod "1282b506-728d-4c6f-aa9c-3d3c1f826b71" (UID: "1282b506-728d-4c6f-aa9c-3d3c1f826b71"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.596445 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1282b506-728d-4c6f-aa9c-3d3c1f826b71" (UID: "1282b506-728d-4c6f-aa9c-3d3c1f826b71"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.619106 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.619138 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.619149 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.619158 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xpn4\" (UniqueName: \"kubernetes.io/projected/1282b506-728d-4c6f-aa9c-3d3c1f826b71-kube-api-access-6xpn4\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.619166 4766 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1282b506-728d-4c6f-aa9c-3d3c1f826b71-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.619176 4766 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-server-conf\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.619184 4766 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.619192 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1282b506-728d-4c6f-aa9c-3d3c1f826b71-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.619212 4766 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1282b506-728d-4c6f-aa9c-3d3c1f826b71-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.619236 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.619244 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1282b506-728d-4c6f-aa9c-3d3c1f826b71-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.634995 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.721206 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.734352 4766 
generic.go:334] "Generic (PLEG): container finished" podID="07e3dfd6-c718-4304-9770-edbbfaca9cf4" containerID="e6c3d2e041b5f5a0a93635dedbb2e7ad90fbe97c81c3d583742c3fd4c3beb5a3" exitCode=0 Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.734412 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-576797c867-n7r4b" event={"ID":"07e3dfd6-c718-4304-9770-edbbfaca9cf4","Type":"ContainerDied","Data":"e6c3d2e041b5f5a0a93635dedbb2e7ad90fbe97c81c3d583742c3fd4c3beb5a3"} Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.734445 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-576797c867-n7r4b" event={"ID":"07e3dfd6-c718-4304-9770-edbbfaca9cf4","Type":"ContainerDied","Data":"c9baccdf2aa205a53a557014f5fef109b7e2b59c5ead4f716fc4892c5edb23cd"} Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.734463 4766 scope.go:117] "RemoveContainer" containerID="e6c3d2e041b5f5a0a93635dedbb2e7ad90fbe97c81c3d583742c3fd4c3beb5a3" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.734490 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-576797c867-n7r4b" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.748671 4766 generic.go:334] "Generic (PLEG): container finished" podID="1282b506-728d-4c6f-aa9c-3d3c1f826b71" containerID="366bc0e5d2fdf94dff0819d1232ca88a875b2fd3ae879cab99a8d75f0ceae62c" exitCode=0 Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.748740 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1282b506-728d-4c6f-aa9c-3d3c1f826b71","Type":"ContainerDied","Data":"366bc0e5d2fdf94dff0819d1232ca88a875b2fd3ae879cab99a8d75f0ceae62c"} Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.748768 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1282b506-728d-4c6f-aa9c-3d3c1f826b71","Type":"ContainerDied","Data":"2a9873b5a34829c9c836ac4d9b6cee686f7ad2f7d0121bc47a4bf389659b0dac"} Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.748833 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.764683 4766 scope.go:117] "RemoveContainer" containerID="e6c3d2e041b5f5a0a93635dedbb2e7ad90fbe97c81c3d583742c3fd4c3beb5a3" Oct 02 11:18:36 crc kubenswrapper[4766]: E1002 11:18:36.765135 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6c3d2e041b5f5a0a93635dedbb2e7ad90fbe97c81c3d583742c3fd4c3beb5a3\": container with ID starting with e6c3d2e041b5f5a0a93635dedbb2e7ad90fbe97c81c3d583742c3fd4c3beb5a3 not found: ID does not exist" containerID="e6c3d2e041b5f5a0a93635dedbb2e7ad90fbe97c81c3d583742c3fd4c3beb5a3" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.765181 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6c3d2e041b5f5a0a93635dedbb2e7ad90fbe97c81c3d583742c3fd4c3beb5a3"} err="failed to get container status \"e6c3d2e041b5f5a0a93635dedbb2e7ad90fbe97c81c3d583742c3fd4c3beb5a3\": rpc error: code = NotFound desc = could not find container \"e6c3d2e041b5f5a0a93635dedbb2e7ad90fbe97c81c3d583742c3fd4c3beb5a3\": container with ID starting with e6c3d2e041b5f5a0a93635dedbb2e7ad90fbe97c81c3d583742c3fd4c3beb5a3 not found: ID does not exist" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.765209 4766 scope.go:117] "RemoveContainer" containerID="366bc0e5d2fdf94dff0819d1232ca88a875b2fd3ae879cab99a8d75f0ceae62c" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.797421 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-576797c867-n7r4b"] Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.803779 4766 scope.go:117] "RemoveContainer" containerID="1f522cd7f555c48aa4a94cc691909b1a9f65c1ca666bbe2a28d742da033b7470" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.811414 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-576797c867-n7r4b"] Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.823328 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.829447 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.878576 4766 scope.go:117] "RemoveContainer" containerID="366bc0e5d2fdf94dff0819d1232ca88a875b2fd3ae879cab99a8d75f0ceae62c" Oct 02 11:18:36 crc kubenswrapper[4766]: E1002 11:18:36.879114 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"366bc0e5d2fdf94dff0819d1232ca88a875b2fd3ae879cab99a8d75f0ceae62c\": container with ID starting with 366bc0e5d2fdf94dff0819d1232ca88a875b2fd3ae879cab99a8d75f0ceae62c not found: ID does not exist" containerID="366bc0e5d2fdf94dff0819d1232ca88a875b2fd3ae879cab99a8d75f0ceae62c" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.879152 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"366bc0e5d2fdf94dff0819d1232ca88a875b2fd3ae879cab99a8d75f0ceae62c"} err="failed to get container status \"366bc0e5d2fdf94dff0819d1232ca88a875b2fd3ae879cab99a8d75f0ceae62c\": rpc error: code = NotFound desc = could not find container \"366bc0e5d2fdf94dff0819d1232ca88a875b2fd3ae879cab99a8d75f0ceae62c\": container with ID starting with 366bc0e5d2fdf94dff0819d1232ca88a875b2fd3ae879cab99a8d75f0ceae62c not found: ID does not exist" Oct 02 11:18:36 crc 
kubenswrapper[4766]: I1002 11:18:36.879178 4766 scope.go:117] "RemoveContainer" containerID="1f522cd7f555c48aa4a94cc691909b1a9f65c1ca666bbe2a28d742da033b7470" Oct 02 11:18:36 crc kubenswrapper[4766]: E1002 11:18:36.879736 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f522cd7f555c48aa4a94cc691909b1a9f65c1ca666bbe2a28d742da033b7470\": container with ID starting with 1f522cd7f555c48aa4a94cc691909b1a9f65c1ca666bbe2a28d742da033b7470 not found: ID does not exist" containerID="1f522cd7f555c48aa4a94cc691909b1a9f65c1ca666bbe2a28d742da033b7470" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.879769 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f522cd7f555c48aa4a94cc691909b1a9f65c1ca666bbe2a28d742da033b7470"} err="failed to get container status \"1f522cd7f555c48aa4a94cc691909b1a9f65c1ca666bbe2a28d742da033b7470\": rpc error: code = NotFound desc = could not find container \"1f522cd7f555c48aa4a94cc691909b1a9f65c1ca666bbe2a28d742da033b7470\": container with ID starting with 1f522cd7f555c48aa4a94cc691909b1a9f65c1ca666bbe2a28d742da033b7470 not found: ID does not exist" Oct 02 11:18:36 crc kubenswrapper[4766]: I1002 11:18:36.881438 4766 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/nova-cell1-conductor-0" secret="" err="secret \"nova-nova-dockercfg-44h9r\" not found" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.395806 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.431474 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-config-data\") pod \"7b5ac374-df46-4a36-947d-de07af25426c\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.431586 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-scripts\") pod \"7b5ac374-df46-4a36-947d-de07af25426c\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.431621 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b5ac374-df46-4a36-947d-de07af25426c-log-httpd\") pod \"7b5ac374-df46-4a36-947d-de07af25426c\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.431709 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfzxs\" (UniqueName: \"kubernetes.io/projected/7b5ac374-df46-4a36-947d-de07af25426c-kube-api-access-vfzxs\") pod \"7b5ac374-df46-4a36-947d-de07af25426c\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.431772 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b5ac374-df46-4a36-947d-de07af25426c-run-httpd\") pod \"7b5ac374-df46-4a36-947d-de07af25426c\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.431823 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-sg-core-conf-yaml\") pod \"7b5ac374-df46-4a36-947d-de07af25426c\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.431845 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-combined-ca-bundle\") pod \"7b5ac374-df46-4a36-947d-de07af25426c\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.431879 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-ceilometer-tls-certs\") pod \"7b5ac374-df46-4a36-947d-de07af25426c\" (UID: \"7b5ac374-df46-4a36-947d-de07af25426c\") " Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.433257 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b5ac374-df46-4a36-947d-de07af25426c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7b5ac374-df46-4a36-947d-de07af25426c" (UID: "7b5ac374-df46-4a36-947d-de07af25426c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.435122 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b5ac374-df46-4a36-947d-de07af25426c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7b5ac374-df46-4a36-947d-de07af25426c" (UID: "7b5ac374-df46-4a36-947d-de07af25426c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.436967 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-scripts" (OuterVolumeSpecName: "scripts") pod "7b5ac374-df46-4a36-947d-de07af25426c" (UID: "7b5ac374-df46-4a36-947d-de07af25426c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.445492 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b5ac374-df46-4a36-947d-de07af25426c-kube-api-access-vfzxs" (OuterVolumeSpecName: "kube-api-access-vfzxs") pod "7b5ac374-df46-4a36-947d-de07af25426c" (UID: "7b5ac374-df46-4a36-947d-de07af25426c"). InnerVolumeSpecName "kube-api-access-vfzxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.476578 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7b5ac374-df46-4a36-947d-de07af25426c" (UID: "7b5ac374-df46-4a36-947d-de07af25426c"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.480659 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7b5ac374-df46-4a36-947d-de07af25426c" (UID: "7b5ac374-df46-4a36-947d-de07af25426c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.492380 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b5ac374-df46-4a36-947d-de07af25426c" (UID: "7b5ac374-df46-4a36-947d-de07af25426c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.508140 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-config-data" (OuterVolumeSpecName: "config-data") pod "7b5ac374-df46-4a36-947d-de07af25426c" (UID: "7b5ac374-df46-4a36-947d-de07af25426c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.541321 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b5ac374-df46-4a36-947d-de07af25426c-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.541355 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.541367 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.541375 4766 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.541383 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.541391 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b5ac374-df46-4a36-947d-de07af25426c-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.541400 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7b5ac374-df46-4a36-947d-de07af25426c-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.541409 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfzxs\" (UniqueName: \"kubernetes.io/projected/7b5ac374-df46-4a36-947d-de07af25426c-kube-api-access-vfzxs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.564868 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.643118 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dec5495-e66b-4e5e-90b6-82ee673ab269-config-data\") pod \"7dec5495-e66b-4e5e-90b6-82ee673ab269\" (UID: \"7dec5495-e66b-4e5e-90b6-82ee673ab269\") " Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.643161 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g84r8\" (UniqueName: \"kubernetes.io/projected/7dec5495-e66b-4e5e-90b6-82ee673ab269-kube-api-access-g84r8\") pod \"7dec5495-e66b-4e5e-90b6-82ee673ab269\" (UID: \"7dec5495-e66b-4e5e-90b6-82ee673ab269\") " Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.643179 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dec5495-e66b-4e5e-90b6-82ee673ab269-combined-ca-bundle\") pod \"7dec5495-e66b-4e5e-90b6-82ee673ab269\" (UID: \"7dec5495-e66b-4e5e-90b6-82ee673ab269\") " Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.645614 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dec5495-e66b-4e5e-90b6-82ee673ab269-kube-api-access-g84r8" (OuterVolumeSpecName: "kube-api-access-g84r8") pod "7dec5495-e66b-4e5e-90b6-82ee673ab269" (UID: "7dec5495-e66b-4e5e-90b6-82ee673ab269"). InnerVolumeSpecName "kube-api-access-g84r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.660554 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dec5495-e66b-4e5e-90b6-82ee673ab269-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dec5495-e66b-4e5e-90b6-82ee673ab269" (UID: "7dec5495-e66b-4e5e-90b6-82ee673ab269"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.662761 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dec5495-e66b-4e5e-90b6-82ee673ab269-config-data" (OuterVolumeSpecName: "config-data") pod "7dec5495-e66b-4e5e-90b6-82ee673ab269" (UID: "7dec5495-e66b-4e5e-90b6-82ee673ab269"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.744344 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dec5495-e66b-4e5e-90b6-82ee673ab269-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.744376 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g84r8\" (UniqueName: \"kubernetes.io/projected/7dec5495-e66b-4e5e-90b6-82ee673ab269-kube-api-access-g84r8\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.744388 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dec5495-e66b-4e5e-90b6-82ee673ab269-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.769913 4766 generic.go:334] "Generic (PLEG): container finished" podID="7b5ac374-df46-4a36-947d-de07af25426c" containerID="6384876206ceb62e698ffec3b4534e9b368119d727684623f371cd380375cf5c" exitCode=0 Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.769944 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b5ac374-df46-4a36-947d-de07af25426c","Type":"ContainerDied","Data":"6384876206ceb62e698ffec3b4534e9b368119d727684623f371cd380375cf5c"} Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.769980 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.769998 4766 scope.go:117] "RemoveContainer" containerID="56bd28360a526e730981b495f3d32920b04b2ee9b375daef8362b798d8067eaf" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.769986 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7b5ac374-df46-4a36-947d-de07af25426c","Type":"ContainerDied","Data":"9528ae3cc414cdaf93226f68c3078cc8943cd1ba76abcd03af19c848321b06b5"} Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.773322 4766 generic.go:334] "Generic (PLEG): container finished" podID="7dec5495-e66b-4e5e-90b6-82ee673ab269" containerID="07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936" exitCode=0 Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.773366 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.773387 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7dec5495-e66b-4e5e-90b6-82ee673ab269","Type":"ContainerDied","Data":"07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936"} Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.773420 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7dec5495-e66b-4e5e-90b6-82ee673ab269","Type":"ContainerDied","Data":"cecd0986302cf9ff4c82dc99a37140782713856423ffb4e9875e68db83729200"} Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.795920 4766 scope.go:117] "RemoveContainer" containerID="76a65e0801ebae9343bc82e73244baaf4bf39324be0c967cb15033840367ed33" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.804344 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.810920 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.823339 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.828122 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.865263 4766 scope.go:117] "RemoveContainer" containerID="6384876206ceb62e698ffec3b4534e9b368119d727684623f371cd380375cf5c" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.886044 4766 scope.go:117] "RemoveContainer" containerID="e7e7c6bcc90173226bb27fb9e8539546111c6ddeb92e1200c4fa61b8a055e508" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.892933 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07e3dfd6-c718-4304-9770-edbbfaca9cf4" path="/var/lib/kubelet/pods/07e3dfd6-c718-4304-9770-edbbfaca9cf4/volumes" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.893683 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1282b506-728d-4c6f-aa9c-3d3c1f826b71" path="/var/lib/kubelet/pods/1282b506-728d-4c6f-aa9c-3d3c1f826b71/volumes" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.894567 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b5ac374-df46-4a36-947d-de07af25426c" path="/var/lib/kubelet/pods/7b5ac374-df46-4a36-947d-de07af25426c/volumes" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.895811 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dec5495-e66b-4e5e-90b6-82ee673ab269" path="/var/lib/kubelet/pods/7dec5495-e66b-4e5e-90b6-82ee673ab269/volumes" Oct 02 11:18:37 crc kubenswrapper[4766]: I1002 11:18:37.896546 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="874d062e-d2f8-462c-95b3-8f630b7120af" path="/var/lib/kubelet/pods/874d062e-d2f8-462c-95b3-8f630b7120af/volumes" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.011173 4766 scope.go:117] "RemoveContainer" containerID="56bd28360a526e730981b495f3d32920b04b2ee9b375daef8362b798d8067eaf" Oct 02 11:18:38 crc kubenswrapper[4766]: E1002 11:18:38.016982 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56bd28360a526e730981b495f3d32920b04b2ee9b375daef8362b798d8067eaf\": container with ID starting with 
56bd28360a526e730981b495f3d32920b04b2ee9b375daef8362b798d8067eaf not found: ID does not exist" containerID="56bd28360a526e730981b495f3d32920b04b2ee9b375daef8362b798d8067eaf" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.017053 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56bd28360a526e730981b495f3d32920b04b2ee9b375daef8362b798d8067eaf"} err="failed to get container status \"56bd28360a526e730981b495f3d32920b04b2ee9b375daef8362b798d8067eaf\": rpc error: code = NotFound desc = could not find container \"56bd28360a526e730981b495f3d32920b04b2ee9b375daef8362b798d8067eaf\": container with ID starting with 56bd28360a526e730981b495f3d32920b04b2ee9b375daef8362b798d8067eaf not found: ID does not exist" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.017096 4766 scope.go:117] "RemoveContainer" containerID="76a65e0801ebae9343bc82e73244baaf4bf39324be0c967cb15033840367ed33" Oct 02 11:18:38 crc kubenswrapper[4766]: E1002 11:18:38.017496 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76a65e0801ebae9343bc82e73244baaf4bf39324be0c967cb15033840367ed33\": container with ID starting with 76a65e0801ebae9343bc82e73244baaf4bf39324be0c967cb15033840367ed33 not found: ID does not exist" containerID="76a65e0801ebae9343bc82e73244baaf4bf39324be0c967cb15033840367ed33" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.017587 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76a65e0801ebae9343bc82e73244baaf4bf39324be0c967cb15033840367ed33"} err="failed to get container status \"76a65e0801ebae9343bc82e73244baaf4bf39324be0c967cb15033840367ed33\": rpc error: code = NotFound desc = could not find container \"76a65e0801ebae9343bc82e73244baaf4bf39324be0c967cb15033840367ed33\": container with ID starting with 76a65e0801ebae9343bc82e73244baaf4bf39324be0c967cb15033840367ed33 not found: ID does not exist" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.017610 4766 scope.go:117] "RemoveContainer" containerID="6384876206ceb62e698ffec3b4534e9b368119d727684623f371cd380375cf5c" Oct 02 11:18:38 crc kubenswrapper[4766]: E1002 11:18:38.018030 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6384876206ceb62e698ffec3b4534e9b368119d727684623f371cd380375cf5c\": container with ID starting with 6384876206ceb62e698ffec3b4534e9b368119d727684623f371cd380375cf5c not found: ID does not exist" containerID="6384876206ceb62e698ffec3b4534e9b368119d727684623f371cd380375cf5c" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.018079 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6384876206ceb62e698ffec3b4534e9b368119d727684623f371cd380375cf5c"} err="failed to get container status \"6384876206ceb62e698ffec3b4534e9b368119d727684623f371cd380375cf5c\": rpc error: code = NotFound desc = could not find container \"6384876206ceb62e698ffec3b4534e9b368119d727684623f371cd380375cf5c\": container with ID starting with 6384876206ceb62e698ffec3b4534e9b368119d727684623f371cd380375cf5c not found: ID does not exist" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.018112 4766 scope.go:117] "RemoveContainer" containerID="e7e7c6bcc90173226bb27fb9e8539546111c6ddeb92e1200c4fa61b8a055e508" Oct 02 11:18:38 crc kubenswrapper[4766]: E1002 11:18:38.018648 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"e7e7c6bcc90173226bb27fb9e8539546111c6ddeb92e1200c4fa61b8a055e508\": container with ID starting with e7e7c6bcc90173226bb27fb9e8539546111c6ddeb92e1200c4fa61b8a055e508 not found: ID does not exist" containerID="e7e7c6bcc90173226bb27fb9e8539546111c6ddeb92e1200c4fa61b8a055e508" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.018690 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7e7c6bcc90173226bb27fb9e8539546111c6ddeb92e1200c4fa61b8a055e508"} err="failed to get container status \"e7e7c6bcc90173226bb27fb9e8539546111c6ddeb92e1200c4fa61b8a055e508\": rpc error: code = NotFound desc = could not find container \"e7e7c6bcc90173226bb27fb9e8539546111c6ddeb92e1200c4fa61b8a055e508\": container with ID starting with e7e7c6bcc90173226bb27fb9e8539546111c6ddeb92e1200c4fa61b8a055e508 not found: ID does not exist" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.018713 4766 scope.go:117] "RemoveContainer" containerID="07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.043654 4766 scope.go:117] "RemoveContainer" containerID="07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936" Oct 02 11:18:38 crc kubenswrapper[4766]: E1002 11:18:38.044234 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936\": container with ID starting with 07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936 not found: ID does not exist" containerID="07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.044289 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936"} err="failed to get container status \"07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936\": rpc error: code = NotFound desc = could not find container \"07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936\": container with ID starting with 07fa9133665aa759e550eab4256a6985a67260b720e51ca4a4047c0ccc2e9936 not found: ID does not exist" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.302343 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.355113 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-httpd-config\") pod \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.355156 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-public-tls-certs\") pod \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.355192 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-combined-ca-bundle\") pod \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.356256 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-internal-tls-certs\") pod \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.356285 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9svq9\" (UniqueName: \"kubernetes.io/projected/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-kube-api-access-9svq9\") pod \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.356644 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-config\") pod \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.356675 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-ovndb-tls-certs\") pod \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\" (UID: \"ec8cdac7-81c9-41e7-a956-41d13e5b91a6\") " Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.361685 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ec8cdac7-81c9-41e7-a956-41d13e5b91a6" (UID: "ec8cdac7-81c9-41e7-a956-41d13e5b91a6"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.361732 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-kube-api-access-9svq9" (OuterVolumeSpecName: "kube-api-access-9svq9") pod "ec8cdac7-81c9-41e7-a956-41d13e5b91a6" (UID: "ec8cdac7-81c9-41e7-a956-41d13e5b91a6"). InnerVolumeSpecName "kube-api-access-9svq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.389921 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec8cdac7-81c9-41e7-a956-41d13e5b91a6" (UID: "ec8cdac7-81c9-41e7-a956-41d13e5b91a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.391481 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ec8cdac7-81c9-41e7-a956-41d13e5b91a6" (UID: "ec8cdac7-81c9-41e7-a956-41d13e5b91a6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.391823 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ec8cdac7-81c9-41e7-a956-41d13e5b91a6" (UID: "ec8cdac7-81c9-41e7-a956-41d13e5b91a6"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.392730 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-config" (OuterVolumeSpecName: "config") pod "ec8cdac7-81c9-41e7-a956-41d13e5b91a6" (UID: "ec8cdac7-81c9-41e7-a956-41d13e5b91a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.405993 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ec8cdac7-81c9-41e7-a956-41d13e5b91a6" (UID: "ec8cdac7-81c9-41e7-a956-41d13e5b91a6"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.469446 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.469711 4766 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.469786 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.469855 4766 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.469922 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9svq9\" (UniqueName: \"kubernetes.io/projected/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-kube-api-access-9svq9\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.470019 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.470093 4766 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ec8cdac7-81c9-41e7-a956-41d13e5b91a6-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.791006 4766 generic.go:334] "Generic (PLEG): container finished" podID="ec8cdac7-81c9-41e7-a956-41d13e5b91a6" containerID="b88eb59f2ed5fce10e26153dd82767f0f8e62d65f3120c04ecdf968f4b718053" exitCode=0 Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.791069 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55f8b9d7c-hfdcr" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.791072 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55f8b9d7c-hfdcr" event={"ID":"ec8cdac7-81c9-41e7-a956-41d13e5b91a6","Type":"ContainerDied","Data":"b88eb59f2ed5fce10e26153dd82767f0f8e62d65f3120c04ecdf968f4b718053"} Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.791121 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55f8b9d7c-hfdcr" event={"ID":"ec8cdac7-81c9-41e7-a956-41d13e5b91a6","Type":"ContainerDied","Data":"f9cf68c62ab67bd2c2561b1c4ed33ee8e77490994b3ccdb82b73650c412cafd1"} Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.791146 4766 scope.go:117] "RemoveContainer" containerID="f5124828eb5c9f7610af6224039bb25da1c36b7f880181f818d2c8a8eefcf481" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.827147 4766 scope.go:117] "RemoveContainer" containerID="b88eb59f2ed5fce10e26153dd82767f0f8e62d65f3120c04ecdf968f4b718053" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.864992 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-55f8b9d7c-hfdcr"] Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.870768 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-55f8b9d7c-hfdcr"] Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.883275 4766 scope.go:117] "RemoveContainer" containerID="f5124828eb5c9f7610af6224039bb25da1c36b7f880181f818d2c8a8eefcf481" Oct 02 11:18:38 crc kubenswrapper[4766]: E1002 11:18:38.883888 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5124828eb5c9f7610af6224039bb25da1c36b7f880181f818d2c8a8eefcf481\": container with ID starting with f5124828eb5c9f7610af6224039bb25da1c36b7f880181f818d2c8a8eefcf481 not found: ID does not exist" containerID="f5124828eb5c9f7610af6224039bb25da1c36b7f880181f818d2c8a8eefcf481" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.883932 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5124828eb5c9f7610af6224039bb25da1c36b7f880181f818d2c8a8eefcf481"} err="failed to get container status \"f5124828eb5c9f7610af6224039bb25da1c36b7f880181f818d2c8a8eefcf481\": rpc error: code = NotFound desc = could not find container \"f5124828eb5c9f7610af6224039bb25da1c36b7f880181f818d2c8a8eefcf481\": container with ID starting with f5124828eb5c9f7610af6224039bb25da1c36b7f880181f818d2c8a8eefcf481 not found: ID does not exist" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.883954 4766 scope.go:117] "RemoveContainer" containerID="b88eb59f2ed5fce10e26153dd82767f0f8e62d65f3120c04ecdf968f4b718053" Oct 02 11:18:38 crc kubenswrapper[4766]: E1002 11:18:38.884434 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b88eb59f2ed5fce10e26153dd82767f0f8e62d65f3120c04ecdf968f4b718053\": container with ID starting with b88eb59f2ed5fce10e26153dd82767f0f8e62d65f3120c04ecdf968f4b718053 not found: ID does not exist" containerID="b88eb59f2ed5fce10e26153dd82767f0f8e62d65f3120c04ecdf968f4b718053" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.884451 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b88eb59f2ed5fce10e26153dd82767f0f8e62d65f3120c04ecdf968f4b718053"} err="failed to get container status 
\"b88eb59f2ed5fce10e26153dd82767f0f8e62d65f3120c04ecdf968f4b718053\": rpc error: code = NotFound desc = could not find container \"b88eb59f2ed5fce10e26153dd82767f0f8e62d65f3120c04ecdf968f4b718053\": container with ID starting with b88eb59f2ed5fce10e26153dd82767f0f8e62d65f3120c04ecdf968f4b718053 not found: ID does not exist" Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.973340 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.973779 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="1b5a88cf-8095-4025-a68a-349c579dddd3" containerName="memcached" containerID="cri-o://909147f23d289681b51aaa92222ea0360b2be9611eda4060826e6759c819c03b" gracePeriod=30 Oct 02 11:18:38 crc kubenswrapper[4766]: I1002 11:18:38.992265 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b4dsb"] Oct 02 11:18:39 crc kubenswrapper[4766]: I1002 11:18:39.000854 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:18:39 crc kubenswrapper[4766]: I1002 11:18:39.001068 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="0064fd48-390f-4a0f-abfe-9922c8c431f9" containerName="nova-cell1-conductor-conductor" containerID="cri-o://95b502421d0b283b99e2e399ae912e61831216f1aad7cf4549c10685585bea25" gracePeriod=30 Oct 02 11:18:39 crc kubenswrapper[4766]: I1002 11:18:39.006116 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-b4dsb"] Oct 02 11:18:39 crc kubenswrapper[4766]: I1002 11:18:39.010796 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4gmwl"] Oct 02 11:18:39 crc kubenswrapper[4766]: I1002 11:18:39.014654 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:18:39 crc kubenswrapper[4766]: I1002 11:18:39.015162 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="db4400a2-c286-467e-b62d-a5cb3042aa88" containerName="nova-cell0-conductor-conductor" containerID="cri-o://43a906e7afa204c5a711bfc8ccb6e78006f4ffebb41d2ad46a51e21a16f179ce" gracePeriod=30 Oct 02 11:18:39 crc kubenswrapper[4766]: I1002 11:18:39.018666 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-4gmwl"] Oct 02 11:18:39 crc kubenswrapper[4766]: E1002 11:18:39.265816 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="95b502421d0b283b99e2e399ae912e61831216f1aad7cf4549c10685585bea25" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 11:18:39 crc kubenswrapper[4766]: E1002 11:18:39.267452 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="95b502421d0b283b99e2e399ae912e61831216f1aad7cf4549c10685585bea25" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 11:18:39 crc kubenswrapper[4766]: E1002 11:18:39.268596 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="95b502421d0b283b99e2e399ae912e61831216f1aad7cf4549c10685585bea25" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 11:18:39 crc kubenswrapper[4766]: E1002 11:18:39.268755 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="0064fd48-390f-4a0f-abfe-9922c8c431f9" containerName="nova-cell1-conductor-conductor" Oct 02 11:18:39 crc kubenswrapper[4766]: I1002 11:18:39.891321 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c803d467-a739-40aa-9dc9-4f04e6e14923" path="/var/lib/kubelet/pods/c803d467-a739-40aa-9dc9-4f04e6e14923/volumes" Oct 02 11:18:39 crc kubenswrapper[4766]: I1002 11:18:39.892100 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc899043-5f53-453c-bc00-0cc214647667" path="/var/lib/kubelet/pods/dc899043-5f53-453c-bc00-0cc214647667/volumes" Oct 02 11:18:39 crc kubenswrapper[4766]: I1002 11:18:39.892607 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec8cdac7-81c9-41e7-a956-41d13e5b91a6" path="/var/lib/kubelet/pods/ec8cdac7-81c9-41e7-a956-41d13e5b91a6/volumes" Oct 02 11:18:40 crc kubenswrapper[4766]: I1002 11:18:40.654340 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/memcached-0" podUID="1b5a88cf-8095-4025-a68a-349c579dddd3" containerName="memcached" probeResult="failure" output="dial tcp 10.217.0.102:11211: connect: connection refused" Oct 02 11:18:40 crc kubenswrapper[4766]: E1002 11:18:40.654409 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:40 crc kubenswrapper[4766]: E1002 11:18:40.654890 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:40 crc kubenswrapper[4766]: E1002 11:18:40.654956 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:40 crc kubenswrapper[4766]: E1002 11:18:40.655152 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:40 crc kubenswrapper[4766]: E1002 11:18:40.655171 4766 prober.go:104] "Probe errored" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8wzw9" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovsdb-server" Oct 02 11:18:40 crc kubenswrapper[4766]: E1002 11:18:40.657140 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:40 crc kubenswrapper[4766]: E1002 11:18:40.658554 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:40 crc kubenswrapper[4766]: E1002 11:18:40.658593 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8wzw9" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovs-vswitchd" Oct 02 11:18:40 crc kubenswrapper[4766]: E1002 11:18:40.706770 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b5a88cf_8095_4025_a68a_349c579dddd3.slice/crio-conmon-909147f23d289681b51aaa92222ea0360b2be9611eda4060826e6759c819c03b.scope\": RecentStats: unable to find data in memory cache]" Oct 02 11:18:40 crc kubenswrapper[4766]: I1002 11:18:40.830060 4766 generic.go:334] "Generic (PLEG): container finished" podID="db4400a2-c286-467e-b62d-a5cb3042aa88" containerID="43a906e7afa204c5a711bfc8ccb6e78006f4ffebb41d2ad46a51e21a16f179ce" exitCode=0 Oct 02 11:18:40 crc kubenswrapper[4766]: I1002 11:18:40.830423 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"db4400a2-c286-467e-b62d-a5cb3042aa88","Type":"ContainerDied","Data":"43a906e7afa204c5a711bfc8ccb6e78006f4ffebb41d2ad46a51e21a16f179ce"} Oct 02 11:18:40 crc kubenswrapper[4766]: I1002 11:18:40.831837 4766 generic.go:334] "Generic (PLEG): container finished" podID="1b5a88cf-8095-4025-a68a-349c579dddd3" containerID="909147f23d289681b51aaa92222ea0360b2be9611eda4060826e6759c819c03b" exitCode=0 Oct 02 11:18:40 crc kubenswrapper[4766]: I1002 11:18:40.831864 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1b5a88cf-8095-4025-a68a-349c579dddd3","Type":"ContainerDied","Data":"909147f23d289681b51aaa92222ea0360b2be9611eda4060826e6759c819c03b"} Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.081799 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.089201 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.103689 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dprhj\" (UniqueName: \"kubernetes.io/projected/1b5a88cf-8095-4025-a68a-349c579dddd3-kube-api-access-dprhj\") pod \"1b5a88cf-8095-4025-a68a-349c579dddd3\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.103861 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db4400a2-c286-467e-b62d-a5cb3042aa88-config-data\") pod \"db4400a2-c286-467e-b62d-a5cb3042aa88\" (UID: \"db4400a2-c286-467e-b62d-a5cb3042aa88\") " Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.103934 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1b5a88cf-8095-4025-a68a-349c579dddd3-kolla-config\") pod \"1b5a88cf-8095-4025-a68a-349c579dddd3\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.105076 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5a88cf-8095-4025-a68a-349c579dddd3-combined-ca-bundle\") pod \"1b5a88cf-8095-4025-a68a-349c579dddd3\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.105117 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4400a2-c286-467e-b62d-a5cb3042aa88-combined-ca-bundle\") pod \"db4400a2-c286-467e-b62d-a5cb3042aa88\" (UID: \"db4400a2-c286-467e-b62d-a5cb3042aa88\") " Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.105154 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x46b\" (UniqueName: \"kubernetes.io/projected/db4400a2-c286-467e-b62d-a5cb3042aa88-kube-api-access-8x46b\") pod \"db4400a2-c286-467e-b62d-a5cb3042aa88\" (UID: \"db4400a2-c286-467e-b62d-a5cb3042aa88\") " Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.105211 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5a88cf-8095-4025-a68a-349c579dddd3-memcached-tls-certs\") pod \"1b5a88cf-8095-4025-a68a-349c579dddd3\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.105248 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b5a88cf-8095-4025-a68a-349c579dddd3-config-data\") pod \"1b5a88cf-8095-4025-a68a-349c579dddd3\" (UID: \"1b5a88cf-8095-4025-a68a-349c579dddd3\") " Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.107279 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5a88cf-8095-4025-a68a-349c579dddd3-config-data" (OuterVolumeSpecName: "config-data") pod "1b5a88cf-8095-4025-a68a-349c579dddd3" (UID: "1b5a88cf-8095-4025-a68a-349c579dddd3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.109589 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5a88cf-8095-4025-a68a-349c579dddd3-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "1b5a88cf-8095-4025-a68a-349c579dddd3" (UID: "1b5a88cf-8095-4025-a68a-349c579dddd3"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.113605 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5a88cf-8095-4025-a68a-349c579dddd3-kube-api-access-dprhj" (OuterVolumeSpecName: "kube-api-access-dprhj") pod "1b5a88cf-8095-4025-a68a-349c579dddd3" (UID: "1b5a88cf-8095-4025-a68a-349c579dddd3"). InnerVolumeSpecName "kube-api-access-dprhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.125835 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db4400a2-c286-467e-b62d-a5cb3042aa88-kube-api-access-8x46b" (OuterVolumeSpecName: "kube-api-access-8x46b") pod "db4400a2-c286-467e-b62d-a5cb3042aa88" (UID: "db4400a2-c286-467e-b62d-a5cb3042aa88"). InnerVolumeSpecName "kube-api-access-8x46b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.131815 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5a88cf-8095-4025-a68a-349c579dddd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b5a88cf-8095-4025-a68a-349c579dddd3" (UID: "1b5a88cf-8095-4025-a68a-349c579dddd3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.133363 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4400a2-c286-467e-b62d-a5cb3042aa88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db4400a2-c286-467e-b62d-a5cb3042aa88" (UID: "db4400a2-c286-467e-b62d-a5cb3042aa88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.139652 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db4400a2-c286-467e-b62d-a5cb3042aa88-config-data" (OuterVolumeSpecName: "config-data") pod "db4400a2-c286-467e-b62d-a5cb3042aa88" (UID: "db4400a2-c286-467e-b62d-a5cb3042aa88"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.153229 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5a88cf-8095-4025-a68a-349c579dddd3-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "1b5a88cf-8095-4025-a68a-349c579dddd3" (UID: "1b5a88cf-8095-4025-a68a-349c579dddd3"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.207229 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db4400a2-c286-467e-b62d-a5cb3042aa88-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.207264 4766 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1b5a88cf-8095-4025-a68a-349c579dddd3-kolla-config\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.207276 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db4400a2-c286-467e-b62d-a5cb3042aa88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.207287 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5a88cf-8095-4025-a68a-349c579dddd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.207297 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x46b\" (UniqueName: \"kubernetes.io/projected/db4400a2-c286-467e-b62d-a5cb3042aa88-kube-api-access-8x46b\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.207305 4766 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5a88cf-8095-4025-a68a-349c579dddd3-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.207313 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b5a88cf-8095-4025-a68a-349c579dddd3-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.207322 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dprhj\" (UniqueName: \"kubernetes.io/projected/1b5a88cf-8095-4025-a68a-349c579dddd3-kube-api-access-dprhj\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.841825 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.841846 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"db4400a2-c286-467e-b62d-a5cb3042aa88","Type":"ContainerDied","Data":"c1ccaeb4d1397ace6d1b78c16ae286d96f01561a75118cf137042a01f30abc40"} Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.842217 4766 scope.go:117] "RemoveContainer" containerID="43a906e7afa204c5a711bfc8ccb6e78006f4ffebb41d2ad46a51e21a16f179ce" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.843460 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1b5a88cf-8095-4025-a68a-349c579dddd3","Type":"ContainerDied","Data":"d7dca218c778173488c590afc7dcdc9ecc806cb40402063798eafd9ab7fd2afa"} Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.843511 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.874334 4766 scope.go:117] "RemoveContainer" containerID="909147f23d289681b51aaa92222ea0360b2be9611eda4060826e6759c819c03b" Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.895228 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.895268 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.899011 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:18:41 crc kubenswrapper[4766]: I1002 11:18:41.905344 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 11:18:43 crc kubenswrapper[4766]: I1002 11:18:43.870214 4766 generic.go:334] "Generic (PLEG): container finished" podID="0064fd48-390f-4a0f-abfe-9922c8c431f9" containerID="95b502421d0b283b99e2e399ae912e61831216f1aad7cf4549c10685585bea25" exitCode=0 Oct 02 11:18:43 crc kubenswrapper[4766]: I1002 11:18:43.870334 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0064fd48-390f-4a0f-abfe-9922c8c431f9","Type":"ContainerDied","Data":"95b502421d0b283b99e2e399ae912e61831216f1aad7cf4549c10685585bea25"} Oct 02 11:18:43 crc kubenswrapper[4766]: I1002 11:18:43.882145 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:18:43 crc kubenswrapper[4766]: E1002 11:18:43.882429 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:18:43 crc kubenswrapper[4766]: I1002 11:18:43.898565 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5a88cf-8095-4025-a68a-349c579dddd3" path="/var/lib/kubelet/pods/1b5a88cf-8095-4025-a68a-349c579dddd3/volumes" Oct 02 11:18:43 crc kubenswrapper[4766]: I1002 11:18:43.899739 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db4400a2-c286-467e-b62d-a5cb3042aa88" path="/var/lib/kubelet/pods/db4400a2-c286-467e-b62d-a5cb3042aa88/volumes" Oct 02 11:18:44 crc kubenswrapper[4766]: I1002 11:18:44.125301 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 11:18:44 crc kubenswrapper[4766]: I1002 11:18:44.150295 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0064fd48-390f-4a0f-abfe-9922c8c431f9-combined-ca-bundle\") pod \"0064fd48-390f-4a0f-abfe-9922c8c431f9\" (UID: \"0064fd48-390f-4a0f-abfe-9922c8c431f9\") " Oct 02 11:18:44 crc kubenswrapper[4766]: I1002 11:18:44.150368 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0064fd48-390f-4a0f-abfe-9922c8c431f9-config-data\") pod \"0064fd48-390f-4a0f-abfe-9922c8c431f9\" (UID: \"0064fd48-390f-4a0f-abfe-9922c8c431f9\") " Oct 02 11:18:44 crc kubenswrapper[4766]: I1002 11:18:44.150545 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxxf7\" (UniqueName: \"kubernetes.io/projected/0064fd48-390f-4a0f-abfe-9922c8c431f9-kube-api-access-lxxf7\") pod \"0064fd48-390f-4a0f-abfe-9922c8c431f9\" (UID: \"0064fd48-390f-4a0f-abfe-9922c8c431f9\") " Oct 02 11:18:44 crc kubenswrapper[4766]: I1002 11:18:44.159213 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0064fd48-390f-4a0f-abfe-9922c8c431f9-kube-api-access-lxxf7" (OuterVolumeSpecName: "kube-api-access-lxxf7") pod "0064fd48-390f-4a0f-abfe-9922c8c431f9" (UID: "0064fd48-390f-4a0f-abfe-9922c8c431f9"). InnerVolumeSpecName "kube-api-access-lxxf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:44 crc kubenswrapper[4766]: I1002 11:18:44.173235 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0064fd48-390f-4a0f-abfe-9922c8c431f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0064fd48-390f-4a0f-abfe-9922c8c431f9" (UID: "0064fd48-390f-4a0f-abfe-9922c8c431f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:44 crc kubenswrapper[4766]: I1002 11:18:44.173452 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0064fd48-390f-4a0f-abfe-9922c8c431f9-config-data" (OuterVolumeSpecName: "config-data") pod "0064fd48-390f-4a0f-abfe-9922c8c431f9" (UID: "0064fd48-390f-4a0f-abfe-9922c8c431f9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:44 crc kubenswrapper[4766]: I1002 11:18:44.253418 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0064fd48-390f-4a0f-abfe-9922c8c431f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:44 crc kubenswrapper[4766]: I1002 11:18:44.253475 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0064fd48-390f-4a0f-abfe-9922c8c431f9-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:44 crc kubenswrapper[4766]: I1002 11:18:44.253488 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxxf7\" (UniqueName: \"kubernetes.io/projected/0064fd48-390f-4a0f-abfe-9922c8c431f9-kube-api-access-lxxf7\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:44 crc kubenswrapper[4766]: I1002 11:18:44.882426 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0064fd48-390f-4a0f-abfe-9922c8c431f9","Type":"ContainerDied","Data":"ba6115d8ae5ab6c8b285b8b221fe8db9027b76e8b2acca29732c54f957a5152b"} Oct 02 11:18:44 crc kubenswrapper[4766]: I1002 11:18:44.882531 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 02 11:18:44 crc kubenswrapper[4766]: I1002 11:18:44.882808 4766 scope.go:117] "RemoveContainer" containerID="95b502421d0b283b99e2e399ae912e61831216f1aad7cf4549c10685585bea25" Oct 02 11:18:44 crc kubenswrapper[4766]: I1002 11:18:44.917182 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:18:44 crc kubenswrapper[4766]: I1002 11:18:44.922200 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 11:18:45 crc kubenswrapper[4766]: E1002 11:18:45.653936 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:45 crc kubenswrapper[4766]: E1002 11:18:45.654298 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:45 crc kubenswrapper[4766]: E1002 11:18:45.654607 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:45 crc kubenswrapper[4766]: E1002 11:18:45.654640 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running 
failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8wzw9" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovsdb-server" Oct 02 11:18:45 crc kubenswrapper[4766]: E1002 11:18:45.655022 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:45 crc kubenswrapper[4766]: E1002 11:18:45.656193 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:45 crc kubenswrapper[4766]: E1002 11:18:45.657539 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:45 crc kubenswrapper[4766]: E1002 11:18:45.657582 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8wzw9" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovs-vswitchd" Oct 02 11:18:45 crc kubenswrapper[4766]: I1002 11:18:45.891601 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0064fd48-390f-4a0f-abfe-9922c8c431f9" path="/var/lib/kubelet/pods/0064fd48-390f-4a0f-abfe-9922c8c431f9/volumes" Oct 02 11:18:50 crc kubenswrapper[4766]: E1002 11:18:50.653590 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:50 crc kubenswrapper[4766]: E1002 11:18:50.654695 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:50 crc kubenswrapper[4766]: E1002 11:18:50.655092 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:50 crc kubenswrapper[4766]: E1002 11:18:50.655129 4766 prober.go:104] "Probe errored" err="rpc error: 
code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8wzw9" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovsdb-server" Oct 02 11:18:50 crc kubenswrapper[4766]: E1002 11:18:50.655767 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:50 crc kubenswrapper[4766]: E1002 11:18:50.657569 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:50 crc kubenswrapper[4766]: E1002 11:18:50.659202 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:50 crc kubenswrapper[4766]: E1002 11:18:50.659251 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8wzw9" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovs-vswitchd" Oct 02 11:18:55 crc kubenswrapper[4766]: E1002 11:18:55.652849 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:55 crc kubenswrapper[4766]: E1002 11:18:55.653779 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:55 crc kubenswrapper[4766]: E1002 11:18:55.654070 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Oct 02 11:18:55 crc kubenswrapper[4766]: E1002 11:18:55.654093 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-8wzw9" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovsdb-server" Oct 02 11:18:55 crc kubenswrapper[4766]: E1002 11:18:55.654698 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:55 crc kubenswrapper[4766]: E1002 11:18:55.655818 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:55 crc kubenswrapper[4766]: E1002 11:18:55.657151 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Oct 02 11:18:55 crc kubenswrapper[4766]: E1002 11:18:55.657190 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-8wzw9" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovs-vswitchd" Oct 02 11:18:56 crc kubenswrapper[4766]: I1002 11:18:56.881453 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:18:56 crc kubenswrapper[4766]: E1002 11:18:56.881763 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.877032 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8wzw9_d90db976-cd03-4eb7-8e1d-361ef7c5045b/ovs-vswitchd/0.log" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.878048 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.881972 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.965824 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-config-data\") pod \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.965904 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7vls\" (UniqueName: \"kubernetes.io/projected/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-kube-api-access-n7vls\") pod \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.965994 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-run\") pod \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.966020 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-scripts\") pod \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.966043 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-combined-ca-bundle\") pod \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.966074 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-etc-ovs\") pod \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.966099 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-log\") pod \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.966119 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-etc-machine-id\") pod \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.966103 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-run" (OuterVolumeSpecName: "var-run") pod "d90db976-cd03-4eb7-8e1d-361ef7c5045b" (UID: "d90db976-cd03-4eb7-8e1d-361ef7c5045b"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.966162 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "d90db976-cd03-4eb7-8e1d-361ef7c5045b" (UID: "d90db976-cd03-4eb7-8e1d-361ef7c5045b"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.966184 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-config-data-custom\") pod \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\" (UID: \"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6\") " Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.966258 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt4qd\" (UniqueName: \"kubernetes.io/projected/d90db976-cd03-4eb7-8e1d-361ef7c5045b-kube-api-access-wt4qd\") pod \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.966287 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-lib\") pod \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.966313 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d90db976-cd03-4eb7-8e1d-361ef7c5045b-scripts\") pod \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\" (UID: \"d90db976-cd03-4eb7-8e1d-361ef7c5045b\") " Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.966703 4766 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-etc-ovs\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.966721 4766 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.966698 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" (UID: "6b3b4ffb-195e-4aba-b3bc-9969b56c01d6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.966721 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-log" (OuterVolumeSpecName: "var-log") pod "d90db976-cd03-4eb7-8e1d-361ef7c5045b" (UID: "d90db976-cd03-4eb7-8e1d-361ef7c5045b"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.966759 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-lib" (OuterVolumeSpecName: "var-lib") pod "d90db976-cd03-4eb7-8e1d-361ef7c5045b" (UID: "d90db976-cd03-4eb7-8e1d-361ef7c5045b"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.967779 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d90db976-cd03-4eb7-8e1d-361ef7c5045b-scripts" (OuterVolumeSpecName: "scripts") pod "d90db976-cd03-4eb7-8e1d-361ef7c5045b" (UID: "d90db976-cd03-4eb7-8e1d-361ef7c5045b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.970790 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-kube-api-access-n7vls" (OuterVolumeSpecName: "kube-api-access-n7vls") pod "6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" (UID: "6b3b4ffb-195e-4aba-b3bc-9969b56c01d6"). InnerVolumeSpecName "kube-api-access-n7vls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.970916 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" (UID: "6b3b4ffb-195e-4aba-b3bc-9969b56c01d6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.971261 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-scripts" (OuterVolumeSpecName: "scripts") pod "6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" (UID: "6b3b4ffb-195e-4aba-b3bc-9969b56c01d6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.971278 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d90db976-cd03-4eb7-8e1d-361ef7c5045b-kube-api-access-wt4qd" (OuterVolumeSpecName: "kube-api-access-wt4qd") pod "d90db976-cd03-4eb7-8e1d-361ef7c5045b" (UID: "d90db976-cd03-4eb7-8e1d-361ef7c5045b"). InnerVolumeSpecName "kube-api-access-wt4qd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.992754 4766 generic.go:334] "Generic (PLEG): container finished" podID="6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" containerID="e5e102f20a63a8d722e965baa46260978733d5085519f5210483d94b2bb40281" exitCode=137 Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.992803 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6","Type":"ContainerDied","Data":"e5e102f20a63a8d722e965baa46260978733d5085519f5210483d94b2bb40281"} Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.992857 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6b3b4ffb-195e-4aba-b3bc-9969b56c01d6","Type":"ContainerDied","Data":"4b127a06d475d3d8b01682ab934e5c8debe86e90c48741e8f524ec9e9717ce67"} Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.992855 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.992874 4766 scope.go:117] "RemoveContainer" containerID="4fa69755c8413dc8a6865886029bf8ec092fa254162c708d148185b7305c545c" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.994358 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8wzw9_d90db976-cd03-4eb7-8e1d-361ef7c5045b/ovs-vswitchd/0.log" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.995638 4766 generic.go:334] "Generic (PLEG): container finished" podID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" exitCode=137 Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.995736 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8wzw9" event={"ID":"d90db976-cd03-4eb7-8e1d-361ef7c5045b","Type":"ContainerDied","Data":"e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a"} Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.995752 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8wzw9" Oct 02 11:18:57 crc kubenswrapper[4766]: I1002 11:18:57.995772 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8wzw9" event={"ID":"d90db976-cd03-4eb7-8e1d-361ef7c5045b","Type":"ContainerDied","Data":"42a68a963fa83ad37a0b7b70ffbf3c85ac15b367a028dd797b60800bd4367db5"} Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.011010 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" (UID: "6b3b4ffb-195e-4aba-b3bc-9969b56c01d6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.012192 4766 scope.go:117] "RemoveContainer" containerID="e5e102f20a63a8d722e965baa46260978733d5085519f5210483d94b2bb40281" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.024327 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-8wzw9"] Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.029166 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-8wzw9"] Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.033782 4766 scope.go:117] "RemoveContainer" containerID="4fa69755c8413dc8a6865886029bf8ec092fa254162c708d148185b7305c545c" Oct 02 11:18:58 crc kubenswrapper[4766]: E1002 11:18:58.034223 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fa69755c8413dc8a6865886029bf8ec092fa254162c708d148185b7305c545c\": container with ID starting with 4fa69755c8413dc8a6865886029bf8ec092fa254162c708d148185b7305c545c not found: ID does not exist" containerID="4fa69755c8413dc8a6865886029bf8ec092fa254162c708d148185b7305c545c" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.034254 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fa69755c8413dc8a6865886029bf8ec092fa254162c708d148185b7305c545c"} err="failed to get container status \"4fa69755c8413dc8a6865886029bf8ec092fa254162c708d148185b7305c545c\": rpc error: code = NotFound desc = could not find container \"4fa69755c8413dc8a6865886029bf8ec092fa254162c708d148185b7305c545c\": container with ID starting with 4fa69755c8413dc8a6865886029bf8ec092fa254162c708d148185b7305c545c not found: ID does not exist" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.034307 4766 scope.go:117] "RemoveContainer" containerID="e5e102f20a63a8d722e965baa46260978733d5085519f5210483d94b2bb40281" Oct 02 11:18:58 crc kubenswrapper[4766]: E1002 11:18:58.034660 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5e102f20a63a8d722e965baa46260978733d5085519f5210483d94b2bb40281\": container with ID starting with e5e102f20a63a8d722e965baa46260978733d5085519f5210483d94b2bb40281 not found: ID does not exist" containerID="e5e102f20a63a8d722e965baa46260978733d5085519f5210483d94b2bb40281" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.034687 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5e102f20a63a8d722e965baa46260978733d5085519f5210483d94b2bb40281"} err="failed to get container status \"e5e102f20a63a8d722e965baa46260978733d5085519f5210483d94b2bb40281\": rpc error: code = NotFound desc = could not find container \"e5e102f20a63a8d722e965baa46260978733d5085519f5210483d94b2bb40281\": container with ID starting with e5e102f20a63a8d722e965baa46260978733d5085519f5210483d94b2bb40281 not found: ID does not exist" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.034707 4766 scope.go:117] "RemoveContainer" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.050404 4766 scope.go:117] "RemoveContainer" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.051624 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-config-data" (OuterVolumeSpecName: "config-data") pod "6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" (UID: "6b3b4ffb-195e-4aba-b3bc-9969b56c01d6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.068257 4766 scope.go:117] "RemoveContainer" containerID="da98ca489f682c05c8621e04ef1b42fa609b160482856ffde00dc0bb522f3ea4" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.068375 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.068403 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.068416 4766 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-log\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.068424 4766 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.068433 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.068441 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt4qd\" (UniqueName: \"kubernetes.io/projected/d90db976-cd03-4eb7-8e1d-361ef7c5045b-kube-api-access-wt4qd\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.068452 4766 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d90db976-cd03-4eb7-8e1d-361ef7c5045b-var-lib\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.068459 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d90db976-cd03-4eb7-8e1d-361ef7c5045b-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.068612 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.068624 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7vls\" (UniqueName: \"kubernetes.io/projected/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6-kube-api-access-n7vls\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.089238 4766 scope.go:117] "RemoveContainer" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" Oct 02 11:18:58 crc kubenswrapper[4766]: E1002 11:18:58.089657 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a\": 
container with ID starting with e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a not found: ID does not exist" containerID="e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.089699 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a"} err="failed to get container status \"e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a\": rpc error: code = NotFound desc = could not find container \"e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a\": container with ID starting with e687d334edfb09f6abae50e619282edf0fc45da0e51b1075aa6d69ee4b02f93a not found: ID does not exist" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.089728 4766 scope.go:117] "RemoveContainer" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" Oct 02 11:18:58 crc kubenswrapper[4766]: E1002 11:18:58.090179 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa\": container with ID starting with 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa not found: ID does not exist" containerID="3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.090210 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa"} err="failed to get container status \"3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa\": rpc error: code = NotFound desc = could not find container \"3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa\": container with ID starting with 3170d327588564c3187e84f5df2af28c5bd37d544f391af82f6cb629df998faa not found: ID does not exist" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.090247 4766 scope.go:117] "RemoveContainer" containerID="da98ca489f682c05c8621e04ef1b42fa609b160482856ffde00dc0bb522f3ea4" Oct 02 11:18:58 crc kubenswrapper[4766]: E1002 11:18:58.090489 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da98ca489f682c05c8621e04ef1b42fa609b160482856ffde00dc0bb522f3ea4\": container with ID starting with da98ca489f682c05c8621e04ef1b42fa609b160482856ffde00dc0bb522f3ea4 not found: ID does not exist" containerID="da98ca489f682c05c8621e04ef1b42fa609b160482856ffde00dc0bb522f3ea4" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.090527 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da98ca489f682c05c8621e04ef1b42fa609b160482856ffde00dc0bb522f3ea4"} err="failed to get container status \"da98ca489f682c05c8621e04ef1b42fa609b160482856ffde00dc0bb522f3ea4\": rpc error: code = NotFound desc = could not find container \"da98ca489f682c05c8621e04ef1b42fa609b160482856ffde00dc0bb522f3ea4\": container with ID starting with da98ca489f682c05c8621e04ef1b42fa609b160482856ffde00dc0bb522f3ea4 not found: ID does not exist" Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.351251 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 11:18:58 crc kubenswrapper[4766]: I1002 11:18:58.357644 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-scheduler-0"] Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.013245 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerID="48f8731a29f544c073845eb8fcd06b0efc46da3e9e5d54fb23e339018591d7f7" exitCode=137 Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.013287 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerDied","Data":"48f8731a29f544c073845eb8fcd06b0efc46da3e9e5d54fb23e339018591d7f7"} Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.066080 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.184639 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd964\" (UniqueName: \"kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-kube-api-access-wd964\") pod \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.184706 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1ba556fb-6ff5-4418-a2b9-f26a51003d79-lock\") pod \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.184803 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift\") pod \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.184829 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.184849 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1ba556fb-6ff5-4418-a2b9-f26a51003d79-cache\") pod \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\" (UID: \"1ba556fb-6ff5-4418-a2b9-f26a51003d79\") " Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.185711 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ba556fb-6ff5-4418-a2b9-f26a51003d79-cache" (OuterVolumeSpecName: "cache") pod "1ba556fb-6ff5-4418-a2b9-f26a51003d79" (UID: "1ba556fb-6ff5-4418-a2b9-f26a51003d79"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.186210 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ba556fb-6ff5-4418-a2b9-f26a51003d79-lock" (OuterVolumeSpecName: "lock") pod "1ba556fb-6ff5-4418-a2b9-f26a51003d79" (UID: "1ba556fb-6ff5-4418-a2b9-f26a51003d79"). InnerVolumeSpecName "lock". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.191079 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-kube-api-access-wd964" (OuterVolumeSpecName: "kube-api-access-wd964") pod "1ba556fb-6ff5-4418-a2b9-f26a51003d79" (UID: "1ba556fb-6ff5-4418-a2b9-f26a51003d79"). InnerVolumeSpecName "kube-api-access-wd964". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.192812 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "1ba556fb-6ff5-4418-a2b9-f26a51003d79" (UID: "1ba556fb-6ff5-4418-a2b9-f26a51003d79"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.199629 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1ba556fb-6ff5-4418-a2b9-f26a51003d79" (UID: "1ba556fb-6ff5-4418-a2b9-f26a51003d79"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.286555 4766 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1ba556fb-6ff5-4418-a2b9-f26a51003d79-lock\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.286798 4766 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.286829 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.286837 4766 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1ba556fb-6ff5-4418-a2b9-f26a51003d79-cache\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.286847 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd964\" (UniqueName: \"kubernetes.io/projected/1ba556fb-6ff5-4418-a2b9-f26a51003d79-kube-api-access-wd964\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.301284 4766 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.388146 4766 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.890581 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" path="/var/lib/kubelet/pods/6b3b4ffb-195e-4aba-b3bc-9969b56c01d6/volumes" Oct 02 11:18:59 crc kubenswrapper[4766]: I1002 11:18:59.891465 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" 
path="/var/lib/kubelet/pods/d90db976-cd03-4eb7-8e1d-361ef7c5045b/volumes" Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.027250 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1ba556fb-6ff5-4418-a2b9-f26a51003d79","Type":"ContainerDied","Data":"021195eb124e8620d60c6158d85f33dcf5ec39aad1705b71cd2a00ec5747d331"} Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.027305 4766 scope.go:117] "RemoveContainer" containerID="48f8731a29f544c073845eb8fcd06b0efc46da3e9e5d54fb23e339018591d7f7" Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.027481 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.051194 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.051358 4766 scope.go:117] "RemoveContainer" containerID="d4d2f3213d0d8347ad4fc0fda1819961e3e48ca84f6a1a08530c3e457a9145ee" Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.055645 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.072831 4766 scope.go:117] "RemoveContainer" containerID="1b4624fea8b62d613092718192f8a9e9faa6138904a93866528c86735b20b493" Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.091775 4766 scope.go:117] "RemoveContainer" containerID="018613b4ace3495529050cc51bbcbc25c762db064d56bf3f3b2fa7c0ac1213cf" Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.115334 4766 scope.go:117] "RemoveContainer" containerID="711bf815d45f9226c3bfc08294214785c60a9b9bd2e213b6e5c41884b2a87710" Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.132968 4766 scope.go:117] "RemoveContainer" containerID="2f0ac5d9172eee436579154fd4936b18259605ce7d7deaad27a10a2c5cdbf7f4" Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.155779 4766 scope.go:117] "RemoveContainer" containerID="37549a28195c14e158d5a907b7e78a1e6d69aa6e6efd13ac3ffc312e52ea12e6" Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.176441 4766 scope.go:117] "RemoveContainer" containerID="2d24232fdec7040ef6a5964e5689deb80c9416e2b9a5928b6c44a05db6a24a58" Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.193552 4766 scope.go:117] "RemoveContainer" containerID="3647b01248aaadded5263e23d36e7d1f488a1f32c6f5c9c8f0363cd9c896464c" Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.211121 4766 scope.go:117] "RemoveContainer" containerID="a6e14799401d44ada3dfee676cdeb9c70cb0d90eacc2dca91f8d1079c24c183c" Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.228000 4766 scope.go:117] "RemoveContainer" containerID="9da3dd83501b5003165c1f29947b2365058f97cf1d114c7cef65d3123aa7bf9a" Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.253734 4766 scope.go:117] "RemoveContainer" containerID="33cb37528efacc79ce75b3a9c57737f3259f3f0c3402a200d781a3fa066c92aa" Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.276820 4766 scope.go:117] "RemoveContainer" containerID="da7cf8c62c28a42c7de4196b70da74f94b7baed423d08b118379806f1e593aa6" Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.295034 4766 scope.go:117] "RemoveContainer" containerID="3df0bb832b8ac1eab32d7e7a4d74b6f6630e1a9104832fae36b218c6eafdc058" Oct 02 11:19:00 crc kubenswrapper[4766]: I1002 11:19:00.316312 4766 scope.go:117] "RemoveContainer" containerID="2e00b32e05f94d6db2ce326a131639f65e69eaa8e091e4ac19058f4bc69304fa" Oct 02 11:19:01 
crc kubenswrapper[4766]: I1002 11:19:01.891767 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" path="/var/lib/kubelet/pods/1ba556fb-6ff5-4418-a2b9-f26a51003d79/volumes" Oct 02 11:19:04 crc kubenswrapper[4766]: I1002 11:19:04.171763 4766 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod6ea7203d-5727-485f-8a6a-5bde96d05078"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod6ea7203d-5727-485f-8a6a-5bde96d05078] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6ea7203d_5727_485f_8a6a_5bde96d05078.slice" Oct 02 11:19:04 crc kubenswrapper[4766]: E1002 11:19:04.172076 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod6ea7203d-5727-485f-8a6a-5bde96d05078] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod6ea7203d-5727-485f-8a6a-5bde96d05078] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6ea7203d_5727_485f_8a6a_5bde96d05078.slice" pod="openstack/novaapibccc-account-delete-n7pf6" podUID="6ea7203d-5727-485f-8a6a-5bde96d05078" Oct 02 11:19:05 crc kubenswrapper[4766]: I1002 11:19:05.066689 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapibccc-account-delete-n7pf6" Oct 02 11:19:05 crc kubenswrapper[4766]: I1002 11:19:05.087315 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapibccc-account-delete-n7pf6"] Oct 02 11:19:05 crc kubenswrapper[4766]: I1002 11:19:05.091863 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapibccc-account-delete-n7pf6"] Oct 02 11:19:05 crc kubenswrapper[4766]: I1002 11:19:05.798600 4766 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podb869942b-07a4-4a08-b312-2b09cee2abf1"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podb869942b-07a4-4a08-b312-2b09cee2abf1] : Timed out while waiting for systemd to remove kubepods-besteffort-podb869942b_07a4_4a08_b312_2b09cee2abf1.slice" Oct 02 11:19:05 crc kubenswrapper[4766]: E1002 11:19:05.798673 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podb869942b-07a4-4a08-b312-2b09cee2abf1] : unable to destroy cgroup paths for cgroup [kubepods besteffort podb869942b-07a4-4a08-b312-2b09cee2abf1] : Timed out while waiting for systemd to remove kubepods-besteffort-podb869942b_07a4_4a08_b312_2b09cee2abf1.slice" pod="openstack/nova-api-0" podUID="b869942b-07a4-4a08-b312-2b09cee2abf1" Oct 02 11:19:05 crc kubenswrapper[4766]: I1002 11:19:05.890857 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea7203d-5727-485f-8a6a-5bde96d05078" path="/var/lib/kubelet/pods/6ea7203d-5727-485f-8a6a-5bde96d05078/volumes" Oct 02 11:19:06 crc kubenswrapper[4766]: I1002 11:19:06.076087 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 11:19:06 crc kubenswrapper[4766]: I1002 11:19:06.098094 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:19:06 crc kubenswrapper[4766]: I1002 11:19:06.104141 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 11:19:07 crc kubenswrapper[4766]: I1002 11:19:07.890335 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b869942b-07a4-4a08-b312-2b09cee2abf1" path="/var/lib/kubelet/pods/b869942b-07a4-4a08-b312-2b09cee2abf1/volumes" Oct 02 11:19:11 crc kubenswrapper[4766]: I1002 11:19:11.881544 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:19:11 crc kubenswrapper[4766]: E1002 11:19:11.881982 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:19:23 crc kubenswrapper[4766]: I1002 11:19:23.881383 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:19:23 crc kubenswrapper[4766]: E1002 11:19:23.882079 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:19:31 crc kubenswrapper[4766]: I1002 11:19:31.376145 4766 scope.go:117] "RemoveContainer" containerID="7b6649db0c3371deb879753f751b0d38b0339189eda5edb4f0fbad5ed847bc49" Oct 02 11:19:31 crc kubenswrapper[4766]: I1002 11:19:31.415878 4766 scope.go:117] "RemoveContainer" containerID="c261e4d0c71a2d44b8fccc74ed25a2644106f2f91983fb1b4e335f1366bcfec6" Oct 02 11:19:31 crc kubenswrapper[4766]: I1002 11:19:31.471852 4766 scope.go:117] "RemoveContainer" containerID="512ce4ebb3045b47a1e615569ed542e6ca78e2a66f4974e41be6a1f77ef1f5f0" Oct 02 11:19:31 crc kubenswrapper[4766]: I1002 11:19:31.493263 4766 scope.go:117] "RemoveContainer" containerID="29e42fd248d0b6b516b05f79a0ab4bd6931fba6de08f16588f76950913e24142" Oct 02 11:19:31 crc kubenswrapper[4766]: I1002 11:19:31.521257 4766 scope.go:117] "RemoveContainer" containerID="95d4da30e876b118cdc4f269c126b3e96240f4d4fdf17c6c9ab901a8dca25a20" Oct 02 11:19:31 crc kubenswrapper[4766]: I1002 11:19:31.539234 4766 scope.go:117] "RemoveContainer" containerID="df063c24cf752127f4e79a9b20cff91247c60f33737c20cd1aa1191d6550913d" Oct 02 11:19:31 crc kubenswrapper[4766]: I1002 11:19:31.561818 4766 scope.go:117] "RemoveContainer" containerID="423fd22f502bcbb76fa7bee55fb87f790a0e24c2ab7604de639bcbd74611dd08" Oct 02 11:19:31 crc kubenswrapper[4766]: I1002 11:19:31.585082 4766 scope.go:117] "RemoveContainer" containerID="82a78301519df94a54730f7315b392fb65a0b1d4477a74589b7813ff8e15631e" Oct 02 11:19:31 crc kubenswrapper[4766]: I1002 11:19:31.611712 4766 scope.go:117] "RemoveContainer" containerID="412fce6dd8d64d547cc77de430881d5d15bf12a7aa0bc9988b3411682724ce9e" Oct 
02 11:19:31 crc kubenswrapper[4766]: I1002 11:19:31.628315 4766 scope.go:117] "RemoveContainer" containerID="f3b321357fa79f48983ef5c023b5c5cbac1d1998bc3b87360e098266b95ca8e7" Oct 02 11:19:31 crc kubenswrapper[4766]: I1002 11:19:31.644226 4766 scope.go:117] "RemoveContainer" containerID="f303d98e6e10d3d9cba51492add063a9d0321b2f0bf299e9009af06e8aa409eb" Oct 02 11:19:31 crc kubenswrapper[4766]: I1002 11:19:31.661906 4766 scope.go:117] "RemoveContainer" containerID="961541d1569b99b83eabdf54931372b404c77eeafc45c413f6b4937167c67180" Oct 02 11:19:31 crc kubenswrapper[4766]: I1002 11:19:31.677668 4766 scope.go:117] "RemoveContainer" containerID="8ecd910a6ea237f16877cbe960d6203045eacb4bb226a18f16662ac0b8c5ad78" Oct 02 11:19:31 crc kubenswrapper[4766]: I1002 11:19:31.701834 4766 scope.go:117] "RemoveContainer" containerID="c51327eb234af217d66c3ca5a46e6220ad81f8519d7a4eaf37c6d30894c229f7" Oct 02 11:19:31 crc kubenswrapper[4766]: I1002 11:19:31.716690 4766 scope.go:117] "RemoveContainer" containerID="dc2b681fecd8d81f1e30f9755b958fbc42b452ed488323cea6e899db4471b0f6" Oct 02 11:19:31 crc kubenswrapper[4766]: I1002 11:19:31.742101 4766 scope.go:117] "RemoveContainer" containerID="ebe8d8190ea569d86618041b0db9265a5d28b6160651bd622c8b41a97932fbfe" Oct 02 11:19:34 crc kubenswrapper[4766]: I1002 11:19:34.881551 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:19:34 crc kubenswrapper[4766]: E1002 11:19:34.881948 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:19:48 crc kubenswrapper[4766]: I1002 11:19:48.881991 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:19:48 crc kubenswrapper[4766]: E1002 11:19:48.882958 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:20:03 crc kubenswrapper[4766]: I1002 11:20:03.881947 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:20:03 crc kubenswrapper[4766]: E1002 11:20:03.882731 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:20:14 crc kubenswrapper[4766]: I1002 11:20:14.881100 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:20:14 crc kubenswrapper[4766]: E1002 11:20:14.882098 4766 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:20:27 crc kubenswrapper[4766]: I1002 11:20:27.881711 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:20:27 crc kubenswrapper[4766]: E1002 11:20:27.882459 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:20:32 crc kubenswrapper[4766]: I1002 11:20:32.350433 4766 scope.go:117] "RemoveContainer" containerID="68d08e12b25020fbe30c5d0dfd77d7b48a170b4b187be4c4958856f980dbd584" Oct 02 11:20:32 crc kubenswrapper[4766]: I1002 11:20:32.406329 4766 scope.go:117] "RemoveContainer" containerID="e59b9e3e28f7f3521b5649e7c481c6db733ebb4c636e4f6e575b4857b80ca3bd" Oct 02 11:20:32 crc kubenswrapper[4766]: I1002 11:20:32.435967 4766 scope.go:117] "RemoveContainer" containerID="edbb3da01856c2a41dd64eefb64dc0d38f95f5a264b444f1dbcac1d668a8635d" Oct 02 11:20:32 crc kubenswrapper[4766]: I1002 11:20:32.483831 4766 scope.go:117] "RemoveContainer" containerID="12eb8d333452acd192738223ca8f4fa9dc2c00bb7c7f6c98059bd4f4f3f58665" Oct 02 11:20:32 crc kubenswrapper[4766]: I1002 11:20:32.528013 4766 scope.go:117] "RemoveContainer" containerID="9e3131de3333db6b7ac4e841b0504597a46f9838934a930881def24fa734d3b6" Oct 02 11:20:39 crc kubenswrapper[4766]: I1002 11:20:39.882711 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:20:39 crc kubenswrapper[4766]: E1002 11:20:39.883500 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:20:54 crc kubenswrapper[4766]: I1002 11:20:54.880768 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:20:54 crc kubenswrapper[4766]: E1002 11:20:54.881453 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:21:07 crc kubenswrapper[4766]: I1002 11:21:07.881427 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:21:07 crc kubenswrapper[4766]: E1002 11:21:07.882197 4766 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:21:22 crc kubenswrapper[4766]: I1002 11:21:22.880946 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:21:22 crc kubenswrapper[4766]: E1002 11:21:22.881843 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:21:32 crc kubenswrapper[4766]: I1002 11:21:32.625091 4766 scope.go:117] "RemoveContainer" containerID="1739e7ba00b0bf9bb771dca9d289d586ed12ce58d682ed8f6ac6cf039138a291" Oct 02 11:21:32 crc kubenswrapper[4766]: I1002 11:21:32.659022 4766 scope.go:117] "RemoveContainer" containerID="867efe65a7b16f3f4c20bd8fbe635248271d0736d8757d4cc1dbab820099a089" Oct 02 11:21:32 crc kubenswrapper[4766]: I1002 11:21:32.717492 4766 scope.go:117] "RemoveContainer" containerID="2259de9d8ca58008098b26cc61e357e3f7d5f3b09133b0ffed15372297e474f1" Oct 02 11:21:32 crc kubenswrapper[4766]: I1002 11:21:32.733132 4766 scope.go:117] "RemoveContainer" containerID="f56abd7e1794b77ea62792a0bd79f484e63160d0da2a2b9689b20ab708f80f3a" Oct 02 11:21:32 crc kubenswrapper[4766]: I1002 11:21:32.748287 4766 scope.go:117] "RemoveContainer" containerID="c9d72b05ecbb9e359c79bf03ad7d011f837f63b6ad4298ee304c7cc1c4031928" Oct 02 11:21:32 crc kubenswrapper[4766]: I1002 11:21:32.768909 4766 scope.go:117] "RemoveContainer" containerID="0e158a295e0a51624444cffd5cf073f04a1ecabdd07a8ff50f1a37d13288e8a2" Oct 02 11:21:32 crc kubenswrapper[4766]: I1002 11:21:32.797451 4766 scope.go:117] "RemoveContainer" containerID="6cd538b1cdf3993f6cd959aebdda72d9055a7730e3cb15b1a555f7da8b9b1353" Oct 02 11:21:32 crc kubenswrapper[4766]: I1002 11:21:32.827966 4766 scope.go:117] "RemoveContainer" containerID="bad133e51d3973fb304b20194be3bf5efca1233e15ed4a89282b8c538aa01f90" Oct 02 11:21:32 crc kubenswrapper[4766]: I1002 11:21:32.845710 4766 scope.go:117] "RemoveContainer" containerID="acd8271431ed0cd3230257d6a30239e607a8fd1008099bf28ae24a2917d9dd2b" Oct 02 11:21:32 crc kubenswrapper[4766]: I1002 11:21:32.868297 4766 scope.go:117] "RemoveContainer" containerID="ea0c10f7b96fd5133417a93349e2d33afb1ade7b529354f115840f9c68087314" Oct 02 11:21:32 crc kubenswrapper[4766]: I1002 11:21:32.891605 4766 scope.go:117] "RemoveContainer" containerID="fb1c0454ba668b962552208833c616de4db07019cb885d1cc5bdc0cc294a91b5" Oct 02 11:21:32 crc kubenswrapper[4766]: I1002 11:21:32.911260 4766 scope.go:117] "RemoveContainer" containerID="3d18a6249adb9873ab43d08de93aedc55c827c33e60c6d8d86226fc15f98872a" Oct 02 11:21:32 crc kubenswrapper[4766]: I1002 11:21:32.937005 4766 scope.go:117] "RemoveContainer" containerID="0b8dae08b6fef80dba408e5df15861c9a4ec087115f964d50b2c13d7ce34c9a4" Oct 02 11:21:37 crc kubenswrapper[4766]: I1002 11:21:37.881178 4766 scope.go:117] "RemoveContainer" 
containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:21:37 crc kubenswrapper[4766]: E1002 11:21:37.881658 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:21:50 crc kubenswrapper[4766]: I1002 11:21:50.882172 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:21:50 crc kubenswrapper[4766]: E1002 11:21:50.882729 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:22:01 crc kubenswrapper[4766]: I1002 11:22:01.881675 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:22:02 crc kubenswrapper[4766]: I1002 11:22:02.534702 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"ad849014129d0cceee0f98aefb0b7ee04dec448811308b28f18e68707adcd334"} Oct 02 11:22:33 crc kubenswrapper[4766]: I1002 11:22:33.110068 4766 scope.go:117] "RemoveContainer" containerID="24dcbf81a6048e4223093aaf313d135dc7e342e1aad2567595ccd81590fd91ce" Oct 02 11:22:33 crc kubenswrapper[4766]: I1002 11:22:33.136382 4766 scope.go:117] "RemoveContainer" containerID="7c63b673954ab2d5062dc5c8c71c3cfb5b1fdbffd48e5e8ae4259cb2050e9614" Oct 02 11:22:33 crc kubenswrapper[4766]: I1002 11:22:33.153588 4766 scope.go:117] "RemoveContainer" containerID="224becfa27d031ef4fcc783c953605675730a85f887b4f3671b869a18bf84129" Oct 02 11:22:33 crc kubenswrapper[4766]: I1002 11:22:33.204657 4766 scope.go:117] "RemoveContainer" containerID="3165e7e8bf3173b47ec39b7859d2841e0ef5c79cdc3391e216907ef6acd85df3" Oct 02 11:23:33 crc kubenswrapper[4766]: I1002 11:23:33.304177 4766 scope.go:117] "RemoveContainer" containerID="5be38962640da76b9046534d30afa195b17c52471dba02787f5ee4b789836031" Oct 02 11:23:33 crc kubenswrapper[4766]: I1002 11:23:33.347229 4766 scope.go:117] "RemoveContainer" containerID="3f01021d7015f6ce7b28a4464cf46b4c73b350981093792e92fec171df9fdb63" Oct 02 11:23:33 crc kubenswrapper[4766]: I1002 11:23:33.381592 4766 scope.go:117] "RemoveContainer" containerID="3c3f1237df38c8b6ccd32b780a793babbd91e3830b9cb19cf87c41863f677752" Oct 02 11:24:24 crc kubenswrapper[4766]: I1002 11:24:24.432206 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:24:24 crc kubenswrapper[4766]: I1002 11:24:24.433420 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" 
podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:24:33 crc kubenswrapper[4766]: I1002 11:24:33.436547 4766 scope.go:117] "RemoveContainer" containerID="bd5acfd9f8b6e410882799f1e42f4fb64c5bdc13db56d36a3ca2d32eb57ddec0" Oct 02 11:24:33 crc kubenswrapper[4766]: I1002 11:24:33.457711 4766 scope.go:117] "RemoveContainer" containerID="e79b5f1b193fe730d66df4b38d682b12876fcbb70e51c1029f4b2b498b6ededd" Oct 02 11:24:33 crc kubenswrapper[4766]: I1002 11:24:33.487717 4766 scope.go:117] "RemoveContainer" containerID="f6cd0095fa2a61271a1c6b2812af96732964a92ed3e96a78c81919c9ef11e724" Oct 02 11:24:33 crc kubenswrapper[4766]: I1002 11:24:33.507797 4766 scope.go:117] "RemoveContainer" containerID="cc2390f57c0079f23233bafd281f8a6e5358940b44e887d48c5d270690334bf0" Oct 02 11:24:54 crc kubenswrapper[4766]: I1002 11:24:54.432002 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:24:54 crc kubenswrapper[4766]: I1002 11:24:54.432685 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:25:24 crc kubenswrapper[4766]: I1002 11:25:24.432075 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:25:24 crc kubenswrapper[4766]: I1002 11:25:24.432668 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:25:24 crc kubenswrapper[4766]: I1002 11:25:24.432716 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 11:25:24 crc kubenswrapper[4766]: I1002 11:25:24.433269 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad849014129d0cceee0f98aefb0b7ee04dec448811308b28f18e68707adcd334"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:25:24 crc kubenswrapper[4766]: I1002 11:25:24.433318 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://ad849014129d0cceee0f98aefb0b7ee04dec448811308b28f18e68707adcd334" gracePeriod=600 Oct 02 11:25:24 crc kubenswrapper[4766]: I1002 11:25:24.957347 4766 generic.go:334] "Generic (PLEG): 
container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="ad849014129d0cceee0f98aefb0b7ee04dec448811308b28f18e68707adcd334" exitCode=0 Oct 02 11:25:24 crc kubenswrapper[4766]: I1002 11:25:24.958015 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"ad849014129d0cceee0f98aefb0b7ee04dec448811308b28f18e68707adcd334"} Oct 02 11:25:24 crc kubenswrapper[4766]: I1002 11:25:24.958067 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9"} Oct 02 11:25:24 crc kubenswrapper[4766]: I1002 11:25:24.958095 4766 scope.go:117] "RemoveContainer" containerID="3f9dff8cf37dfe67d5d8418573c4e434d4959b690086f3222b9b79d1cf714b1e" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.713281 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r2zfp"] Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714307 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fdf37c9-9a32-4103-8418-198d45d14415" containerName="init" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714321 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fdf37c9-9a32-4103-8418-198d45d14415" containerName="init" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714341 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be5e935-0d64-4fed-a00a-bd0cb5891e75" containerName="ovsdbserver-sb" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714350 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be5e935-0d64-4fed-a00a-bd0cb5891e75" containerName="ovsdbserver-sb" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714363 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fdf37c9-9a32-4103-8418-198d45d14415" containerName="dnsmasq-dns" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714378 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fdf37c9-9a32-4103-8418-198d45d14415" containerName="dnsmasq-dns" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714387 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be46caa-0351-4f60-b16b-a258b9874a6f" containerName="extract-content" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714393 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be46caa-0351-4f60-b16b-a258b9874a6f" containerName="extract-content" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714403 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b90dab1-a183-4adc-b415-b67bd0d782f7" containerName="ovsdbserver-nb" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714409 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b90dab1-a183-4adc-b415-b67bd0d782f7" containerName="ovsdbserver-nb" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714418 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="874d062e-d2f8-462c-95b3-8f630b7120af" containerName="setup-container" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714423 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="874d062e-d2f8-462c-95b3-8f630b7120af" containerName="setup-container" Oct 02 11:26:13 crc 
kubenswrapper[4766]: E1002 11:26:13.714439 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5ac374-df46-4a36-947d-de07af25426c" containerName="sg-core" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714444 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5ac374-df46-4a36-947d-de07af25426c" containerName="sg-core" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714450 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dec5495-e66b-4e5e-90b6-82ee673ab269" containerName="nova-scheduler-scheduler" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714456 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dec5495-e66b-4e5e-90b6-82ee673ab269" containerName="nova-scheduler-scheduler" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714466 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52bdcfda-75b3-450b-9db4-1a443be18fa3" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714475 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="52bdcfda-75b3-450b-9db4-1a443be18fa3" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714483 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-auditor" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714489 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-auditor" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714496 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovs-vswitchd" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714521 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovs-vswitchd" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714530 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" containerName="glance-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714536 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" containerName="glance-log" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714546 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="rsync" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714552 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="rsync" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714559 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95842554-1651-4c34-b934-d4eb21c6c52d" containerName="openstack-network-exporter" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714565 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="95842554-1651-4c34-b934-d4eb21c6c52d" containerName="openstack-network-exporter" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714575 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c8c5ed-b069-4112-ae71-d9071bc15ff2" containerName="barbican-api" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714636 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c8c5ed-b069-4112-ae71-d9071bc15ff2" containerName="barbican-api" Oct 02 
11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714669 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44893df1-77c5-494c-bae0-253447abc8f4" containerName="proxy-httpd" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714677 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="44893df1-77c5-494c-bae0-253447abc8f4" containerName="proxy-httpd" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714692 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3384223-2ad0-4593-976c-54c2d3cce52e" containerName="glance-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714700 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3384223-2ad0-4593-976c-54c2d3cce52e" containerName="glance-log" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714706 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0064fd48-390f-4a0f-abfe-9922c8c431f9" containerName="nova-cell1-conductor-conductor" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714711 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0064fd48-390f-4a0f-abfe-9922c8c431f9" containerName="nova-cell1-conductor-conductor" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714719 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be46caa-0351-4f60-b16b-a258b9874a6f" containerName="registry-server" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714724 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be46caa-0351-4f60-b16b-a258b9874a6f" containerName="registry-server" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714735 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d43eab0-4595-42fc-8489-38792e0c6e19" containerName="barbican-worker" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714740 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d43eab0-4595-42fc-8489-38792e0c6e19" containerName="barbican-worker" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714748 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8cdac7-81c9-41e7-a956-41d13e5b91a6" containerName="neutron-api" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714753 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8cdac7-81c9-41e7-a956-41d13e5b91a6" containerName="neutron-api" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714764 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9860354f-7494-4b02-bca3-adc731683f7f" containerName="ovn-controller" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714770 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9860354f-7494-4b02-bca3-adc731683f7f" containerName="ovn-controller" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714783 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63ea453-c8bd-4128-a47e-7b0d740a6066" containerName="mysql-bootstrap" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714788 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63ea453-c8bd-4128-a47e-7b0d740a6066" containerName="mysql-bootstrap" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714795 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be5e935-0d64-4fed-a00a-bd0cb5891e75" containerName="openstack-network-exporter" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714801 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be5e935-0d64-4fed-a00a-bd0cb5891e75" 
containerName="openstack-network-exporter" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714807 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" containerName="glance-httpd" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714813 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" containerName="glance-httpd" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714823 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5ac374-df46-4a36-947d-de07af25426c" containerName="ceilometer-notification-agent" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714830 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5ac374-df46-4a36-947d-de07af25426c" containerName="ceilometer-notification-agent" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714845 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="container-auditor" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714852 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="container-auditor" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714871 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovsdb-server-init" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714879 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovsdb-server-init" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714890 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9bc510-a878-4e06-8db9-fd6209039c75" containerName="galera" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714896 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9bc510-a878-4e06-8db9-fd6209039c75" containerName="galera" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714904 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" containerName="cinder-scheduler" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714909 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" containerName="cinder-scheduler" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714919 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5ac374-df46-4a36-947d-de07af25426c" containerName="ceilometer-central-agent" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714926 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5ac374-df46-4a36-947d-de07af25426c" containerName="ceilometer-central-agent" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714935 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5a88cf-8095-4025-a68a-349c579dddd3" containerName="memcached" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714941 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5a88cf-8095-4025-a68a-349c579dddd3" containerName="memcached" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714955 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7498a37c-33a3-4a3a-9c72-64a0c533282c" containerName="openstack-network-exporter" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714961 4766 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="7498a37c-33a3-4a3a-9c72-64a0c533282c" containerName="openstack-network-exporter" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714969 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="account-server" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714975 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="account-server" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.714983 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db4400a2-c286-467e-b62d-a5cb3042aa88" containerName="nova-cell0-conductor-conductor" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.714991 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="db4400a2-c286-467e-b62d-a5cb3042aa88" containerName="nova-cell0-conductor-conductor" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715001 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a739206-d877-4212-9242-47a59c440b40" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715008 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a739206-d877-4212-9242-47a59c440b40" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715019 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9339929-4331-4cd9-89bc-8350ef2f55f5" containerName="barbican-keystone-listener-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715027 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9339929-4331-4cd9-89bc-8350ef2f55f5" containerName="barbican-keystone-listener-log" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715036 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1282b506-728d-4c6f-aa9c-3d3c1f826b71" containerName="setup-container" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715042 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1282b506-728d-4c6f-aa9c-3d3c1f826b71" containerName="setup-container" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715055 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd6760c-af87-4e2a-adcd-5fe3ca636fef" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715063 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd6760c-af87-4e2a-adcd-5fe3ca636fef" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715076 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7498a37c-33a3-4a3a-9c72-64a0c533282c" containerName="ovn-northd" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715083 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7498a37c-33a3-4a3a-9c72-64a0c533282c" containerName="ovn-northd" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715092 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="account-reaper" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715098 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="account-reaper" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715104 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" 
containerName="object-expirer" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715111 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-expirer" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715120 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b869942b-07a4-4a08-b312-2b09cee2abf1" containerName="nova-api-api" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715126 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b869942b-07a4-4a08-b312-2b09cee2abf1" containerName="nova-api-api" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715134 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b869942b-07a4-4a08-b312-2b09cee2abf1" containerName="nova-api-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715140 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b869942b-07a4-4a08-b312-2b09cee2abf1" containerName="nova-api-log" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715151 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovsdb-server" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715156 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovsdb-server" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715164 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="container-updater" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715169 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="container-updater" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715181 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8cdac7-81c9-41e7-a956-41d13e5b91a6" containerName="neutron-httpd" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715187 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8cdac7-81c9-41e7-a956-41d13e5b91a6" containerName="neutron-httpd" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715195 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fac2bf1-fb0f-4031-bfc8-34090cc90c8a" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715208 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fac2bf1-fb0f-4031-bfc8-34090cc90c8a" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715218 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="swift-recon-cron" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715227 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="swift-recon-cron" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715242 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94c8c5ed-b069-4112-ae71-d9071bc15ff2" containerName="barbican-api-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715248 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="94c8c5ed-b069-4112-ae71-d9071bc15ff2" containerName="barbican-api-log" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715256 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="07e3dfd6-c718-4304-9770-edbbfaca9cf4" containerName="keystone-api" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715262 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="07e3dfd6-c718-4304-9770-edbbfaca9cf4" containerName="keystone-api" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715271 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb13294-05c1-4a20-8265-5144efcd91cf" containerName="nova-metadata-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715276 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb13294-05c1-4a20-8265-5144efcd91cf" containerName="nova-metadata-log" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715295 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea7203d-5727-485f-8a6a-5bde96d05078" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715303 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea7203d-5727-485f-8a6a-5bde96d05078" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715318 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5ac374-df46-4a36-947d-de07af25426c" containerName="proxy-httpd" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715325 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5ac374-df46-4a36-947d-de07af25426c" containerName="proxy-httpd" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715337 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="874d062e-d2f8-462c-95b3-8f630b7120af" containerName="rabbitmq" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715344 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="874d062e-d2f8-462c-95b3-8f630b7120af" containerName="rabbitmq" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715353 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-replicator" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715361 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-replicator" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715373 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b90dab1-a183-4adc-b415-b67bd0d782f7" containerName="openstack-network-exporter" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715380 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b90dab1-a183-4adc-b415-b67bd0d782f7" containerName="openstack-network-exporter" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715408 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-updater" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715415 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-updater" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715426 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a8ba140-6dc8-4023-9789-7f288b85159b" containerName="kube-state-metrics" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715432 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a8ba140-6dc8-4023-9789-7f288b85159b" containerName="kube-state-metrics" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715440 4766 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3ae83487-5f24-4934-aba5-9ee2ca6ca657" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715490 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae83487-5f24-4934-aba5-9ee2ca6ca657" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715515 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" containerName="probe" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715521 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" containerName="probe" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715528 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="123b65f7-a8e8-434b-baf1-e9b0d3a985d9" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715534 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="123b65f7-a8e8-434b-baf1-e9b0d3a985d9" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715543 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb13294-05c1-4a20-8265-5144efcd91cf" containerName="nova-metadata-metadata" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715549 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb13294-05c1-4a20-8265-5144efcd91cf" containerName="nova-metadata-metadata" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715555 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="container-replicator" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715561 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="container-replicator" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715573 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1282b506-728d-4c6f-aa9c-3d3c1f826b71" containerName="rabbitmq" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715578 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1282b506-728d-4c6f-aa9c-3d3c1f826b71" containerName="rabbitmq" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715587 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44893df1-77c5-494c-bae0-253447abc8f4" containerName="proxy-server" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715592 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="44893df1-77c5-494c-bae0-253447abc8f4" containerName="proxy-server" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715598 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be46caa-0351-4f60-b16b-a258b9874a6f" containerName="extract-utilities" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715604 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be46caa-0351-4f60-b16b-a258b9874a6f" containerName="extract-utilities" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715610 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae83487-5f24-4934-aba5-9ee2ca6ca657" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715616 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae83487-5f24-4934-aba5-9ee2ca6ca657" containerName="mariadb-account-delete" Oct 02 11:26:13 crc 
kubenswrapper[4766]: E1002 11:26:13.715623 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9bc510-a878-4e06-8db9-fd6209039c75" containerName="mysql-bootstrap" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715629 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9bc510-a878-4e06-8db9-fd6209039c75" containerName="mysql-bootstrap" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715641 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" containerName="placement-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715647 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" containerName="placement-log" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715655 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d43eab0-4595-42fc-8489-38792e0c6e19" containerName="barbican-worker-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715662 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d43eab0-4595-42fc-8489-38792e0c6e19" containerName="barbican-worker-log" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715670 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e63ea453-c8bd-4128-a47e-7b0d740a6066" containerName="galera" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715677 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e63ea453-c8bd-4128-a47e-7b0d740a6066" containerName="galera" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715687 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d0079a-03e3-4e5f-81a2-81f5bceb795c" containerName="cinder-api-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715722 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d0079a-03e3-4e5f-81a2-81f5bceb795c" containerName="cinder-api-log" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715730 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d0079a-03e3-4e5f-81a2-81f5bceb795c" containerName="cinder-api" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715735 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d0079a-03e3-4e5f-81a2-81f5bceb795c" containerName="cinder-api" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715746 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3384223-2ad0-4593-976c-54c2d3cce52e" containerName="glance-httpd" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715752 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3384223-2ad0-4593-976c-54c2d3cce52e" containerName="glance-httpd" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715802 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="account-replicator" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715810 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="account-replicator" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715821 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="container-server" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715827 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="container-server" Oct 02 11:26:13 crc 
kubenswrapper[4766]: E1002 11:26:13.715838 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="account-auditor" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715844 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="account-auditor" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715853 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" containerName="placement-api" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715859 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" containerName="placement-api" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715868 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-server" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715873 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-server" Oct 02 11:26:13 crc kubenswrapper[4766]: E1002 11:26:13.715884 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9339929-4331-4cd9-89bc-8350ef2f55f5" containerName="barbican-keystone-listener" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.715889 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9339929-4331-4cd9-89bc-8350ef2f55f5" containerName="barbican-keystone-listener" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716181 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="container-server" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716192 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5a88cf-8095-4025-a68a-349c579dddd3" containerName="memcached" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716201 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="rsync" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716209 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9339929-4331-4cd9-89bc-8350ef2f55f5" containerName="barbican-keystone-listener-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716216 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbb13294-05c1-4a20-8265-5144efcd91cf" containerName="nova-metadata-metadata" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716224 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3384223-2ad0-4593-976c-54c2d3cce52e" containerName="glance-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716233 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0064fd48-390f-4a0f-abfe-9922c8c431f9" containerName="nova-cell1-conductor-conductor" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716241 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b90dab1-a183-4adc-b415-b67bd0d782f7" containerName="openstack-network-exporter" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716249 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="94c8c5ed-b069-4112-ae71-d9071bc15ff2" containerName="barbican-api-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716259 4766 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="b869942b-07a4-4a08-b312-2b09cee2abf1" containerName="nova-api-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716271 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-expirer" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716282 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="container-replicator" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716288 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="container-auditor" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716294 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec8cdac7-81c9-41e7-a956-41d13e5b91a6" containerName="neutron-httpd" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716303 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d0079a-03e3-4e5f-81a2-81f5bceb795c" containerName="cinder-api" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716309 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d43eab0-4595-42fc-8489-38792e0c6e19" containerName="barbican-worker-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716319 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="94c8c5ed-b069-4112-ae71-d9071bc15ff2" containerName="barbican-api" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716327 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7498a37c-33a3-4a3a-9c72-64a0c533282c" containerName="ovn-northd" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716335 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b90dab1-a183-4adc-b415-b67bd0d782f7" containerName="ovsdbserver-nb" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716354 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbb13294-05c1-4a20-8265-5144efcd91cf" containerName="nova-metadata-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716365 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="account-reaper" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716375 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a739206-d877-4212-9242-47a59c440b40" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716384 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d43eab0-4595-42fc-8489-38792e0c6e19" containerName="barbican-worker" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716390 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5ac374-df46-4a36-947d-de07af25426c" containerName="ceilometer-central-agent" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716400 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="44893df1-77c5-494c-bae0-253447abc8f4" containerName="proxy-server" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716408 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="container-updater" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716416 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovs-vswitchd" Oct 
02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716424 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae83487-5f24-4934-aba5-9ee2ca6ca657" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716433 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="95842554-1651-4c34-b934-d4eb21c6c52d" containerName="openstack-network-exporter" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716443 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-updater" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716448 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="swift-recon-cron" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716458 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dec5495-e66b-4e5e-90b6-82ee673ab269" containerName="nova-scheduler-scheduler" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716466 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e63ea453-c8bd-4128-a47e-7b0d740a6066" containerName="galera" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716476 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="account-server" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716484 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9860354f-7494-4b02-bca3-adc731683f7f" containerName="ovn-controller" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716493 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" containerName="glance-httpd" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716520 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-auditor" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716527 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="44893df1-77c5-494c-bae0-253447abc8f4" containerName="proxy-httpd" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716537 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae83487-5f24-4934-aba5-9ee2ca6ca657" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716544 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec8cdac7-81c9-41e7-a956-41d13e5b91a6" containerName="neutron-api" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716553 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="874d062e-d2f8-462c-95b3-8f630b7120af" containerName="rabbitmq" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716562 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="db4400a2-c286-467e-b62d-a5cb3042aa88" containerName="nova-cell0-conductor-conductor" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716569 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2be5e935-0d64-4fed-a00a-bd0cb5891e75" containerName="openstack-network-exporter" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716578 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="123b65f7-a8e8-434b-baf1-e9b0d3a985d9" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 
11:26:13.716585 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b9bc510-a878-4e06-8db9-fd6209039c75" containerName="galera" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716591 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" containerName="placement-api" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716597 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5ac374-df46-4a36-947d-de07af25426c" containerName="ceilometer-notification-agent" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716612 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="52bdcfda-75b3-450b-9db4-1a443be18fa3" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716619 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" containerName="probe" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716627 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-server" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716636 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be46caa-0351-4f60-b16b-a258b9874a6f" containerName="registry-server" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716646 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fdf37c9-9a32-4103-8418-198d45d14415" containerName="dnsmasq-dns" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716654 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd6760c-af87-4e2a-adcd-5fe3ca636fef" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716663 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d0079a-03e3-4e5f-81a2-81f5bceb795c" containerName="cinder-api-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716670 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d90db976-cd03-4eb7-8e1d-361ef7c5045b" containerName="ovsdb-server" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716678 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="account-replicator" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716685 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eb84667-7ff3-441c-ab7c-ccc4fc9233ca" containerName="placement-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716694 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5ac374-df46-4a36-947d-de07af25426c" containerName="sg-core" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716700 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3cd6a1-f457-4e7c-93ce-fce8c4a1d598" containerName="glance-log" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716714 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea7203d-5727-485f-8a6a-5bde96d05078" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716722 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7498a37c-33a3-4a3a-9c72-64a0c533282c" containerName="openstack-network-exporter" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716732 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d9339929-4331-4cd9-89bc-8350ef2f55f5" containerName="barbican-keystone-listener" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716741 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b869942b-07a4-4a08-b312-2b09cee2abf1" containerName="nova-api-api" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716747 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3384223-2ad0-4593-976c-54c2d3cce52e" containerName="glance-httpd" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716754 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1282b506-728d-4c6f-aa9c-3d3c1f826b71" containerName="rabbitmq" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716762 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fac2bf1-fb0f-4031-bfc8-34090cc90c8a" containerName="mariadb-account-delete" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716769 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2be5e935-0d64-4fed-a00a-bd0cb5891e75" containerName="ovsdbserver-sb" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716777 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a8ba140-6dc8-4023-9789-7f288b85159b" containerName="kube-state-metrics" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716785 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="object-replicator" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716792 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="07e3dfd6-c718-4304-9770-edbbfaca9cf4" containerName="keystone-api" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716797 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b3b4ffb-195e-4aba-b3bc-9969b56c01d6" containerName="cinder-scheduler" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716804 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba556fb-6ff5-4418-a2b9-f26a51003d79" containerName="account-auditor" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.716809 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5ac374-df46-4a36-947d-de07af25426c" containerName="proxy-httpd" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.717983 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.724056 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2zfp"] Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.826840 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af4071a1-baa1-4890-8325-1c174838e696-catalog-content\") pod \"redhat-marketplace-r2zfp\" (UID: \"af4071a1-baa1-4890-8325-1c174838e696\") " pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.826904 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af4071a1-baa1-4890-8325-1c174838e696-utilities\") pod \"redhat-marketplace-r2zfp\" (UID: \"af4071a1-baa1-4890-8325-1c174838e696\") " pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.826953 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv7xw\" (UniqueName: \"kubernetes.io/projected/af4071a1-baa1-4890-8325-1c174838e696-kube-api-access-zv7xw\") pod \"redhat-marketplace-r2zfp\" (UID: \"af4071a1-baa1-4890-8325-1c174838e696\") " pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.928651 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af4071a1-baa1-4890-8325-1c174838e696-catalog-content\") pod \"redhat-marketplace-r2zfp\" (UID: \"af4071a1-baa1-4890-8325-1c174838e696\") " pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.928733 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af4071a1-baa1-4890-8325-1c174838e696-utilities\") pod \"redhat-marketplace-r2zfp\" (UID: \"af4071a1-baa1-4890-8325-1c174838e696\") " pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.928787 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv7xw\" (UniqueName: \"kubernetes.io/projected/af4071a1-baa1-4890-8325-1c174838e696-kube-api-access-zv7xw\") pod \"redhat-marketplace-r2zfp\" (UID: \"af4071a1-baa1-4890-8325-1c174838e696\") " pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.929405 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af4071a1-baa1-4890-8325-1c174838e696-catalog-content\") pod \"redhat-marketplace-r2zfp\" (UID: \"af4071a1-baa1-4890-8325-1c174838e696\") " pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.929444 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af4071a1-baa1-4890-8325-1c174838e696-utilities\") pod \"redhat-marketplace-r2zfp\" (UID: \"af4071a1-baa1-4890-8325-1c174838e696\") " pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:13 crc kubenswrapper[4766]: I1002 11:26:13.948454 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zv7xw\" (UniqueName: \"kubernetes.io/projected/af4071a1-baa1-4890-8325-1c174838e696-kube-api-access-zv7xw\") pod \"redhat-marketplace-r2zfp\" (UID: \"af4071a1-baa1-4890-8325-1c174838e696\") " pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:14 crc kubenswrapper[4766]: I1002 11:26:14.044770 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:14 crc kubenswrapper[4766]: I1002 11:26:14.323650 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2zfp"] Oct 02 11:26:15 crc kubenswrapper[4766]: I1002 11:26:15.296111 4766 generic.go:334] "Generic (PLEG): container finished" podID="af4071a1-baa1-4890-8325-1c174838e696" containerID="75011b63d352c8cb16e55ab88b96e0fab30eab51fea4dda068d7e846453e69b9" exitCode=0 Oct 02 11:26:15 crc kubenswrapper[4766]: I1002 11:26:15.296176 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2zfp" event={"ID":"af4071a1-baa1-4890-8325-1c174838e696","Type":"ContainerDied","Data":"75011b63d352c8cb16e55ab88b96e0fab30eab51fea4dda068d7e846453e69b9"} Oct 02 11:26:15 crc kubenswrapper[4766]: I1002 11:26:15.296452 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2zfp" event={"ID":"af4071a1-baa1-4890-8325-1c174838e696","Type":"ContainerStarted","Data":"ab45453bd392c7af501ca1e82da747e67079c92cf99a6c2dc636fbbff2b8309b"} Oct 02 11:26:15 crc kubenswrapper[4766]: I1002 11:26:15.298843 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:26:16 crc kubenswrapper[4766]: I1002 11:26:16.304552 4766 generic.go:334] "Generic (PLEG): container finished" podID="af4071a1-baa1-4890-8325-1c174838e696" containerID="3e904eec2715f9043b85625f8027a399623bace08e29d43e08f34ef69cb20613" exitCode=0 Oct 02 11:26:16 crc kubenswrapper[4766]: I1002 11:26:16.304679 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2zfp" event={"ID":"af4071a1-baa1-4890-8325-1c174838e696","Type":"ContainerDied","Data":"3e904eec2715f9043b85625f8027a399623bace08e29d43e08f34ef69cb20613"} Oct 02 11:26:17 crc kubenswrapper[4766]: I1002 11:26:17.312809 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2zfp" event={"ID":"af4071a1-baa1-4890-8325-1c174838e696","Type":"ContainerStarted","Data":"1c16917d6e25e3ea0295ccfe0da70d563aad11c68eb99a0837ad59e8a12c5950"} Oct 02 11:26:17 crc kubenswrapper[4766]: I1002 11:26:17.331758 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r2zfp" podStartSLOduration=2.84531683 podStartE2EDuration="4.331736394s" podCreationTimestamp="2025-10-02 11:26:13 +0000 UTC" firstStartedPulling="2025-10-02 11:26:15.298592967 +0000 UTC m=+2090.241463911" lastFinishedPulling="2025-10-02 11:26:16.785012531 +0000 UTC m=+2091.727883475" observedRunningTime="2025-10-02 11:26:17.328221242 +0000 UTC m=+2092.271092216" watchObservedRunningTime="2025-10-02 11:26:17.331736394 +0000 UTC m=+2092.274607328" Oct 02 11:26:24 crc kubenswrapper[4766]: I1002 11:26:24.046065 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:24 crc kubenswrapper[4766]: I1002 11:26:24.046592 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:24 crc kubenswrapper[4766]: I1002 11:26:24.088735 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:24 crc kubenswrapper[4766]: I1002 11:26:24.404583 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:24 crc kubenswrapper[4766]: I1002 11:26:24.444238 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2zfp"] Oct 02 11:26:26 crc kubenswrapper[4766]: I1002 11:26:26.368468 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-r2zfp" podUID="af4071a1-baa1-4890-8325-1c174838e696" containerName="registry-server" containerID="cri-o://1c16917d6e25e3ea0295ccfe0da70d563aad11c68eb99a0837ad59e8a12c5950" gracePeriod=2 Oct 02 11:26:26 crc kubenswrapper[4766]: I1002 11:26:26.734784 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:26 crc kubenswrapper[4766]: I1002 11:26:26.826151 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af4071a1-baa1-4890-8325-1c174838e696-catalog-content\") pod \"af4071a1-baa1-4890-8325-1c174838e696\" (UID: \"af4071a1-baa1-4890-8325-1c174838e696\") " Oct 02 11:26:26 crc kubenswrapper[4766]: I1002 11:26:26.826199 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv7xw\" (UniqueName: \"kubernetes.io/projected/af4071a1-baa1-4890-8325-1c174838e696-kube-api-access-zv7xw\") pod \"af4071a1-baa1-4890-8325-1c174838e696\" (UID: \"af4071a1-baa1-4890-8325-1c174838e696\") " Oct 02 11:26:26 crc kubenswrapper[4766]: I1002 11:26:26.826268 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af4071a1-baa1-4890-8325-1c174838e696-utilities\") pod \"af4071a1-baa1-4890-8325-1c174838e696\" (UID: \"af4071a1-baa1-4890-8325-1c174838e696\") " Oct 02 11:26:26 crc kubenswrapper[4766]: I1002 11:26:26.827243 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af4071a1-baa1-4890-8325-1c174838e696-utilities" (OuterVolumeSpecName: "utilities") pod "af4071a1-baa1-4890-8325-1c174838e696" (UID: "af4071a1-baa1-4890-8325-1c174838e696"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:26:26 crc kubenswrapper[4766]: I1002 11:26:26.835706 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af4071a1-baa1-4890-8325-1c174838e696-kube-api-access-zv7xw" (OuterVolumeSpecName: "kube-api-access-zv7xw") pod "af4071a1-baa1-4890-8325-1c174838e696" (UID: "af4071a1-baa1-4890-8325-1c174838e696"). InnerVolumeSpecName "kube-api-access-zv7xw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:26:26 crc kubenswrapper[4766]: I1002 11:26:26.853914 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af4071a1-baa1-4890-8325-1c174838e696-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af4071a1-baa1-4890-8325-1c174838e696" (UID: "af4071a1-baa1-4890-8325-1c174838e696"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:26:26 crc kubenswrapper[4766]: I1002 11:26:26.928015 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af4071a1-baa1-4890-8325-1c174838e696-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:26 crc kubenswrapper[4766]: I1002 11:26:26.928253 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv7xw\" (UniqueName: \"kubernetes.io/projected/af4071a1-baa1-4890-8325-1c174838e696-kube-api-access-zv7xw\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:26 crc kubenswrapper[4766]: I1002 11:26:26.928270 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af4071a1-baa1-4890-8325-1c174838e696-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:26:27 crc kubenswrapper[4766]: I1002 11:26:27.376496 4766 generic.go:334] "Generic (PLEG): container finished" podID="af4071a1-baa1-4890-8325-1c174838e696" containerID="1c16917d6e25e3ea0295ccfe0da70d563aad11c68eb99a0837ad59e8a12c5950" exitCode=0 Oct 02 11:26:27 crc kubenswrapper[4766]: I1002 11:26:27.376556 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2zfp" event={"ID":"af4071a1-baa1-4890-8325-1c174838e696","Type":"ContainerDied","Data":"1c16917d6e25e3ea0295ccfe0da70d563aad11c68eb99a0837ad59e8a12c5950"} Oct 02 11:26:27 crc kubenswrapper[4766]: I1002 11:26:27.376586 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2zfp" event={"ID":"af4071a1-baa1-4890-8325-1c174838e696","Type":"ContainerDied","Data":"ab45453bd392c7af501ca1e82da747e67079c92cf99a6c2dc636fbbff2b8309b"} Oct 02 11:26:27 crc kubenswrapper[4766]: I1002 11:26:27.376605 4766 scope.go:117] "RemoveContainer" containerID="1c16917d6e25e3ea0295ccfe0da70d563aad11c68eb99a0837ad59e8a12c5950" Oct 02 11:26:27 crc kubenswrapper[4766]: I1002 11:26:27.376737 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2zfp" Oct 02 11:26:27 crc kubenswrapper[4766]: I1002 11:26:27.409546 4766 scope.go:117] "RemoveContainer" containerID="3e904eec2715f9043b85625f8027a399623bace08e29d43e08f34ef69cb20613" Oct 02 11:26:27 crc kubenswrapper[4766]: I1002 11:26:27.410348 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2zfp"] Oct 02 11:26:27 crc kubenswrapper[4766]: I1002 11:26:27.416589 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2zfp"] Oct 02 11:26:27 crc kubenswrapper[4766]: I1002 11:26:27.424195 4766 scope.go:117] "RemoveContainer" containerID="75011b63d352c8cb16e55ab88b96e0fab30eab51fea4dda068d7e846453e69b9" Oct 02 11:26:27 crc kubenswrapper[4766]: I1002 11:26:27.452036 4766 scope.go:117] "RemoveContainer" containerID="1c16917d6e25e3ea0295ccfe0da70d563aad11c68eb99a0837ad59e8a12c5950" Oct 02 11:26:27 crc kubenswrapper[4766]: E1002 11:26:27.452567 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c16917d6e25e3ea0295ccfe0da70d563aad11c68eb99a0837ad59e8a12c5950\": container with ID starting with 1c16917d6e25e3ea0295ccfe0da70d563aad11c68eb99a0837ad59e8a12c5950 not found: ID does not exist" containerID="1c16917d6e25e3ea0295ccfe0da70d563aad11c68eb99a0837ad59e8a12c5950" Oct 02 11:26:27 crc kubenswrapper[4766]: I1002 11:26:27.452620 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c16917d6e25e3ea0295ccfe0da70d563aad11c68eb99a0837ad59e8a12c5950"} err="failed to get container status \"1c16917d6e25e3ea0295ccfe0da70d563aad11c68eb99a0837ad59e8a12c5950\": rpc error: code = NotFound desc = could not find container \"1c16917d6e25e3ea0295ccfe0da70d563aad11c68eb99a0837ad59e8a12c5950\": container with ID starting with 1c16917d6e25e3ea0295ccfe0da70d563aad11c68eb99a0837ad59e8a12c5950 not found: ID does not exist" Oct 02 11:26:27 crc kubenswrapper[4766]: I1002 11:26:27.452654 4766 scope.go:117] "RemoveContainer" containerID="3e904eec2715f9043b85625f8027a399623bace08e29d43e08f34ef69cb20613" Oct 02 11:26:27 crc kubenswrapper[4766]: E1002 11:26:27.453133 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e904eec2715f9043b85625f8027a399623bace08e29d43e08f34ef69cb20613\": container with ID starting with 3e904eec2715f9043b85625f8027a399623bace08e29d43e08f34ef69cb20613 not found: ID does not exist" containerID="3e904eec2715f9043b85625f8027a399623bace08e29d43e08f34ef69cb20613" Oct 02 11:26:27 crc kubenswrapper[4766]: I1002 11:26:27.453191 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e904eec2715f9043b85625f8027a399623bace08e29d43e08f34ef69cb20613"} err="failed to get container status \"3e904eec2715f9043b85625f8027a399623bace08e29d43e08f34ef69cb20613\": rpc error: code = NotFound desc = could not find container \"3e904eec2715f9043b85625f8027a399623bace08e29d43e08f34ef69cb20613\": container with ID starting with 3e904eec2715f9043b85625f8027a399623bace08e29d43e08f34ef69cb20613 not found: ID does not exist" Oct 02 11:26:27 crc kubenswrapper[4766]: I1002 11:26:27.453215 4766 scope.go:117] "RemoveContainer" containerID="75011b63d352c8cb16e55ab88b96e0fab30eab51fea4dda068d7e846453e69b9" Oct 02 11:26:27 crc kubenswrapper[4766]: E1002 11:26:27.453592 4766 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"75011b63d352c8cb16e55ab88b96e0fab30eab51fea4dda068d7e846453e69b9\": container with ID starting with 75011b63d352c8cb16e55ab88b96e0fab30eab51fea4dda068d7e846453e69b9 not found: ID does not exist" containerID="75011b63d352c8cb16e55ab88b96e0fab30eab51fea4dda068d7e846453e69b9" Oct 02 11:26:27 crc kubenswrapper[4766]: I1002 11:26:27.453634 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75011b63d352c8cb16e55ab88b96e0fab30eab51fea4dda068d7e846453e69b9"} err="failed to get container status \"75011b63d352c8cb16e55ab88b96e0fab30eab51fea4dda068d7e846453e69b9\": rpc error: code = NotFound desc = could not find container \"75011b63d352c8cb16e55ab88b96e0fab30eab51fea4dda068d7e846453e69b9\": container with ID starting with 75011b63d352c8cb16e55ab88b96e0fab30eab51fea4dda068d7e846453e69b9 not found: ID does not exist" Oct 02 11:26:27 crc kubenswrapper[4766]: I1002 11:26:27.890777 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af4071a1-baa1-4890-8325-1c174838e696" path="/var/lib/kubelet/pods/af4071a1-baa1-4890-8325-1c174838e696/volumes" Oct 02 11:27:24 crc kubenswrapper[4766]: I1002 11:27:24.432022 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:27:24 crc kubenswrapper[4766]: I1002 11:27:24.432777 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.067112 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vrdd5"] Oct 02 11:27:46 crc kubenswrapper[4766]: E1002 11:27:46.068045 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af4071a1-baa1-4890-8325-1c174838e696" containerName="registry-server" Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.068060 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="af4071a1-baa1-4890-8325-1c174838e696" containerName="registry-server" Oct 02 11:27:46 crc kubenswrapper[4766]: E1002 11:27:46.068079 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af4071a1-baa1-4890-8325-1c174838e696" containerName="extract-utilities" Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.068086 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="af4071a1-baa1-4890-8325-1c174838e696" containerName="extract-utilities" Oct 02 11:27:46 crc kubenswrapper[4766]: E1002 11:27:46.068112 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af4071a1-baa1-4890-8325-1c174838e696" containerName="extract-content" Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.068118 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="af4071a1-baa1-4890-8325-1c174838e696" containerName="extract-content" Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.068250 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="af4071a1-baa1-4890-8325-1c174838e696" containerName="registry-server" Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 
11:27:46.069361 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.074220 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vrdd5"] Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.121216 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-catalog-content\") pod \"certified-operators-vrdd5\" (UID: \"7fd9cbbd-a1ef-4cda-afd9-c2287891c199\") " pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.121531 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvk46\" (UniqueName: \"kubernetes.io/projected/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-kube-api-access-lvk46\") pod \"certified-operators-vrdd5\" (UID: \"7fd9cbbd-a1ef-4cda-afd9-c2287891c199\") " pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.121554 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-utilities\") pod \"certified-operators-vrdd5\" (UID: \"7fd9cbbd-a1ef-4cda-afd9-c2287891c199\") " pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.223154 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-catalog-content\") pod \"certified-operators-vrdd5\" (UID: \"7fd9cbbd-a1ef-4cda-afd9-c2287891c199\") " pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.223240 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvk46\" (UniqueName: \"kubernetes.io/projected/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-kube-api-access-lvk46\") pod \"certified-operators-vrdd5\" (UID: \"7fd9cbbd-a1ef-4cda-afd9-c2287891c199\") " pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.223267 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-utilities\") pod \"certified-operators-vrdd5\" (UID: \"7fd9cbbd-a1ef-4cda-afd9-c2287891c199\") " pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.223911 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-utilities\") pod \"certified-operators-vrdd5\" (UID: \"7fd9cbbd-a1ef-4cda-afd9-c2287891c199\") " pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.224086 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-catalog-content\") pod \"certified-operators-vrdd5\" (UID: \"7fd9cbbd-a1ef-4cda-afd9-c2287891c199\") " pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:27:46 crc 
kubenswrapper[4766]: I1002 11:27:46.251295 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvk46\" (UniqueName: \"kubernetes.io/projected/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-kube-api-access-lvk46\") pod \"certified-operators-vrdd5\" (UID: \"7fd9cbbd-a1ef-4cda-afd9-c2287891c199\") " pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.388925 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.644308 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vrdd5"] Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.920841 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrdd5" event={"ID":"7fd9cbbd-a1ef-4cda-afd9-c2287891c199","Type":"ContainerStarted","Data":"fe63daf84a1aafda4ff514884184e304200041c5b7dc236aa77c650cf204d984"} Oct 02 11:27:46 crc kubenswrapper[4766]: I1002 11:27:46.920901 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrdd5" event={"ID":"7fd9cbbd-a1ef-4cda-afd9-c2287891c199","Type":"ContainerStarted","Data":"96f025c56400ff65e9b4a4c7cae2ea14b99ce71f30cab2d51e580a7e4998e414"} Oct 02 11:27:47 crc kubenswrapper[4766]: I1002 11:27:47.928682 4766 generic.go:334] "Generic (PLEG): container finished" podID="7fd9cbbd-a1ef-4cda-afd9-c2287891c199" containerID="fe63daf84a1aafda4ff514884184e304200041c5b7dc236aa77c650cf204d984" exitCode=0 Oct 02 11:27:47 crc kubenswrapper[4766]: I1002 11:27:47.928789 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrdd5" event={"ID":"7fd9cbbd-a1ef-4cda-afd9-c2287891c199","Type":"ContainerDied","Data":"fe63daf84a1aafda4ff514884184e304200041c5b7dc236aa77c650cf204d984"} Oct 02 11:27:49 crc kubenswrapper[4766]: I1002 11:27:49.944859 4766 generic.go:334] "Generic (PLEG): container finished" podID="7fd9cbbd-a1ef-4cda-afd9-c2287891c199" containerID="fe254f962d5b4e53ad637509fda4c69f461d56dfeadbd682f7234e69e31c267d" exitCode=0 Oct 02 11:27:49 crc kubenswrapper[4766]: I1002 11:27:49.945393 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrdd5" event={"ID":"7fd9cbbd-a1ef-4cda-afd9-c2287891c199","Type":"ContainerDied","Data":"fe254f962d5b4e53ad637509fda4c69f461d56dfeadbd682f7234e69e31c267d"} Oct 02 11:27:50 crc kubenswrapper[4766]: I1002 11:27:50.955784 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrdd5" event={"ID":"7fd9cbbd-a1ef-4cda-afd9-c2287891c199","Type":"ContainerStarted","Data":"5ca09ae8ca2914ec5bc87d81be79bb67751d7b0f49a4115f1da239a4dd2a5408"} Oct 02 11:27:50 crc kubenswrapper[4766]: I1002 11:27:50.978598 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vrdd5" podStartSLOduration=2.427270023 podStartE2EDuration="4.978571829s" podCreationTimestamp="2025-10-02 11:27:46 +0000 UTC" firstStartedPulling="2025-10-02 11:27:47.930889993 +0000 UTC m=+2182.873760937" lastFinishedPulling="2025-10-02 11:27:50.482191799 +0000 UTC m=+2185.425062743" observedRunningTime="2025-10-02 11:27:50.972791244 +0000 UTC m=+2185.915662188" watchObservedRunningTime="2025-10-02 11:27:50.978571829 +0000 UTC m=+2185.921442773" Oct 02 11:27:51 crc kubenswrapper[4766]: I1002 
11:27:51.844133 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6zc4p"] Oct 02 11:27:51 crc kubenswrapper[4766]: I1002 11:27:51.845948 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:27:51 crc kubenswrapper[4766]: I1002 11:27:51.862461 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6zc4p"] Oct 02 11:27:51 crc kubenswrapper[4766]: I1002 11:27:51.904910 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f460d95-be9a-48f9-a37e-7d81697a0b90-utilities\") pod \"redhat-operators-6zc4p\" (UID: \"8f460d95-be9a-48f9-a37e-7d81697a0b90\") " pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:27:51 crc kubenswrapper[4766]: I1002 11:27:51.905263 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f460d95-be9a-48f9-a37e-7d81697a0b90-catalog-content\") pod \"redhat-operators-6zc4p\" (UID: \"8f460d95-be9a-48f9-a37e-7d81697a0b90\") " pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:27:51 crc kubenswrapper[4766]: I1002 11:27:51.905331 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxm8q\" (UniqueName: \"kubernetes.io/projected/8f460d95-be9a-48f9-a37e-7d81697a0b90-kube-api-access-lxm8q\") pod \"redhat-operators-6zc4p\" (UID: \"8f460d95-be9a-48f9-a37e-7d81697a0b90\") " pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:27:52 crc kubenswrapper[4766]: I1002 11:27:52.006725 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f460d95-be9a-48f9-a37e-7d81697a0b90-catalog-content\") pod \"redhat-operators-6zc4p\" (UID: \"8f460d95-be9a-48f9-a37e-7d81697a0b90\") " pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:27:52 crc kubenswrapper[4766]: I1002 11:27:52.006783 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxm8q\" (UniqueName: \"kubernetes.io/projected/8f460d95-be9a-48f9-a37e-7d81697a0b90-kube-api-access-lxm8q\") pod \"redhat-operators-6zc4p\" (UID: \"8f460d95-be9a-48f9-a37e-7d81697a0b90\") " pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:27:52 crc kubenswrapper[4766]: I1002 11:27:52.006837 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f460d95-be9a-48f9-a37e-7d81697a0b90-utilities\") pod \"redhat-operators-6zc4p\" (UID: \"8f460d95-be9a-48f9-a37e-7d81697a0b90\") " pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:27:52 crc kubenswrapper[4766]: I1002 11:27:52.007363 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f460d95-be9a-48f9-a37e-7d81697a0b90-utilities\") pod \"redhat-operators-6zc4p\" (UID: \"8f460d95-be9a-48f9-a37e-7d81697a0b90\") " pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:27:52 crc kubenswrapper[4766]: I1002 11:27:52.007365 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f460d95-be9a-48f9-a37e-7d81697a0b90-catalog-content\") pod \"redhat-operators-6zc4p\" (UID: 
\"8f460d95-be9a-48f9-a37e-7d81697a0b90\") " pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:27:52 crc kubenswrapper[4766]: I1002 11:27:52.031564 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxm8q\" (UniqueName: \"kubernetes.io/projected/8f460d95-be9a-48f9-a37e-7d81697a0b90-kube-api-access-lxm8q\") pod \"redhat-operators-6zc4p\" (UID: \"8f460d95-be9a-48f9-a37e-7d81697a0b90\") " pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:27:52 crc kubenswrapper[4766]: I1002 11:27:52.166385 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:27:52 crc kubenswrapper[4766]: I1002 11:27:52.588343 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6zc4p"] Oct 02 11:27:52 crc kubenswrapper[4766]: I1002 11:27:52.969127 4766 generic.go:334] "Generic (PLEG): container finished" podID="8f460d95-be9a-48f9-a37e-7d81697a0b90" containerID="f5e52b16d693a52381e64ec8d2fb4dfa40adb4ba1bca4a700370387afdc3bd95" exitCode=0 Oct 02 11:27:52 crc kubenswrapper[4766]: I1002 11:27:52.969222 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zc4p" event={"ID":"8f460d95-be9a-48f9-a37e-7d81697a0b90","Type":"ContainerDied","Data":"f5e52b16d693a52381e64ec8d2fb4dfa40adb4ba1bca4a700370387afdc3bd95"} Oct 02 11:27:52 crc kubenswrapper[4766]: I1002 11:27:52.969585 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zc4p" event={"ID":"8f460d95-be9a-48f9-a37e-7d81697a0b90","Type":"ContainerStarted","Data":"5c7e1fee47674f09ffddecc1c1048d6405805d01d0cb975503616ed19b4aba46"} Oct 02 11:27:54 crc kubenswrapper[4766]: I1002 11:27:54.433192 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:27:54 crc kubenswrapper[4766]: I1002 11:27:54.433300 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:27:54 crc kubenswrapper[4766]: I1002 11:27:54.984173 4766 generic.go:334] "Generic (PLEG): container finished" podID="8f460d95-be9a-48f9-a37e-7d81697a0b90" containerID="7a7b6253e7cff105168cbf1816b7b3b1ba8da8521311d0e676646119189ac1f3" exitCode=0 Oct 02 11:27:54 crc kubenswrapper[4766]: I1002 11:27:54.984271 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zc4p" event={"ID":"8f460d95-be9a-48f9-a37e-7d81697a0b90","Type":"ContainerDied","Data":"7a7b6253e7cff105168cbf1816b7b3b1ba8da8521311d0e676646119189ac1f3"} Oct 02 11:27:55 crc kubenswrapper[4766]: I1002 11:27:55.994956 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zc4p" event={"ID":"8f460d95-be9a-48f9-a37e-7d81697a0b90","Type":"ContainerStarted","Data":"d196c10b878fa968e0a2e2e43fab211af28592fee0f670c31f6099ec848fc31e"} Oct 02 11:27:56 crc kubenswrapper[4766]: I1002 11:27:56.013324 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-6zc4p" podStartSLOduration=2.514945914 podStartE2EDuration="5.013305563s" podCreationTimestamp="2025-10-02 11:27:51 +0000 UTC" firstStartedPulling="2025-10-02 11:27:52.971072923 +0000 UTC m=+2187.913943867" lastFinishedPulling="2025-10-02 11:27:55.469432572 +0000 UTC m=+2190.412303516" observedRunningTime="2025-10-02 11:27:56.013164849 +0000 UTC m=+2190.956035793" watchObservedRunningTime="2025-10-02 11:27:56.013305563 +0000 UTC m=+2190.956176507" Oct 02 11:27:56 crc kubenswrapper[4766]: I1002 11:27:56.389845 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:27:56 crc kubenswrapper[4766]: I1002 11:27:56.389917 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:27:56 crc kubenswrapper[4766]: I1002 11:27:56.434968 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:27:57 crc kubenswrapper[4766]: I1002 11:27:57.045223 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:27:57 crc kubenswrapper[4766]: I1002 11:27:57.640943 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vrdd5"] Oct 02 11:27:59 crc kubenswrapper[4766]: I1002 11:27:59.012106 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vrdd5" podUID="7fd9cbbd-a1ef-4cda-afd9-c2287891c199" containerName="registry-server" containerID="cri-o://5ca09ae8ca2914ec5bc87d81be79bb67751d7b0f49a4115f1da239a4dd2a5408" gracePeriod=2 Oct 02 11:27:59 crc kubenswrapper[4766]: I1002 11:27:59.417342 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:27:59 crc kubenswrapper[4766]: I1002 11:27:59.515746 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-catalog-content\") pod \"7fd9cbbd-a1ef-4cda-afd9-c2287891c199\" (UID: \"7fd9cbbd-a1ef-4cda-afd9-c2287891c199\") " Oct 02 11:27:59 crc kubenswrapper[4766]: I1002 11:27:59.515852 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvk46\" (UniqueName: \"kubernetes.io/projected/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-kube-api-access-lvk46\") pod \"7fd9cbbd-a1ef-4cda-afd9-c2287891c199\" (UID: \"7fd9cbbd-a1ef-4cda-afd9-c2287891c199\") " Oct 02 11:27:59 crc kubenswrapper[4766]: I1002 11:27:59.515884 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-utilities\") pod \"7fd9cbbd-a1ef-4cda-afd9-c2287891c199\" (UID: \"7fd9cbbd-a1ef-4cda-afd9-c2287891c199\") " Oct 02 11:27:59 crc kubenswrapper[4766]: I1002 11:27:59.516875 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-utilities" (OuterVolumeSpecName: "utilities") pod "7fd9cbbd-a1ef-4cda-afd9-c2287891c199" (UID: "7fd9cbbd-a1ef-4cda-afd9-c2287891c199"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:27:59 crc kubenswrapper[4766]: I1002 11:27:59.522531 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-kube-api-access-lvk46" (OuterVolumeSpecName: "kube-api-access-lvk46") pod "7fd9cbbd-a1ef-4cda-afd9-c2287891c199" (UID: "7fd9cbbd-a1ef-4cda-afd9-c2287891c199"). InnerVolumeSpecName "kube-api-access-lvk46". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:27:59 crc kubenswrapper[4766]: I1002 11:27:59.617555 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvk46\" (UniqueName: \"kubernetes.io/projected/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-kube-api-access-lvk46\") on node \"crc\" DevicePath \"\"" Oct 02 11:27:59 crc kubenswrapper[4766]: I1002 11:27:59.617823 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:28:00 crc kubenswrapper[4766]: I1002 11:28:00.021858 4766 generic.go:334] "Generic (PLEG): container finished" podID="7fd9cbbd-a1ef-4cda-afd9-c2287891c199" containerID="5ca09ae8ca2914ec5bc87d81be79bb67751d7b0f49a4115f1da239a4dd2a5408" exitCode=0 Oct 02 11:28:00 crc kubenswrapper[4766]: I1002 11:28:00.021963 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vrdd5" Oct 02 11:28:00 crc kubenswrapper[4766]: I1002 11:28:00.021954 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrdd5" event={"ID":"7fd9cbbd-a1ef-4cda-afd9-c2287891c199","Type":"ContainerDied","Data":"5ca09ae8ca2914ec5bc87d81be79bb67751d7b0f49a4115f1da239a4dd2a5408"} Oct 02 11:28:00 crc kubenswrapper[4766]: I1002 11:28:00.022607 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vrdd5" event={"ID":"7fd9cbbd-a1ef-4cda-afd9-c2287891c199","Type":"ContainerDied","Data":"96f025c56400ff65e9b4a4c7cae2ea14b99ce71f30cab2d51e580a7e4998e414"} Oct 02 11:28:00 crc kubenswrapper[4766]: I1002 11:28:00.022630 4766 scope.go:117] "RemoveContainer" containerID="5ca09ae8ca2914ec5bc87d81be79bb67751d7b0f49a4115f1da239a4dd2a5408" Oct 02 11:28:00 crc kubenswrapper[4766]: I1002 11:28:00.039180 4766 scope.go:117] "RemoveContainer" containerID="fe254f962d5b4e53ad637509fda4c69f461d56dfeadbd682f7234e69e31c267d" Oct 02 11:28:00 crc kubenswrapper[4766]: I1002 11:28:00.065035 4766 scope.go:117] "RemoveContainer" containerID="fe63daf84a1aafda4ff514884184e304200041c5b7dc236aa77c650cf204d984" Oct 02 11:28:00 crc kubenswrapper[4766]: I1002 11:28:00.081772 4766 scope.go:117] "RemoveContainer" containerID="5ca09ae8ca2914ec5bc87d81be79bb67751d7b0f49a4115f1da239a4dd2a5408" Oct 02 11:28:00 crc kubenswrapper[4766]: E1002 11:28:00.082317 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ca09ae8ca2914ec5bc87d81be79bb67751d7b0f49a4115f1da239a4dd2a5408\": container with ID starting with 5ca09ae8ca2914ec5bc87d81be79bb67751d7b0f49a4115f1da239a4dd2a5408 not found: ID does not exist" containerID="5ca09ae8ca2914ec5bc87d81be79bb67751d7b0f49a4115f1da239a4dd2a5408" Oct 02 11:28:00 crc kubenswrapper[4766]: I1002 11:28:00.082378 4766 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5ca09ae8ca2914ec5bc87d81be79bb67751d7b0f49a4115f1da239a4dd2a5408"} err="failed to get container status \"5ca09ae8ca2914ec5bc87d81be79bb67751d7b0f49a4115f1da239a4dd2a5408\": rpc error: code = NotFound desc = could not find container \"5ca09ae8ca2914ec5bc87d81be79bb67751d7b0f49a4115f1da239a4dd2a5408\": container with ID starting with 5ca09ae8ca2914ec5bc87d81be79bb67751d7b0f49a4115f1da239a4dd2a5408 not found: ID does not exist" Oct 02 11:28:00 crc kubenswrapper[4766]: I1002 11:28:00.082412 4766 scope.go:117] "RemoveContainer" containerID="fe254f962d5b4e53ad637509fda4c69f461d56dfeadbd682f7234e69e31c267d" Oct 02 11:28:00 crc kubenswrapper[4766]: E1002 11:28:00.082889 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe254f962d5b4e53ad637509fda4c69f461d56dfeadbd682f7234e69e31c267d\": container with ID starting with fe254f962d5b4e53ad637509fda4c69f461d56dfeadbd682f7234e69e31c267d not found: ID does not exist" containerID="fe254f962d5b4e53ad637509fda4c69f461d56dfeadbd682f7234e69e31c267d" Oct 02 11:28:00 crc kubenswrapper[4766]: I1002 11:28:00.082917 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe254f962d5b4e53ad637509fda4c69f461d56dfeadbd682f7234e69e31c267d"} err="failed to get container status \"fe254f962d5b4e53ad637509fda4c69f461d56dfeadbd682f7234e69e31c267d\": rpc error: code = NotFound desc = could not find container \"fe254f962d5b4e53ad637509fda4c69f461d56dfeadbd682f7234e69e31c267d\": container with ID starting with fe254f962d5b4e53ad637509fda4c69f461d56dfeadbd682f7234e69e31c267d not found: ID does not exist" Oct 02 11:28:00 crc kubenswrapper[4766]: I1002 11:28:00.082936 4766 scope.go:117] "RemoveContainer" containerID="fe63daf84a1aafda4ff514884184e304200041c5b7dc236aa77c650cf204d984" Oct 02 11:28:00 crc kubenswrapper[4766]: E1002 11:28:00.083238 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe63daf84a1aafda4ff514884184e304200041c5b7dc236aa77c650cf204d984\": container with ID starting with fe63daf84a1aafda4ff514884184e304200041c5b7dc236aa77c650cf204d984 not found: ID does not exist" containerID="fe63daf84a1aafda4ff514884184e304200041c5b7dc236aa77c650cf204d984" Oct 02 11:28:00 crc kubenswrapper[4766]: I1002 11:28:00.083269 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe63daf84a1aafda4ff514884184e304200041c5b7dc236aa77c650cf204d984"} err="failed to get container status \"fe63daf84a1aafda4ff514884184e304200041c5b7dc236aa77c650cf204d984\": rpc error: code = NotFound desc = could not find container \"fe63daf84a1aafda4ff514884184e304200041c5b7dc236aa77c650cf204d984\": container with ID starting with fe63daf84a1aafda4ff514884184e304200041c5b7dc236aa77c650cf204d984 not found: ID does not exist" Oct 02 11:28:01 crc kubenswrapper[4766]: I1002 11:28:01.888819 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fd9cbbd-a1ef-4cda-afd9-c2287891c199" (UID: "7fd9cbbd-a1ef-4cda-afd9-c2287891c199"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:28:01 crc kubenswrapper[4766]: I1002 11:28:01.950211 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fd9cbbd-a1ef-4cda-afd9-c2287891c199-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:28:02 crc kubenswrapper[4766]: I1002 11:28:02.141598 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vrdd5"] Oct 02 11:28:02 crc kubenswrapper[4766]: I1002 11:28:02.146634 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vrdd5"] Oct 02 11:28:02 crc kubenswrapper[4766]: I1002 11:28:02.166889 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:28:02 crc kubenswrapper[4766]: I1002 11:28:02.166944 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:28:02 crc kubenswrapper[4766]: I1002 11:28:02.206365 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:28:03 crc kubenswrapper[4766]: I1002 11:28:03.082307 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:28:03 crc kubenswrapper[4766]: I1002 11:28:03.890444 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fd9cbbd-a1ef-4cda-afd9-c2287891c199" path="/var/lib/kubelet/pods/7fd9cbbd-a1ef-4cda-afd9-c2287891c199/volumes" Oct 02 11:28:04 crc kubenswrapper[4766]: I1002 11:28:04.037806 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6zc4p"] Oct 02 11:28:05 crc kubenswrapper[4766]: I1002 11:28:05.058304 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6zc4p" podUID="8f460d95-be9a-48f9-a37e-7d81697a0b90" containerName="registry-server" containerID="cri-o://d196c10b878fa968e0a2e2e43fab211af28592fee0f670c31f6099ec848fc31e" gracePeriod=2 Oct 02 11:28:05 crc kubenswrapper[4766]: I1002 11:28:05.433560 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:28:05 crc kubenswrapper[4766]: I1002 11:28:05.597017 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxm8q\" (UniqueName: \"kubernetes.io/projected/8f460d95-be9a-48f9-a37e-7d81697a0b90-kube-api-access-lxm8q\") pod \"8f460d95-be9a-48f9-a37e-7d81697a0b90\" (UID: \"8f460d95-be9a-48f9-a37e-7d81697a0b90\") " Oct 02 11:28:05 crc kubenswrapper[4766]: I1002 11:28:05.597055 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f460d95-be9a-48f9-a37e-7d81697a0b90-catalog-content\") pod \"8f460d95-be9a-48f9-a37e-7d81697a0b90\" (UID: \"8f460d95-be9a-48f9-a37e-7d81697a0b90\") " Oct 02 11:28:05 crc kubenswrapper[4766]: I1002 11:28:05.597088 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f460d95-be9a-48f9-a37e-7d81697a0b90-utilities\") pod \"8f460d95-be9a-48f9-a37e-7d81697a0b90\" (UID: \"8f460d95-be9a-48f9-a37e-7d81697a0b90\") " Oct 02 11:28:05 crc kubenswrapper[4766]: I1002 11:28:05.597921 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f460d95-be9a-48f9-a37e-7d81697a0b90-utilities" (OuterVolumeSpecName: "utilities") pod "8f460d95-be9a-48f9-a37e-7d81697a0b90" (UID: "8f460d95-be9a-48f9-a37e-7d81697a0b90"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:28:05 crc kubenswrapper[4766]: I1002 11:28:05.602950 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f460d95-be9a-48f9-a37e-7d81697a0b90-kube-api-access-lxm8q" (OuterVolumeSpecName: "kube-api-access-lxm8q") pod "8f460d95-be9a-48f9-a37e-7d81697a0b90" (UID: "8f460d95-be9a-48f9-a37e-7d81697a0b90"). InnerVolumeSpecName "kube-api-access-lxm8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:28:05 crc kubenswrapper[4766]: I1002 11:28:05.686856 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f460d95-be9a-48f9-a37e-7d81697a0b90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f460d95-be9a-48f9-a37e-7d81697a0b90" (UID: "8f460d95-be9a-48f9-a37e-7d81697a0b90"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:28:05 crc kubenswrapper[4766]: I1002 11:28:05.698575 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxm8q\" (UniqueName: \"kubernetes.io/projected/8f460d95-be9a-48f9-a37e-7d81697a0b90-kube-api-access-lxm8q\") on node \"crc\" DevicePath \"\"" Oct 02 11:28:05 crc kubenswrapper[4766]: I1002 11:28:05.699448 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f460d95-be9a-48f9-a37e-7d81697a0b90-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:28:05 crc kubenswrapper[4766]: I1002 11:28:05.699548 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f460d95-be9a-48f9-a37e-7d81697a0b90-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:28:06 crc kubenswrapper[4766]: I1002 11:28:06.068226 4766 generic.go:334] "Generic (PLEG): container finished" podID="8f460d95-be9a-48f9-a37e-7d81697a0b90" containerID="d196c10b878fa968e0a2e2e43fab211af28592fee0f670c31f6099ec848fc31e" exitCode=0 Oct 02 11:28:06 crc kubenswrapper[4766]: I1002 11:28:06.068309 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6zc4p" Oct 02 11:28:06 crc kubenswrapper[4766]: I1002 11:28:06.068335 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zc4p" event={"ID":"8f460d95-be9a-48f9-a37e-7d81697a0b90","Type":"ContainerDied","Data":"d196c10b878fa968e0a2e2e43fab211af28592fee0f670c31f6099ec848fc31e"} Oct 02 11:28:06 crc kubenswrapper[4766]: I1002 11:28:06.068590 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6zc4p" event={"ID":"8f460d95-be9a-48f9-a37e-7d81697a0b90","Type":"ContainerDied","Data":"5c7e1fee47674f09ffddecc1c1048d6405805d01d0cb975503616ed19b4aba46"} Oct 02 11:28:06 crc kubenswrapper[4766]: I1002 11:28:06.068617 4766 scope.go:117] "RemoveContainer" containerID="d196c10b878fa968e0a2e2e43fab211af28592fee0f670c31f6099ec848fc31e" Oct 02 11:28:06 crc kubenswrapper[4766]: I1002 11:28:06.090660 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6zc4p"] Oct 02 11:28:06 crc kubenswrapper[4766]: I1002 11:28:06.094976 4766 scope.go:117] "RemoveContainer" containerID="7a7b6253e7cff105168cbf1816b7b3b1ba8da8521311d0e676646119189ac1f3" Oct 02 11:28:06 crc kubenswrapper[4766]: I1002 11:28:06.095905 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6zc4p"] Oct 02 11:28:06 crc kubenswrapper[4766]: I1002 11:28:06.111856 4766 scope.go:117] "RemoveContainer" containerID="f5e52b16d693a52381e64ec8d2fb4dfa40adb4ba1bca4a700370387afdc3bd95" Oct 02 11:28:06 crc kubenswrapper[4766]: I1002 11:28:06.137661 4766 scope.go:117] "RemoveContainer" containerID="d196c10b878fa968e0a2e2e43fab211af28592fee0f670c31f6099ec848fc31e" Oct 02 11:28:06 crc kubenswrapper[4766]: E1002 11:28:06.138260 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d196c10b878fa968e0a2e2e43fab211af28592fee0f670c31f6099ec848fc31e\": container with ID starting with d196c10b878fa968e0a2e2e43fab211af28592fee0f670c31f6099ec848fc31e not found: ID does not exist" containerID="d196c10b878fa968e0a2e2e43fab211af28592fee0f670c31f6099ec848fc31e" Oct 02 11:28:06 crc kubenswrapper[4766]: I1002 11:28:06.138296 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d196c10b878fa968e0a2e2e43fab211af28592fee0f670c31f6099ec848fc31e"} err="failed to get container status \"d196c10b878fa968e0a2e2e43fab211af28592fee0f670c31f6099ec848fc31e\": rpc error: code = NotFound desc = could not find container \"d196c10b878fa968e0a2e2e43fab211af28592fee0f670c31f6099ec848fc31e\": container with ID starting with d196c10b878fa968e0a2e2e43fab211af28592fee0f670c31f6099ec848fc31e not found: ID does not exist" Oct 02 11:28:06 crc kubenswrapper[4766]: I1002 11:28:06.138320 4766 scope.go:117] "RemoveContainer" containerID="7a7b6253e7cff105168cbf1816b7b3b1ba8da8521311d0e676646119189ac1f3" Oct 02 11:28:06 crc kubenswrapper[4766]: E1002 11:28:06.138689 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a7b6253e7cff105168cbf1816b7b3b1ba8da8521311d0e676646119189ac1f3\": container with ID starting with 7a7b6253e7cff105168cbf1816b7b3b1ba8da8521311d0e676646119189ac1f3 not found: ID does not exist" containerID="7a7b6253e7cff105168cbf1816b7b3b1ba8da8521311d0e676646119189ac1f3" Oct 02 11:28:06 crc kubenswrapper[4766]: I1002 11:28:06.138715 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a7b6253e7cff105168cbf1816b7b3b1ba8da8521311d0e676646119189ac1f3"} err="failed to get container status \"7a7b6253e7cff105168cbf1816b7b3b1ba8da8521311d0e676646119189ac1f3\": rpc error: code = NotFound desc = could not find container \"7a7b6253e7cff105168cbf1816b7b3b1ba8da8521311d0e676646119189ac1f3\": container with ID starting with 7a7b6253e7cff105168cbf1816b7b3b1ba8da8521311d0e676646119189ac1f3 not found: ID does not exist" Oct 02 11:28:06 crc kubenswrapper[4766]: I1002 11:28:06.138731 4766 scope.go:117] "RemoveContainer" containerID="f5e52b16d693a52381e64ec8d2fb4dfa40adb4ba1bca4a700370387afdc3bd95" Oct 02 11:28:06 crc kubenswrapper[4766]: E1002 11:28:06.139416 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5e52b16d693a52381e64ec8d2fb4dfa40adb4ba1bca4a700370387afdc3bd95\": container with ID starting with f5e52b16d693a52381e64ec8d2fb4dfa40adb4ba1bca4a700370387afdc3bd95 not found: ID does not exist" containerID="f5e52b16d693a52381e64ec8d2fb4dfa40adb4ba1bca4a700370387afdc3bd95" Oct 02 11:28:06 crc kubenswrapper[4766]: I1002 11:28:06.139580 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5e52b16d693a52381e64ec8d2fb4dfa40adb4ba1bca4a700370387afdc3bd95"} err="failed to get container status \"f5e52b16d693a52381e64ec8d2fb4dfa40adb4ba1bca4a700370387afdc3bd95\": rpc error: code = NotFound desc = could not find container \"f5e52b16d693a52381e64ec8d2fb4dfa40adb4ba1bca4a700370387afdc3bd95\": container with ID starting with f5e52b16d693a52381e64ec8d2fb4dfa40adb4ba1bca4a700370387afdc3bd95 not found: ID does not exist" Oct 02 11:28:07 crc kubenswrapper[4766]: I1002 11:28:07.905683 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f460d95-be9a-48f9-a37e-7d81697a0b90" path="/var/lib/kubelet/pods/8f460d95-be9a-48f9-a37e-7d81697a0b90/volumes" Oct 02 11:28:24 crc kubenswrapper[4766]: I1002 11:28:24.431900 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:28:24 crc kubenswrapper[4766]: I1002 11:28:24.432898 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:28:24 crc kubenswrapper[4766]: I1002 11:28:24.432974 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 11:28:24 crc kubenswrapper[4766]: I1002 11:28:24.433763 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:28:24 crc kubenswrapper[4766]: I1002 11:28:24.433840 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" gracePeriod=600 Oct 02 11:28:24 crc kubenswrapper[4766]: E1002 11:28:24.576584 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:28:25 crc kubenswrapper[4766]: I1002 11:28:25.217853 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" exitCode=0 Oct 02 11:28:25 crc kubenswrapper[4766]: I1002 11:28:25.217918 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9"} Oct 02 11:28:25 crc kubenswrapper[4766]: I1002 11:28:25.218231 4766 scope.go:117] "RemoveContainer" containerID="ad849014129d0cceee0f98aefb0b7ee04dec448811308b28f18e68707adcd334" Oct 02 11:28:25 crc kubenswrapper[4766]: I1002 11:28:25.218905 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:28:25 crc kubenswrapper[4766]: E1002 11:28:25.219296 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:28:37 crc kubenswrapper[4766]: I1002 11:28:37.881825 4766 scope.go:117] "RemoveContainer" 
containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:28:37 crc kubenswrapper[4766]: E1002 11:28:37.882640 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:28:52 crc kubenswrapper[4766]: I1002 11:28:52.882277 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:28:52 crc kubenswrapper[4766]: E1002 11:28:52.883068 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:29:07 crc kubenswrapper[4766]: I1002 11:29:07.882223 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:29:07 crc kubenswrapper[4766]: E1002 11:29:07.883053 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:29:12 crc kubenswrapper[4766]: I1002 11:29:12.730408 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-stpbl"] Oct 02 11:29:12 crc kubenswrapper[4766]: E1002 11:29:12.731555 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f460d95-be9a-48f9-a37e-7d81697a0b90" containerName="extract-content" Oct 02 11:29:12 crc kubenswrapper[4766]: I1002 11:29:12.731579 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f460d95-be9a-48f9-a37e-7d81697a0b90" containerName="extract-content" Oct 02 11:29:12 crc kubenswrapper[4766]: E1002 11:29:12.731597 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd9cbbd-a1ef-4cda-afd9-c2287891c199" containerName="extract-content" Oct 02 11:29:12 crc kubenswrapper[4766]: I1002 11:29:12.731605 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd9cbbd-a1ef-4cda-afd9-c2287891c199" containerName="extract-content" Oct 02 11:29:12 crc kubenswrapper[4766]: E1002 11:29:12.731630 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f460d95-be9a-48f9-a37e-7d81697a0b90" containerName="extract-utilities" Oct 02 11:29:12 crc kubenswrapper[4766]: I1002 11:29:12.731642 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f460d95-be9a-48f9-a37e-7d81697a0b90" containerName="extract-utilities" Oct 02 11:29:12 crc kubenswrapper[4766]: E1002 11:29:12.731677 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f460d95-be9a-48f9-a37e-7d81697a0b90" containerName="registry-server" Oct 02 11:29:12 crc kubenswrapper[4766]: I1002 11:29:12.731685 4766 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8f460d95-be9a-48f9-a37e-7d81697a0b90" containerName="registry-server" Oct 02 11:29:12 crc kubenswrapper[4766]: E1002 11:29:12.731705 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd9cbbd-a1ef-4cda-afd9-c2287891c199" containerName="extract-utilities" Oct 02 11:29:12 crc kubenswrapper[4766]: I1002 11:29:12.731712 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd9cbbd-a1ef-4cda-afd9-c2287891c199" containerName="extract-utilities" Oct 02 11:29:12 crc kubenswrapper[4766]: E1002 11:29:12.731726 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd9cbbd-a1ef-4cda-afd9-c2287891c199" containerName="registry-server" Oct 02 11:29:12 crc kubenswrapper[4766]: I1002 11:29:12.731735 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd9cbbd-a1ef-4cda-afd9-c2287891c199" containerName="registry-server" Oct 02 11:29:12 crc kubenswrapper[4766]: I1002 11:29:12.731943 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fd9cbbd-a1ef-4cda-afd9-c2287891c199" containerName="registry-server" Oct 02 11:29:12 crc kubenswrapper[4766]: I1002 11:29:12.731968 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f460d95-be9a-48f9-a37e-7d81697a0b90" containerName="registry-server" Oct 02 11:29:12 crc kubenswrapper[4766]: I1002 11:29:12.733463 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-stpbl" Oct 02 11:29:12 crc kubenswrapper[4766]: I1002 11:29:12.737595 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-stpbl"] Oct 02 11:29:12 crc kubenswrapper[4766]: I1002 11:29:12.925160 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a027f51b-394d-4112-b3dd-8d167ff257c4-utilities\") pod \"community-operators-stpbl\" (UID: \"a027f51b-394d-4112-b3dd-8d167ff257c4\") " pod="openshift-marketplace/community-operators-stpbl" Oct 02 11:29:12 crc kubenswrapper[4766]: I1002 11:29:12.925228 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a027f51b-394d-4112-b3dd-8d167ff257c4-catalog-content\") pod \"community-operators-stpbl\" (UID: \"a027f51b-394d-4112-b3dd-8d167ff257c4\") " pod="openshift-marketplace/community-operators-stpbl" Oct 02 11:29:12 crc kubenswrapper[4766]: I1002 11:29:12.925270 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2qhb\" (UniqueName: \"kubernetes.io/projected/a027f51b-394d-4112-b3dd-8d167ff257c4-kube-api-access-k2qhb\") pod \"community-operators-stpbl\" (UID: \"a027f51b-394d-4112-b3dd-8d167ff257c4\") " pod="openshift-marketplace/community-operators-stpbl" Oct 02 11:29:13 crc kubenswrapper[4766]: I1002 11:29:13.026612 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a027f51b-394d-4112-b3dd-8d167ff257c4-utilities\") pod \"community-operators-stpbl\" (UID: \"a027f51b-394d-4112-b3dd-8d167ff257c4\") " pod="openshift-marketplace/community-operators-stpbl" Oct 02 11:29:13 crc kubenswrapper[4766]: I1002 11:29:13.026739 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a027f51b-394d-4112-b3dd-8d167ff257c4-catalog-content\") pod \"community-operators-stpbl\" (UID: \"a027f51b-394d-4112-b3dd-8d167ff257c4\") " pod="openshift-marketplace/community-operators-stpbl" Oct 02 11:29:13 crc kubenswrapper[4766]: I1002 11:29:13.026770 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2qhb\" (UniqueName: \"kubernetes.io/projected/a027f51b-394d-4112-b3dd-8d167ff257c4-kube-api-access-k2qhb\") pod \"community-operators-stpbl\" (UID: \"a027f51b-394d-4112-b3dd-8d167ff257c4\") " pod="openshift-marketplace/community-operators-stpbl" Oct 02 11:29:13 crc kubenswrapper[4766]: I1002 11:29:13.027263 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a027f51b-394d-4112-b3dd-8d167ff257c4-utilities\") pod \"community-operators-stpbl\" (UID: \"a027f51b-394d-4112-b3dd-8d167ff257c4\") " pod="openshift-marketplace/community-operators-stpbl" Oct 02 11:29:13 crc kubenswrapper[4766]: I1002 11:29:13.027306 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a027f51b-394d-4112-b3dd-8d167ff257c4-catalog-content\") pod \"community-operators-stpbl\" (UID: \"a027f51b-394d-4112-b3dd-8d167ff257c4\") " pod="openshift-marketplace/community-operators-stpbl" Oct 02 11:29:13 crc kubenswrapper[4766]: I1002 11:29:13.059391 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2qhb\" (UniqueName: \"kubernetes.io/projected/a027f51b-394d-4112-b3dd-8d167ff257c4-kube-api-access-k2qhb\") pod \"community-operators-stpbl\" (UID: \"a027f51b-394d-4112-b3dd-8d167ff257c4\") " pod="openshift-marketplace/community-operators-stpbl" Oct 02 11:29:13 crc kubenswrapper[4766]: I1002 11:29:13.356015 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-stpbl" Oct 02 11:29:13 crc kubenswrapper[4766]: I1002 11:29:13.751021 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-stpbl"] Oct 02 11:29:14 crc kubenswrapper[4766]: I1002 11:29:14.561669 4766 generic.go:334] "Generic (PLEG): container finished" podID="a027f51b-394d-4112-b3dd-8d167ff257c4" containerID="211f9d0f389c0cd3e4620da77475fb47146ee0e99d1256bbfd653dba0927dd1d" exitCode=0 Oct 02 11:29:14 crc kubenswrapper[4766]: I1002 11:29:14.561803 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stpbl" event={"ID":"a027f51b-394d-4112-b3dd-8d167ff257c4","Type":"ContainerDied","Data":"211f9d0f389c0cd3e4620da77475fb47146ee0e99d1256bbfd653dba0927dd1d"} Oct 02 11:29:14 crc kubenswrapper[4766]: I1002 11:29:14.562387 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stpbl" event={"ID":"a027f51b-394d-4112-b3dd-8d167ff257c4","Type":"ContainerStarted","Data":"fbb5ab4a77770f5a1ec6a9cd24d88bd0dc501864de33b4cb5ce5f75535a6fe6e"} Oct 02 11:29:16 crc kubenswrapper[4766]: I1002 11:29:16.577213 4766 generic.go:334] "Generic (PLEG): container finished" podID="a027f51b-394d-4112-b3dd-8d167ff257c4" containerID="a460fb95049d4176d06696989ad2b2405ae0cc47486738088cb538e13156fc5c" exitCode=0 Oct 02 11:29:16 crc kubenswrapper[4766]: I1002 11:29:16.577274 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stpbl" event={"ID":"a027f51b-394d-4112-b3dd-8d167ff257c4","Type":"ContainerDied","Data":"a460fb95049d4176d06696989ad2b2405ae0cc47486738088cb538e13156fc5c"} Oct 02 11:29:17 crc kubenswrapper[4766]: I1002 11:29:17.588850 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stpbl" event={"ID":"a027f51b-394d-4112-b3dd-8d167ff257c4","Type":"ContainerStarted","Data":"e72c7dcd4445f5a8890dc4e6e114933aefe1ad1addd03221c93ff1fb3d0ae70a"} Oct 02 11:29:17 crc kubenswrapper[4766]: I1002 11:29:17.607661 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-stpbl" podStartSLOduration=3.107811629 podStartE2EDuration="5.607638424s" podCreationTimestamp="2025-10-02 11:29:12 +0000 UTC" firstStartedPulling="2025-10-02 11:29:14.56466037 +0000 UTC m=+2269.507531314" lastFinishedPulling="2025-10-02 11:29:17.064487165 +0000 UTC m=+2272.007358109" observedRunningTime="2025-10-02 11:29:17.604781402 +0000 UTC m=+2272.547652366" watchObservedRunningTime="2025-10-02 11:29:17.607638424 +0000 UTC m=+2272.550509368" Oct 02 11:29:18 crc kubenswrapper[4766]: I1002 11:29:18.881934 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:29:18 crc kubenswrapper[4766]: E1002 11:29:18.882285 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:29:23 crc kubenswrapper[4766]: I1002 11:29:23.356647 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-stpbl" 
Oct 02 11:29:23 crc kubenswrapper[4766]: I1002 11:29:23.357358 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-stpbl" Oct 02 11:29:23 crc kubenswrapper[4766]: I1002 11:29:23.406310 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-stpbl" Oct 02 11:29:23 crc kubenswrapper[4766]: I1002 11:29:23.666836 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-stpbl" Oct 02 11:29:23 crc kubenswrapper[4766]: I1002 11:29:23.707575 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-stpbl"] Oct 02 11:29:25 crc kubenswrapper[4766]: I1002 11:29:25.639179 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-stpbl" podUID="a027f51b-394d-4112-b3dd-8d167ff257c4" containerName="registry-server" containerID="cri-o://e72c7dcd4445f5a8890dc4e6e114933aefe1ad1addd03221c93ff1fb3d0ae70a" gracePeriod=2 Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.012710 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-stpbl" Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.031188 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a027f51b-394d-4112-b3dd-8d167ff257c4-catalog-content\") pod \"a027f51b-394d-4112-b3dd-8d167ff257c4\" (UID: \"a027f51b-394d-4112-b3dd-8d167ff257c4\") " Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.031280 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a027f51b-394d-4112-b3dd-8d167ff257c4-utilities\") pod \"a027f51b-394d-4112-b3dd-8d167ff257c4\" (UID: \"a027f51b-394d-4112-b3dd-8d167ff257c4\") " Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.031348 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2qhb\" (UniqueName: \"kubernetes.io/projected/a027f51b-394d-4112-b3dd-8d167ff257c4-kube-api-access-k2qhb\") pod \"a027f51b-394d-4112-b3dd-8d167ff257c4\" (UID: \"a027f51b-394d-4112-b3dd-8d167ff257c4\") " Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.032581 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a027f51b-394d-4112-b3dd-8d167ff257c4-utilities" (OuterVolumeSpecName: "utilities") pod "a027f51b-394d-4112-b3dd-8d167ff257c4" (UID: "a027f51b-394d-4112-b3dd-8d167ff257c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.032882 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a027f51b-394d-4112-b3dd-8d167ff257c4-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.039315 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a027f51b-394d-4112-b3dd-8d167ff257c4-kube-api-access-k2qhb" (OuterVolumeSpecName: "kube-api-access-k2qhb") pod "a027f51b-394d-4112-b3dd-8d167ff257c4" (UID: "a027f51b-394d-4112-b3dd-8d167ff257c4"). InnerVolumeSpecName "kube-api-access-k2qhb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.134436 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2qhb\" (UniqueName: \"kubernetes.io/projected/a027f51b-394d-4112-b3dd-8d167ff257c4-kube-api-access-k2qhb\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.403826 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a027f51b-394d-4112-b3dd-8d167ff257c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a027f51b-394d-4112-b3dd-8d167ff257c4" (UID: "a027f51b-394d-4112-b3dd-8d167ff257c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.438345 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a027f51b-394d-4112-b3dd-8d167ff257c4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.649464 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stpbl" event={"ID":"a027f51b-394d-4112-b3dd-8d167ff257c4","Type":"ContainerDied","Data":"e72c7dcd4445f5a8890dc4e6e114933aefe1ad1addd03221c93ff1fb3d0ae70a"} Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.649560 4766 scope.go:117] "RemoveContainer" containerID="e72c7dcd4445f5a8890dc4e6e114933aefe1ad1addd03221c93ff1fb3d0ae70a" Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.649476 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-stpbl" Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.649408 4766 generic.go:334] "Generic (PLEG): container finished" podID="a027f51b-394d-4112-b3dd-8d167ff257c4" containerID="e72c7dcd4445f5a8890dc4e6e114933aefe1ad1addd03221c93ff1fb3d0ae70a" exitCode=0 Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.649808 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-stpbl" event={"ID":"a027f51b-394d-4112-b3dd-8d167ff257c4","Type":"ContainerDied","Data":"fbb5ab4a77770f5a1ec6a9cd24d88bd0dc501864de33b4cb5ce5f75535a6fe6e"} Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.670061 4766 scope.go:117] "RemoveContainer" containerID="a460fb95049d4176d06696989ad2b2405ae0cc47486738088cb538e13156fc5c" Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.684544 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-stpbl"] Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.693920 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-stpbl"] Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.701878 4766 scope.go:117] "RemoveContainer" containerID="211f9d0f389c0cd3e4620da77475fb47146ee0e99d1256bbfd653dba0927dd1d" Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.720313 4766 scope.go:117] "RemoveContainer" containerID="e72c7dcd4445f5a8890dc4e6e114933aefe1ad1addd03221c93ff1fb3d0ae70a" Oct 02 11:29:26 crc kubenswrapper[4766]: E1002 11:29:26.720778 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e72c7dcd4445f5a8890dc4e6e114933aefe1ad1addd03221c93ff1fb3d0ae70a\": container with ID starting with 
e72c7dcd4445f5a8890dc4e6e114933aefe1ad1addd03221c93ff1fb3d0ae70a not found: ID does not exist" containerID="e72c7dcd4445f5a8890dc4e6e114933aefe1ad1addd03221c93ff1fb3d0ae70a" Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.720827 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e72c7dcd4445f5a8890dc4e6e114933aefe1ad1addd03221c93ff1fb3d0ae70a"} err="failed to get container status \"e72c7dcd4445f5a8890dc4e6e114933aefe1ad1addd03221c93ff1fb3d0ae70a\": rpc error: code = NotFound desc = could not find container \"e72c7dcd4445f5a8890dc4e6e114933aefe1ad1addd03221c93ff1fb3d0ae70a\": container with ID starting with e72c7dcd4445f5a8890dc4e6e114933aefe1ad1addd03221c93ff1fb3d0ae70a not found: ID does not exist" Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.720863 4766 scope.go:117] "RemoveContainer" containerID="a460fb95049d4176d06696989ad2b2405ae0cc47486738088cb538e13156fc5c" Oct 02 11:29:26 crc kubenswrapper[4766]: E1002 11:29:26.721189 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a460fb95049d4176d06696989ad2b2405ae0cc47486738088cb538e13156fc5c\": container with ID starting with a460fb95049d4176d06696989ad2b2405ae0cc47486738088cb538e13156fc5c not found: ID does not exist" containerID="a460fb95049d4176d06696989ad2b2405ae0cc47486738088cb538e13156fc5c" Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.721237 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a460fb95049d4176d06696989ad2b2405ae0cc47486738088cb538e13156fc5c"} err="failed to get container status \"a460fb95049d4176d06696989ad2b2405ae0cc47486738088cb538e13156fc5c\": rpc error: code = NotFound desc = could not find container \"a460fb95049d4176d06696989ad2b2405ae0cc47486738088cb538e13156fc5c\": container with ID starting with a460fb95049d4176d06696989ad2b2405ae0cc47486738088cb538e13156fc5c not found: ID does not exist" Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.721272 4766 scope.go:117] "RemoveContainer" containerID="211f9d0f389c0cd3e4620da77475fb47146ee0e99d1256bbfd653dba0927dd1d" Oct 02 11:29:26 crc kubenswrapper[4766]: E1002 11:29:26.721576 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"211f9d0f389c0cd3e4620da77475fb47146ee0e99d1256bbfd653dba0927dd1d\": container with ID starting with 211f9d0f389c0cd3e4620da77475fb47146ee0e99d1256bbfd653dba0927dd1d not found: ID does not exist" containerID="211f9d0f389c0cd3e4620da77475fb47146ee0e99d1256bbfd653dba0927dd1d" Oct 02 11:29:26 crc kubenswrapper[4766]: I1002 11:29:26.721607 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211f9d0f389c0cd3e4620da77475fb47146ee0e99d1256bbfd653dba0927dd1d"} err="failed to get container status \"211f9d0f389c0cd3e4620da77475fb47146ee0e99d1256bbfd653dba0927dd1d\": rpc error: code = NotFound desc = could not find container \"211f9d0f389c0cd3e4620da77475fb47146ee0e99d1256bbfd653dba0927dd1d\": container with ID starting with 211f9d0f389c0cd3e4620da77475fb47146ee0e99d1256bbfd653dba0927dd1d not found: ID does not exist" Oct 02 11:29:27 crc kubenswrapper[4766]: I1002 11:29:27.892929 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a027f51b-394d-4112-b3dd-8d167ff257c4" path="/var/lib/kubelet/pods/a027f51b-394d-4112-b3dd-8d167ff257c4/volumes" Oct 02 11:29:33 crc kubenswrapper[4766]: I1002 11:29:33.881297 
4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:29:33 crc kubenswrapper[4766]: E1002 11:29:33.882011 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:29:45 crc kubenswrapper[4766]: I1002 11:29:45.884493 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:29:45 crc kubenswrapper[4766]: E1002 11:29:45.885000 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:29:58 crc kubenswrapper[4766]: I1002 11:29:58.881103 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:29:58 crc kubenswrapper[4766]: E1002 11:29:58.882080 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.143563 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76"] Oct 02 11:30:00 crc kubenswrapper[4766]: E1002 11:30:00.143980 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a027f51b-394d-4112-b3dd-8d167ff257c4" containerName="extract-content" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.143997 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a027f51b-394d-4112-b3dd-8d167ff257c4" containerName="extract-content" Oct 02 11:30:00 crc kubenswrapper[4766]: E1002 11:30:00.144017 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a027f51b-394d-4112-b3dd-8d167ff257c4" containerName="registry-server" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.144023 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a027f51b-394d-4112-b3dd-8d167ff257c4" containerName="registry-server" Oct 02 11:30:00 crc kubenswrapper[4766]: E1002 11:30:00.144032 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a027f51b-394d-4112-b3dd-8d167ff257c4" containerName="extract-utilities" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.144039 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a027f51b-394d-4112-b3dd-8d167ff257c4" containerName="extract-utilities" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.144172 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a027f51b-394d-4112-b3dd-8d167ff257c4" containerName="registry-server" Oct 02 
11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.144721 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.147495 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.148260 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.150399 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76"] Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.296425 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49b32761-088a-4ccb-a645-a82ceee34ab8-secret-volume\") pod \"collect-profiles-29323410-k7s76\" (UID: \"49b32761-088a-4ccb-a645-a82ceee34ab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.296600 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49b32761-088a-4ccb-a645-a82ceee34ab8-config-volume\") pod \"collect-profiles-29323410-k7s76\" (UID: \"49b32761-088a-4ccb-a645-a82ceee34ab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.296658 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hsrx\" (UniqueName: \"kubernetes.io/projected/49b32761-088a-4ccb-a645-a82ceee34ab8-kube-api-access-9hsrx\") pod \"collect-profiles-29323410-k7s76\" (UID: \"49b32761-088a-4ccb-a645-a82ceee34ab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.397690 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hsrx\" (UniqueName: \"kubernetes.io/projected/49b32761-088a-4ccb-a645-a82ceee34ab8-kube-api-access-9hsrx\") pod \"collect-profiles-29323410-k7s76\" (UID: \"49b32761-088a-4ccb-a645-a82ceee34ab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.397785 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49b32761-088a-4ccb-a645-a82ceee34ab8-secret-volume\") pod \"collect-profiles-29323410-k7s76\" (UID: \"49b32761-088a-4ccb-a645-a82ceee34ab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.397843 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49b32761-088a-4ccb-a645-a82ceee34ab8-config-volume\") pod \"collect-profiles-29323410-k7s76\" (UID: \"49b32761-088a-4ccb-a645-a82ceee34ab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.398708 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/49b32761-088a-4ccb-a645-a82ceee34ab8-config-volume\") pod \"collect-profiles-29323410-k7s76\" (UID: \"49b32761-088a-4ccb-a645-a82ceee34ab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.404625 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49b32761-088a-4ccb-a645-a82ceee34ab8-secret-volume\") pod \"collect-profiles-29323410-k7s76\" (UID: \"49b32761-088a-4ccb-a645-a82ceee34ab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.416914 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hsrx\" (UniqueName: \"kubernetes.io/projected/49b32761-088a-4ccb-a645-a82ceee34ab8-kube-api-access-9hsrx\") pod \"collect-profiles-29323410-k7s76\" (UID: \"49b32761-088a-4ccb-a645-a82ceee34ab8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.463049 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76" Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.881808 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76"] Oct 02 11:30:00 crc kubenswrapper[4766]: I1002 11:30:00.921888 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76" event={"ID":"49b32761-088a-4ccb-a645-a82ceee34ab8","Type":"ContainerStarted","Data":"f9b1df208716080cd6a8a1f4c43b506afe610d6f6ad60b12feeefa69aeb729d1"} Oct 02 11:30:01 crc kubenswrapper[4766]: I1002 11:30:01.929628 4766 generic.go:334] "Generic (PLEG): container finished" podID="49b32761-088a-4ccb-a645-a82ceee34ab8" containerID="329cc54c30f8c86ef0d1ab0d3d7620c91f969a6d89f4ce489cd30d755779b9bd" exitCode=0 Oct 02 11:30:01 crc kubenswrapper[4766]: I1002 11:30:01.929736 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76" event={"ID":"49b32761-088a-4ccb-a645-a82ceee34ab8","Type":"ContainerDied","Data":"329cc54c30f8c86ef0d1ab0d3d7620c91f969a6d89f4ce489cd30d755779b9bd"} Oct 02 11:30:03 crc kubenswrapper[4766]: I1002 11:30:03.194063 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76" Oct 02 11:30:03 crc kubenswrapper[4766]: I1002 11:30:03.337803 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hsrx\" (UniqueName: \"kubernetes.io/projected/49b32761-088a-4ccb-a645-a82ceee34ab8-kube-api-access-9hsrx\") pod \"49b32761-088a-4ccb-a645-a82ceee34ab8\" (UID: \"49b32761-088a-4ccb-a645-a82ceee34ab8\") " Oct 02 11:30:03 crc kubenswrapper[4766]: I1002 11:30:03.337881 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49b32761-088a-4ccb-a645-a82ceee34ab8-config-volume\") pod \"49b32761-088a-4ccb-a645-a82ceee34ab8\" (UID: \"49b32761-088a-4ccb-a645-a82ceee34ab8\") " Oct 02 11:30:03 crc kubenswrapper[4766]: I1002 11:30:03.338067 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49b32761-088a-4ccb-a645-a82ceee34ab8-secret-volume\") pod \"49b32761-088a-4ccb-a645-a82ceee34ab8\" (UID: \"49b32761-088a-4ccb-a645-a82ceee34ab8\") " Oct 02 11:30:03 crc kubenswrapper[4766]: I1002 11:30:03.339034 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49b32761-088a-4ccb-a645-a82ceee34ab8-config-volume" (OuterVolumeSpecName: "config-volume") pod "49b32761-088a-4ccb-a645-a82ceee34ab8" (UID: "49b32761-088a-4ccb-a645-a82ceee34ab8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:30:03 crc kubenswrapper[4766]: I1002 11:30:03.344172 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49b32761-088a-4ccb-a645-a82ceee34ab8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "49b32761-088a-4ccb-a645-a82ceee34ab8" (UID: "49b32761-088a-4ccb-a645-a82ceee34ab8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:30:03 crc kubenswrapper[4766]: I1002 11:30:03.344597 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b32761-088a-4ccb-a645-a82ceee34ab8-kube-api-access-9hsrx" (OuterVolumeSpecName: "kube-api-access-9hsrx") pod "49b32761-088a-4ccb-a645-a82ceee34ab8" (UID: "49b32761-088a-4ccb-a645-a82ceee34ab8"). InnerVolumeSpecName "kube-api-access-9hsrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:30:03 crc kubenswrapper[4766]: I1002 11:30:03.439375 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hsrx\" (UniqueName: \"kubernetes.io/projected/49b32761-088a-4ccb-a645-a82ceee34ab8-kube-api-access-9hsrx\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:03 crc kubenswrapper[4766]: I1002 11:30:03.439422 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49b32761-088a-4ccb-a645-a82ceee34ab8-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:03 crc kubenswrapper[4766]: I1002 11:30:03.439434 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49b32761-088a-4ccb-a645-a82ceee34ab8-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:30:03 crc kubenswrapper[4766]: I1002 11:30:03.945905 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76" event={"ID":"49b32761-088a-4ccb-a645-a82ceee34ab8","Type":"ContainerDied","Data":"f9b1df208716080cd6a8a1f4c43b506afe610d6f6ad60b12feeefa69aeb729d1"} Oct 02 11:30:03 crc kubenswrapper[4766]: I1002 11:30:03.946255 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9b1df208716080cd6a8a1f4c43b506afe610d6f6ad60b12feeefa69aeb729d1" Oct 02 11:30:03 crc kubenswrapper[4766]: I1002 11:30:03.945948 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76" Oct 02 11:30:04 crc kubenswrapper[4766]: I1002 11:30:04.271730 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns"] Oct 02 11:30:04 crc kubenswrapper[4766]: I1002 11:30:04.276296 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323365-wmmns"] Oct 02 11:30:05 crc kubenswrapper[4766]: I1002 11:30:05.893488 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac" path="/var/lib/kubelet/pods/67792bcf-c6bb-45ba-b0b6-e7bb3c0276ac/volumes" Oct 02 11:30:11 crc kubenswrapper[4766]: I1002 11:30:11.881025 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:30:11 crc kubenswrapper[4766]: E1002 11:30:11.881523 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:30:24 crc kubenswrapper[4766]: I1002 11:30:24.881191 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:30:24 crc kubenswrapper[4766]: E1002 11:30:24.883774 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:30:33 crc kubenswrapper[4766]: I1002 11:30:33.685072 4766 scope.go:117] "RemoveContainer" containerID="b4f92a7af89f82948375a99c653fe3353833585c19922ad38b613636d7b14f50" Oct 02 11:30:36 crc kubenswrapper[4766]: I1002 11:30:36.881550 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:30:36 crc kubenswrapper[4766]: E1002 11:30:36.882087 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:30:48 crc kubenswrapper[4766]: I1002 11:30:48.882195 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:30:48 crc kubenswrapper[4766]: E1002 11:30:48.882938 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:31:02 crc kubenswrapper[4766]: I1002 11:31:02.881393 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:31:02 crc kubenswrapper[4766]: E1002 11:31:02.882162 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:31:15 crc kubenswrapper[4766]: I1002 11:31:15.886308 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:31:15 crc kubenswrapper[4766]: E1002 11:31:15.887341 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:31:30 crc kubenswrapper[4766]: I1002 11:31:30.882025 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:31:30 crc kubenswrapper[4766]: E1002 11:31:30.883393 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:31:44 crc kubenswrapper[4766]: I1002 11:31:44.881114 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:31:44 crc kubenswrapper[4766]: E1002 11:31:44.881963 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:31:59 crc kubenswrapper[4766]: I1002 11:31:59.882360 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:31:59 crc kubenswrapper[4766]: E1002 11:31:59.883103 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:32:10 crc kubenswrapper[4766]: I1002 11:32:10.881960 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:32:10 crc kubenswrapper[4766]: E1002 11:32:10.884403 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:32:22 crc kubenswrapper[4766]: I1002 11:32:22.880841 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:32:22 crc kubenswrapper[4766]: E1002 11:32:22.881522 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:32:35 crc kubenswrapper[4766]: I1002 11:32:35.885486 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:32:35 crc kubenswrapper[4766]: E1002 11:32:35.887431 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:32:49 crc kubenswrapper[4766]: I1002 11:32:49.881640 4766 
scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:32:49 crc kubenswrapper[4766]: E1002 11:32:49.882496 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:33:02 crc kubenswrapper[4766]: I1002 11:33:02.880961 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:33:02 crc kubenswrapper[4766]: E1002 11:33:02.882406 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:33:14 crc kubenswrapper[4766]: I1002 11:33:14.881228 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:33:14 crc kubenswrapper[4766]: E1002 11:33:14.882091 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:33:25 crc kubenswrapper[4766]: I1002 11:33:25.886031 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:33:26 crc kubenswrapper[4766]: I1002 11:33:26.436670 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"bfc7260bc86efb2d05b7d466e4b476e211caa5c18fb8377de83db8532e635774"} Oct 02 11:35:54 crc kubenswrapper[4766]: I1002 11:35:54.432221 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:35:54 crc kubenswrapper[4766]: I1002 11:35:54.432837 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:36:24 crc kubenswrapper[4766]: I1002 11:36:24.432171 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 02 11:36:24 crc kubenswrapper[4766]: I1002 11:36:24.433695 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:36:54 crc kubenswrapper[4766]: I1002 11:36:54.432187 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:36:54 crc kubenswrapper[4766]: I1002 11:36:54.432791 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:36:54 crc kubenswrapper[4766]: I1002 11:36:54.432834 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 11:36:54 crc kubenswrapper[4766]: I1002 11:36:54.433447 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bfc7260bc86efb2d05b7d466e4b476e211caa5c18fb8377de83db8532e635774"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:36:54 crc kubenswrapper[4766]: I1002 11:36:54.433491 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://bfc7260bc86efb2d05b7d466e4b476e211caa5c18fb8377de83db8532e635774" gracePeriod=600 Oct 02 11:36:54 crc kubenswrapper[4766]: I1002 11:36:54.907579 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="bfc7260bc86efb2d05b7d466e4b476e211caa5c18fb8377de83db8532e635774" exitCode=0 Oct 02 11:36:54 crc kubenswrapper[4766]: I1002 11:36:54.907659 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"bfc7260bc86efb2d05b7d466e4b476e211caa5c18fb8377de83db8532e635774"} Oct 02 11:36:54 crc kubenswrapper[4766]: I1002 11:36:54.907973 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4"} Oct 02 11:36:54 crc kubenswrapper[4766]: I1002 11:36:54.908006 4766 scope.go:117] "RemoveContainer" containerID="196ff70e00f82fc73ac876cf2c9fb0d3e8ea54b0d982b3b2c0a69e4e854e65b9" Oct 02 11:38:54 crc kubenswrapper[4766]: I1002 11:38:54.431662 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:38:54 crc kubenswrapper[4766]: I1002 11:38:54.432264 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:38:56 crc kubenswrapper[4766]: I1002 11:38:56.660923 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ktpt8"] Oct 02 11:38:56 crc kubenswrapper[4766]: E1002 11:38:56.661259 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b32761-088a-4ccb-a645-a82ceee34ab8" containerName="collect-profiles" Oct 02 11:38:56 crc kubenswrapper[4766]: I1002 11:38:56.661277 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b32761-088a-4ccb-a645-a82ceee34ab8" containerName="collect-profiles" Oct 02 11:38:56 crc kubenswrapper[4766]: I1002 11:38:56.661459 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b32761-088a-4ccb-a645-a82ceee34ab8" containerName="collect-profiles" Oct 02 11:38:56 crc kubenswrapper[4766]: I1002 11:38:56.662559 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:38:56 crc kubenswrapper[4766]: I1002 11:38:56.713616 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ktpt8"] Oct 02 11:38:56 crc kubenswrapper[4766]: I1002 11:38:56.758913 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c680e9-e85a-40f5-90f3-6a48667133c4-catalog-content\") pod \"redhat-operators-ktpt8\" (UID: \"c8c680e9-e85a-40f5-90f3-6a48667133c4\") " pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:38:56 crc kubenswrapper[4766]: I1002 11:38:56.759100 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c680e9-e85a-40f5-90f3-6a48667133c4-utilities\") pod \"redhat-operators-ktpt8\" (UID: \"c8c680e9-e85a-40f5-90f3-6a48667133c4\") " pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:38:56 crc kubenswrapper[4766]: I1002 11:38:56.759168 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7sjb\" (UniqueName: \"kubernetes.io/projected/c8c680e9-e85a-40f5-90f3-6a48667133c4-kube-api-access-l7sjb\") pod \"redhat-operators-ktpt8\" (UID: \"c8c680e9-e85a-40f5-90f3-6a48667133c4\") " pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:38:56 crc kubenswrapper[4766]: I1002 11:38:56.860685 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c680e9-e85a-40f5-90f3-6a48667133c4-utilities\") pod \"redhat-operators-ktpt8\" (UID: \"c8c680e9-e85a-40f5-90f3-6a48667133c4\") " pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:38:56 crc kubenswrapper[4766]: I1002 11:38:56.860743 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7sjb\" (UniqueName: \"kubernetes.io/projected/c8c680e9-e85a-40f5-90f3-6a48667133c4-kube-api-access-l7sjb\") pod \"redhat-operators-ktpt8\" (UID: 
\"c8c680e9-e85a-40f5-90f3-6a48667133c4\") " pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:38:56 crc kubenswrapper[4766]: I1002 11:38:56.860776 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c680e9-e85a-40f5-90f3-6a48667133c4-catalog-content\") pod \"redhat-operators-ktpt8\" (UID: \"c8c680e9-e85a-40f5-90f3-6a48667133c4\") " pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:38:56 crc kubenswrapper[4766]: I1002 11:38:56.861290 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c680e9-e85a-40f5-90f3-6a48667133c4-utilities\") pod \"redhat-operators-ktpt8\" (UID: \"c8c680e9-e85a-40f5-90f3-6a48667133c4\") " pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:38:56 crc kubenswrapper[4766]: I1002 11:38:56.861327 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c680e9-e85a-40f5-90f3-6a48667133c4-catalog-content\") pod \"redhat-operators-ktpt8\" (UID: \"c8c680e9-e85a-40f5-90f3-6a48667133c4\") " pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:38:56 crc kubenswrapper[4766]: I1002 11:38:56.882833 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7sjb\" (UniqueName: \"kubernetes.io/projected/c8c680e9-e85a-40f5-90f3-6a48667133c4-kube-api-access-l7sjb\") pod \"redhat-operators-ktpt8\" (UID: \"c8c680e9-e85a-40f5-90f3-6a48667133c4\") " pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:38:56 crc kubenswrapper[4766]: I1002 11:38:56.981724 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:38:57 crc kubenswrapper[4766]: I1002 11:38:57.441940 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ktpt8"] Oct 02 11:38:57 crc kubenswrapper[4766]: I1002 11:38:57.756897 4766 generic.go:334] "Generic (PLEG): container finished" podID="c8c680e9-e85a-40f5-90f3-6a48667133c4" containerID="c7a53ec4af8d9e3ffd7133d54142ab60b55f151b17cceddf48733e0af669880c" exitCode=0 Oct 02 11:38:57 crc kubenswrapper[4766]: I1002 11:38:57.756937 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktpt8" event={"ID":"c8c680e9-e85a-40f5-90f3-6a48667133c4","Type":"ContainerDied","Data":"c7a53ec4af8d9e3ffd7133d54142ab60b55f151b17cceddf48733e0af669880c"} Oct 02 11:38:57 crc kubenswrapper[4766]: I1002 11:38:57.756974 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktpt8" event={"ID":"c8c680e9-e85a-40f5-90f3-6a48667133c4","Type":"ContainerStarted","Data":"efdae6d3d12dff70ef48eda23e39e100f09217e64d23940efaeb065e757cd3e9"} Oct 02 11:38:57 crc kubenswrapper[4766]: I1002 11:38:57.758699 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:38:59 crc kubenswrapper[4766]: I1002 11:38:59.773317 4766 generic.go:334] "Generic (PLEG): container finished" podID="c8c680e9-e85a-40f5-90f3-6a48667133c4" containerID="5e70db21bed588383d21ec3d5c935dcfe39c8505f5b8e534090db6c73bed15d3" exitCode=0 Oct 02 11:38:59 crc kubenswrapper[4766]: I1002 11:38:59.773460 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktpt8" 
event={"ID":"c8c680e9-e85a-40f5-90f3-6a48667133c4","Type":"ContainerDied","Data":"5e70db21bed588383d21ec3d5c935dcfe39c8505f5b8e534090db6c73bed15d3"} Oct 02 11:39:00 crc kubenswrapper[4766]: I1002 11:39:00.784554 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktpt8" event={"ID":"c8c680e9-e85a-40f5-90f3-6a48667133c4","Type":"ContainerStarted","Data":"f01c3d371f3821dd038db19ef493c7d0729222e00b60d27ec3ecafa0f28643b0"} Oct 02 11:39:00 crc kubenswrapper[4766]: I1002 11:39:00.812674 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ktpt8" podStartSLOduration=2.257975156 podStartE2EDuration="4.812651209s" podCreationTimestamp="2025-10-02 11:38:56 +0000 UTC" firstStartedPulling="2025-10-02 11:38:57.758370612 +0000 UTC m=+2852.701241566" lastFinishedPulling="2025-10-02 11:39:00.313046675 +0000 UTC m=+2855.255917619" observedRunningTime="2025-10-02 11:39:00.809304251 +0000 UTC m=+2855.752175205" watchObservedRunningTime="2025-10-02 11:39:00.812651209 +0000 UTC m=+2855.755522153" Oct 02 11:39:06 crc kubenswrapper[4766]: I1002 11:39:06.982143 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:39:06 crc kubenswrapper[4766]: I1002 11:39:06.982778 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:39:07 crc kubenswrapper[4766]: I1002 11:39:07.028475 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:39:07 crc kubenswrapper[4766]: I1002 11:39:07.891391 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:39:07 crc kubenswrapper[4766]: I1002 11:39:07.939647 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ktpt8"] Oct 02 11:39:09 crc kubenswrapper[4766]: I1002 11:39:09.850664 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ktpt8" podUID="c8c680e9-e85a-40f5-90f3-6a48667133c4" containerName="registry-server" containerID="cri-o://f01c3d371f3821dd038db19ef493c7d0729222e00b60d27ec3ecafa0f28643b0" gracePeriod=2 Oct 02 11:39:11 crc kubenswrapper[4766]: I1002 11:39:11.878798 4766 generic.go:334] "Generic (PLEG): container finished" podID="c8c680e9-e85a-40f5-90f3-6a48667133c4" containerID="f01c3d371f3821dd038db19ef493c7d0729222e00b60d27ec3ecafa0f28643b0" exitCode=0 Oct 02 11:39:11 crc kubenswrapper[4766]: I1002 11:39:11.878857 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktpt8" event={"ID":"c8c680e9-e85a-40f5-90f3-6a48667133c4","Type":"ContainerDied","Data":"f01c3d371f3821dd038db19ef493c7d0729222e00b60d27ec3ecafa0f28643b0"} Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.070477 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.170830 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7sjb\" (UniqueName: \"kubernetes.io/projected/c8c680e9-e85a-40f5-90f3-6a48667133c4-kube-api-access-l7sjb\") pod \"c8c680e9-e85a-40f5-90f3-6a48667133c4\" (UID: \"c8c680e9-e85a-40f5-90f3-6a48667133c4\") " Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.170947 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c680e9-e85a-40f5-90f3-6a48667133c4-utilities\") pod \"c8c680e9-e85a-40f5-90f3-6a48667133c4\" (UID: \"c8c680e9-e85a-40f5-90f3-6a48667133c4\") " Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.171074 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c680e9-e85a-40f5-90f3-6a48667133c4-catalog-content\") pod \"c8c680e9-e85a-40f5-90f3-6a48667133c4\" (UID: \"c8c680e9-e85a-40f5-90f3-6a48667133c4\") " Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.172605 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8c680e9-e85a-40f5-90f3-6a48667133c4-utilities" (OuterVolumeSpecName: "utilities") pod "c8c680e9-e85a-40f5-90f3-6a48667133c4" (UID: "c8c680e9-e85a-40f5-90f3-6a48667133c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.180060 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8c680e9-e85a-40f5-90f3-6a48667133c4-kube-api-access-l7sjb" (OuterVolumeSpecName: "kube-api-access-l7sjb") pod "c8c680e9-e85a-40f5-90f3-6a48667133c4" (UID: "c8c680e9-e85a-40f5-90f3-6a48667133c4"). InnerVolumeSpecName "kube-api-access-l7sjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.264065 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8c680e9-e85a-40f5-90f3-6a48667133c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8c680e9-e85a-40f5-90f3-6a48667133c4" (UID: "c8c680e9-e85a-40f5-90f3-6a48667133c4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.272986 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8c680e9-e85a-40f5-90f3-6a48667133c4-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.273035 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8c680e9-e85a-40f5-90f3-6a48667133c4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.273051 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7sjb\" (UniqueName: \"kubernetes.io/projected/c8c680e9-e85a-40f5-90f3-6a48667133c4-kube-api-access-l7sjb\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.886666 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktpt8" event={"ID":"c8c680e9-e85a-40f5-90f3-6a48667133c4","Type":"ContainerDied","Data":"efdae6d3d12dff70ef48eda23e39e100f09217e64d23940efaeb065e757cd3e9"} Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.886735 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ktpt8" Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.887000 4766 scope.go:117] "RemoveContainer" containerID="f01c3d371f3821dd038db19ef493c7d0729222e00b60d27ec3ecafa0f28643b0" Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.903382 4766 scope.go:117] "RemoveContainer" containerID="5e70db21bed588383d21ec3d5c935dcfe39c8505f5b8e534090db6c73bed15d3" Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.927744 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ktpt8"] Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.935478 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ktpt8"] Oct 02 11:39:12 crc kubenswrapper[4766]: I1002 11:39:12.943264 4766 scope.go:117] "RemoveContainer" containerID="c7a53ec4af8d9e3ffd7133d54142ab60b55f151b17cceddf48733e0af669880c" Oct 02 11:39:13 crc kubenswrapper[4766]: I1002 11:39:13.890328 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8c680e9-e85a-40f5-90f3-6a48667133c4" path="/var/lib/kubelet/pods/c8c680e9-e85a-40f5-90f3-6a48667133c4/volumes" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.040283 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9nmkc"] Oct 02 11:39:21 crc kubenswrapper[4766]: E1002 11:39:21.040882 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c680e9-e85a-40f5-90f3-6a48667133c4" containerName="registry-server" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.040894 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c680e9-e85a-40f5-90f3-6a48667133c4" containerName="registry-server" Oct 02 11:39:21 crc kubenswrapper[4766]: E1002 11:39:21.040901 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c680e9-e85a-40f5-90f3-6a48667133c4" containerName="extract-utilities" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.040908 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c680e9-e85a-40f5-90f3-6a48667133c4" containerName="extract-utilities" Oct 02 11:39:21 crc kubenswrapper[4766]: E1002 11:39:21.040926 4766 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c680e9-e85a-40f5-90f3-6a48667133c4" containerName="extract-content" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.040932 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c680e9-e85a-40f5-90f3-6a48667133c4" containerName="extract-content" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.041065 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8c680e9-e85a-40f5-90f3-6a48667133c4" containerName="registry-server" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.042078 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.058068 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9nmkc"] Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.194479 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7642p\" (UniqueName: \"kubernetes.io/projected/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-kube-api-access-7642p\") pod \"community-operators-9nmkc\" (UID: \"4ae15226-6e43-41f6-bc49-bf38cbc13fa4\") " pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.194563 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-catalog-content\") pod \"community-operators-9nmkc\" (UID: \"4ae15226-6e43-41f6-bc49-bf38cbc13fa4\") " pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.194586 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-utilities\") pod \"community-operators-9nmkc\" (UID: \"4ae15226-6e43-41f6-bc49-bf38cbc13fa4\") " pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.295796 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7642p\" (UniqueName: \"kubernetes.io/projected/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-kube-api-access-7642p\") pod \"community-operators-9nmkc\" (UID: \"4ae15226-6e43-41f6-bc49-bf38cbc13fa4\") " pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.295876 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-catalog-content\") pod \"community-operators-9nmkc\" (UID: \"4ae15226-6e43-41f6-bc49-bf38cbc13fa4\") " pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.295909 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-utilities\") pod \"community-operators-9nmkc\" (UID: \"4ae15226-6e43-41f6-bc49-bf38cbc13fa4\") " pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.296360 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-utilities\") pod \"community-operators-9nmkc\" (UID: \"4ae15226-6e43-41f6-bc49-bf38cbc13fa4\") " pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.296878 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-catalog-content\") pod \"community-operators-9nmkc\" (UID: \"4ae15226-6e43-41f6-bc49-bf38cbc13fa4\") " pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.332448 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7642p\" (UniqueName: \"kubernetes.io/projected/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-kube-api-access-7642p\") pod \"community-operators-9nmkc\" (UID: \"4ae15226-6e43-41f6-bc49-bf38cbc13fa4\") " pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.366330 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.891257 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9nmkc"] Oct 02 11:39:21 crc kubenswrapper[4766]: I1002 11:39:21.945517 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9nmkc" event={"ID":"4ae15226-6e43-41f6-bc49-bf38cbc13fa4","Type":"ContainerStarted","Data":"a23a7a9209f1a3babb7a9737565e1305ccdf5d55046ba875da32096a3ffb7989"} Oct 02 11:39:22 crc kubenswrapper[4766]: I1002 11:39:22.963880 4766 generic.go:334] "Generic (PLEG): container finished" podID="4ae15226-6e43-41f6-bc49-bf38cbc13fa4" containerID="85531d3ea1375eb0a5fc5d1aea22b2b15079d64b9ba8bbfc8ea08b076eae32b0" exitCode=0 Oct 02 11:39:22 crc kubenswrapper[4766]: I1002 11:39:22.965566 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9nmkc" event={"ID":"4ae15226-6e43-41f6-bc49-bf38cbc13fa4","Type":"ContainerDied","Data":"85531d3ea1375eb0a5fc5d1aea22b2b15079d64b9ba8bbfc8ea08b076eae32b0"} Oct 02 11:39:23 crc kubenswrapper[4766]: I1002 11:39:23.978036 4766 generic.go:334] "Generic (PLEG): container finished" podID="4ae15226-6e43-41f6-bc49-bf38cbc13fa4" containerID="0da873237a701e9838ba3839316f456b28f9748937fc775ff5c3b63bad874fd3" exitCode=0 Oct 02 11:39:23 crc kubenswrapper[4766]: I1002 11:39:23.978143 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9nmkc" event={"ID":"4ae15226-6e43-41f6-bc49-bf38cbc13fa4","Type":"ContainerDied","Data":"0da873237a701e9838ba3839316f456b28f9748937fc775ff5c3b63bad874fd3"} Oct 02 11:39:24 crc kubenswrapper[4766]: I1002 11:39:24.431968 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:39:24 crc kubenswrapper[4766]: I1002 11:39:24.432032 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 02 11:39:24 crc kubenswrapper[4766]: I1002 11:39:24.988394 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9nmkc" event={"ID":"4ae15226-6e43-41f6-bc49-bf38cbc13fa4","Type":"ContainerStarted","Data":"8b6951232682990db7f769ad4c3a2719cb410946a407a97e3d1e329bad75720f"} Oct 02 11:39:25 crc kubenswrapper[4766]: I1002 11:39:25.009351 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9nmkc" podStartSLOduration=2.542688899 podStartE2EDuration="4.009331378s" podCreationTimestamp="2025-10-02 11:39:21 +0000 UTC" firstStartedPulling="2025-10-02 11:39:22.970130366 +0000 UTC m=+2877.913001310" lastFinishedPulling="2025-10-02 11:39:24.436772845 +0000 UTC m=+2879.379643789" observedRunningTime="2025-10-02 11:39:25.007898981 +0000 UTC m=+2879.950769935" watchObservedRunningTime="2025-10-02 11:39:25.009331378 +0000 UTC m=+2879.952202322" Oct 02 11:39:31 crc kubenswrapper[4766]: I1002 11:39:31.367484 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:31 crc kubenswrapper[4766]: I1002 11:39:31.368563 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:31 crc kubenswrapper[4766]: I1002 11:39:31.412478 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:32 crc kubenswrapper[4766]: I1002 11:39:32.078843 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:32 crc kubenswrapper[4766]: I1002 11:39:32.123146 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9nmkc"] Oct 02 11:39:34 crc kubenswrapper[4766]: I1002 11:39:34.056396 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9nmkc" podUID="4ae15226-6e43-41f6-bc49-bf38cbc13fa4" containerName="registry-server" containerID="cri-o://8b6951232682990db7f769ad4c3a2719cb410946a407a97e3d1e329bad75720f" gracePeriod=2 Oct 02 11:39:34 crc kubenswrapper[4766]: I1002 11:39:34.453433 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:34 crc kubenswrapper[4766]: I1002 11:39:34.593100 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7642p\" (UniqueName: \"kubernetes.io/projected/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-kube-api-access-7642p\") pod \"4ae15226-6e43-41f6-bc49-bf38cbc13fa4\" (UID: \"4ae15226-6e43-41f6-bc49-bf38cbc13fa4\") " Oct 02 11:39:34 crc kubenswrapper[4766]: I1002 11:39:34.593230 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-catalog-content\") pod \"4ae15226-6e43-41f6-bc49-bf38cbc13fa4\" (UID: \"4ae15226-6e43-41f6-bc49-bf38cbc13fa4\") " Oct 02 11:39:34 crc kubenswrapper[4766]: I1002 11:39:34.593315 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-utilities\") pod \"4ae15226-6e43-41f6-bc49-bf38cbc13fa4\" (UID: \"4ae15226-6e43-41f6-bc49-bf38cbc13fa4\") " Oct 02 11:39:34 crc kubenswrapper[4766]: I1002 11:39:34.595836 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-utilities" (OuterVolumeSpecName: "utilities") pod "4ae15226-6e43-41f6-bc49-bf38cbc13fa4" (UID: "4ae15226-6e43-41f6-bc49-bf38cbc13fa4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:39:34 crc kubenswrapper[4766]: I1002 11:39:34.598034 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-kube-api-access-7642p" (OuterVolumeSpecName: "kube-api-access-7642p") pod "4ae15226-6e43-41f6-bc49-bf38cbc13fa4" (UID: "4ae15226-6e43-41f6-bc49-bf38cbc13fa4"). InnerVolumeSpecName "kube-api-access-7642p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:39:34 crc kubenswrapper[4766]: I1002 11:39:34.646114 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ae15226-6e43-41f6-bc49-bf38cbc13fa4" (UID: "4ae15226-6e43-41f6-bc49-bf38cbc13fa4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:39:34 crc kubenswrapper[4766]: I1002 11:39:34.695542 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:34 crc kubenswrapper[4766]: I1002 11:39:34.695617 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7642p\" (UniqueName: \"kubernetes.io/projected/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-kube-api-access-7642p\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:34 crc kubenswrapper[4766]: I1002 11:39:34.695629 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae15226-6e43-41f6-bc49-bf38cbc13fa4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:39:35 crc kubenswrapper[4766]: I1002 11:39:35.071663 4766 generic.go:334] "Generic (PLEG): container finished" podID="4ae15226-6e43-41f6-bc49-bf38cbc13fa4" containerID="8b6951232682990db7f769ad4c3a2719cb410946a407a97e3d1e329bad75720f" exitCode=0 Oct 02 11:39:35 crc kubenswrapper[4766]: I1002 11:39:35.072299 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9nmkc" event={"ID":"4ae15226-6e43-41f6-bc49-bf38cbc13fa4","Type":"ContainerDied","Data":"8b6951232682990db7f769ad4c3a2719cb410946a407a97e3d1e329bad75720f"} Oct 02 11:39:35 crc kubenswrapper[4766]: I1002 11:39:35.072349 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9nmkc" event={"ID":"4ae15226-6e43-41f6-bc49-bf38cbc13fa4","Type":"ContainerDied","Data":"a23a7a9209f1a3babb7a9737565e1305ccdf5d55046ba875da32096a3ffb7989"} Oct 02 11:39:35 crc kubenswrapper[4766]: I1002 11:39:35.072382 4766 scope.go:117] "RemoveContainer" containerID="8b6951232682990db7f769ad4c3a2719cb410946a407a97e3d1e329bad75720f" Oct 02 11:39:35 crc kubenswrapper[4766]: I1002 11:39:35.072613 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9nmkc" Oct 02 11:39:35 crc kubenswrapper[4766]: I1002 11:39:35.105117 4766 scope.go:117] "RemoveContainer" containerID="0da873237a701e9838ba3839316f456b28f9748937fc775ff5c3b63bad874fd3" Oct 02 11:39:35 crc kubenswrapper[4766]: I1002 11:39:35.111903 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9nmkc"] Oct 02 11:39:35 crc kubenswrapper[4766]: I1002 11:39:35.116787 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9nmkc"] Oct 02 11:39:35 crc kubenswrapper[4766]: I1002 11:39:35.145253 4766 scope.go:117] "RemoveContainer" containerID="85531d3ea1375eb0a5fc5d1aea22b2b15079d64b9ba8bbfc8ea08b076eae32b0" Oct 02 11:39:35 crc kubenswrapper[4766]: I1002 11:39:35.160443 4766 scope.go:117] "RemoveContainer" containerID="8b6951232682990db7f769ad4c3a2719cb410946a407a97e3d1e329bad75720f" Oct 02 11:39:35 crc kubenswrapper[4766]: E1002 11:39:35.161016 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b6951232682990db7f769ad4c3a2719cb410946a407a97e3d1e329bad75720f\": container with ID starting with 8b6951232682990db7f769ad4c3a2719cb410946a407a97e3d1e329bad75720f not found: ID does not exist" containerID="8b6951232682990db7f769ad4c3a2719cb410946a407a97e3d1e329bad75720f" Oct 02 11:39:35 crc kubenswrapper[4766]: I1002 11:39:35.161053 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b6951232682990db7f769ad4c3a2719cb410946a407a97e3d1e329bad75720f"} err="failed to get container status \"8b6951232682990db7f769ad4c3a2719cb410946a407a97e3d1e329bad75720f\": rpc error: code = NotFound desc = could not find container \"8b6951232682990db7f769ad4c3a2719cb410946a407a97e3d1e329bad75720f\": container with ID starting with 8b6951232682990db7f769ad4c3a2719cb410946a407a97e3d1e329bad75720f not found: ID does not exist" Oct 02 11:39:35 crc kubenswrapper[4766]: I1002 11:39:35.161083 4766 scope.go:117] "RemoveContainer" containerID="0da873237a701e9838ba3839316f456b28f9748937fc775ff5c3b63bad874fd3" Oct 02 11:39:35 crc kubenswrapper[4766]: E1002 11:39:35.161412 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da873237a701e9838ba3839316f456b28f9748937fc775ff5c3b63bad874fd3\": container with ID starting with 0da873237a701e9838ba3839316f456b28f9748937fc775ff5c3b63bad874fd3 not found: ID does not exist" containerID="0da873237a701e9838ba3839316f456b28f9748937fc775ff5c3b63bad874fd3" Oct 02 11:39:35 crc kubenswrapper[4766]: I1002 11:39:35.161465 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da873237a701e9838ba3839316f456b28f9748937fc775ff5c3b63bad874fd3"} err="failed to get container status \"0da873237a701e9838ba3839316f456b28f9748937fc775ff5c3b63bad874fd3\": rpc error: code = NotFound desc = could not find container \"0da873237a701e9838ba3839316f456b28f9748937fc775ff5c3b63bad874fd3\": container with ID starting with 0da873237a701e9838ba3839316f456b28f9748937fc775ff5c3b63bad874fd3 not found: ID does not exist" Oct 02 11:39:35 crc kubenswrapper[4766]: I1002 11:39:35.161493 4766 scope.go:117] "RemoveContainer" containerID="85531d3ea1375eb0a5fc5d1aea22b2b15079d64b9ba8bbfc8ea08b076eae32b0" Oct 02 11:39:35 crc kubenswrapper[4766]: E1002 11:39:35.161954 4766 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"85531d3ea1375eb0a5fc5d1aea22b2b15079d64b9ba8bbfc8ea08b076eae32b0\": container with ID starting with 85531d3ea1375eb0a5fc5d1aea22b2b15079d64b9ba8bbfc8ea08b076eae32b0 not found: ID does not exist" containerID="85531d3ea1375eb0a5fc5d1aea22b2b15079d64b9ba8bbfc8ea08b076eae32b0" Oct 02 11:39:35 crc kubenswrapper[4766]: I1002 11:39:35.161980 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85531d3ea1375eb0a5fc5d1aea22b2b15079d64b9ba8bbfc8ea08b076eae32b0"} err="failed to get container status \"85531d3ea1375eb0a5fc5d1aea22b2b15079d64b9ba8bbfc8ea08b076eae32b0\": rpc error: code = NotFound desc = could not find container \"85531d3ea1375eb0a5fc5d1aea22b2b15079d64b9ba8bbfc8ea08b076eae32b0\": container with ID starting with 85531d3ea1375eb0a5fc5d1aea22b2b15079d64b9ba8bbfc8ea08b076eae32b0 not found: ID does not exist" Oct 02 11:39:35 crc kubenswrapper[4766]: I1002 11:39:35.893249 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae15226-6e43-41f6-bc49-bf38cbc13fa4" path="/var/lib/kubelet/pods/4ae15226-6e43-41f6-bc49-bf38cbc13fa4/volumes" Oct 02 11:39:54 crc kubenswrapper[4766]: I1002 11:39:54.432573 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:39:54 crc kubenswrapper[4766]: I1002 11:39:54.434590 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:39:54 crc kubenswrapper[4766]: I1002 11:39:54.434759 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 11:39:54 crc kubenswrapper[4766]: I1002 11:39:54.435578 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:39:54 crc kubenswrapper[4766]: I1002 11:39:54.435751 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" gracePeriod=600 Oct 02 11:39:54 crc kubenswrapper[4766]: E1002 11:39:54.566293 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:39:55 crc kubenswrapper[4766]: I1002 11:39:55.207129 4766 generic.go:334] 
"Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" exitCode=0 Oct 02 11:39:55 crc kubenswrapper[4766]: I1002 11:39:55.207178 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4"} Oct 02 11:39:55 crc kubenswrapper[4766]: I1002 11:39:55.207216 4766 scope.go:117] "RemoveContainer" containerID="bfc7260bc86efb2d05b7d466e4b476e211caa5c18fb8377de83db8532e635774" Oct 02 11:39:55 crc kubenswrapper[4766]: I1002 11:39:55.207912 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:39:55 crc kubenswrapper[4766]: E1002 11:39:55.208111 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:40:08 crc kubenswrapper[4766]: I1002 11:40:08.881766 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:40:08 crc kubenswrapper[4766]: E1002 11:40:08.882560 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:40:20 crc kubenswrapper[4766]: I1002 11:40:20.881319 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:40:20 crc kubenswrapper[4766]: E1002 11:40:20.882318 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:40:35 crc kubenswrapper[4766]: I1002 11:40:35.892889 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:40:35 crc kubenswrapper[4766]: E1002 11:40:35.895782 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:40:50 crc kubenswrapper[4766]: I1002 11:40:50.880901 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" 
Oct 02 11:40:50 crc kubenswrapper[4766]: E1002 11:40:50.881670 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:41:01 crc kubenswrapper[4766]: I1002 11:41:01.881217 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:41:01 crc kubenswrapper[4766]: E1002 11:41:01.881917 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:41:12 crc kubenswrapper[4766]: I1002 11:41:12.882165 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:41:12 crc kubenswrapper[4766]: E1002 11:41:12.883318 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:41:23 crc kubenswrapper[4766]: I1002 11:41:23.880993 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:41:23 crc kubenswrapper[4766]: E1002 11:41:23.881787 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:41:35 crc kubenswrapper[4766]: I1002 11:41:35.886861 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:41:35 crc kubenswrapper[4766]: E1002 11:41:35.888846 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:41:48 crc kubenswrapper[4766]: I1002 11:41:48.882176 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:41:48 crc kubenswrapper[4766]: E1002 11:41:48.883390 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:42:00 crc kubenswrapper[4766]: I1002 11:42:00.881167 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:42:00 crc kubenswrapper[4766]: E1002 11:42:00.881963 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:42:11 crc kubenswrapper[4766]: I1002 11:42:11.881367 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:42:11 crc kubenswrapper[4766]: E1002 11:42:11.882092 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:42:26 crc kubenswrapper[4766]: I1002 11:42:26.881686 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:42:26 crc kubenswrapper[4766]: E1002 11:42:26.882421 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:42:29 crc kubenswrapper[4766]: I1002 11:42:29.600977 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-47wxz"] Oct 02 11:42:29 crc kubenswrapper[4766]: E1002 11:42:29.601993 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae15226-6e43-41f6-bc49-bf38cbc13fa4" containerName="registry-server" Oct 02 11:42:29 crc kubenswrapper[4766]: I1002 11:42:29.602012 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae15226-6e43-41f6-bc49-bf38cbc13fa4" containerName="registry-server" Oct 02 11:42:29 crc kubenswrapper[4766]: E1002 11:42:29.602047 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae15226-6e43-41f6-bc49-bf38cbc13fa4" containerName="extract-content" Oct 02 11:42:29 crc kubenswrapper[4766]: I1002 11:42:29.602197 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae15226-6e43-41f6-bc49-bf38cbc13fa4" containerName="extract-content" Oct 02 11:42:29 crc kubenswrapper[4766]: E1002 11:42:29.602275 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae15226-6e43-41f6-bc49-bf38cbc13fa4" containerName="extract-utilities" Oct 02 11:42:29 crc kubenswrapper[4766]: I1002 11:42:29.602286 
4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae15226-6e43-41f6-bc49-bf38cbc13fa4" containerName="extract-utilities" Oct 02 11:42:29 crc kubenswrapper[4766]: I1002 11:42:29.602543 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ae15226-6e43-41f6-bc49-bf38cbc13fa4" containerName="registry-server" Oct 02 11:42:29 crc kubenswrapper[4766]: I1002 11:42:29.603524 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:29 crc kubenswrapper[4766]: I1002 11:42:29.607857 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-47wxz"] Oct 02 11:42:29 crc kubenswrapper[4766]: I1002 11:42:29.707534 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f40dd706-bf49-4f80-9962-b5a90d930582-catalog-content\") pod \"redhat-marketplace-47wxz\" (UID: \"f40dd706-bf49-4f80-9962-b5a90d930582\") " pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:29 crc kubenswrapper[4766]: I1002 11:42:29.707589 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4txr\" (UniqueName: \"kubernetes.io/projected/f40dd706-bf49-4f80-9962-b5a90d930582-kube-api-access-k4txr\") pod \"redhat-marketplace-47wxz\" (UID: \"f40dd706-bf49-4f80-9962-b5a90d930582\") " pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:29 crc kubenswrapper[4766]: I1002 11:42:29.707865 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f40dd706-bf49-4f80-9962-b5a90d930582-utilities\") pod \"redhat-marketplace-47wxz\" (UID: \"f40dd706-bf49-4f80-9962-b5a90d930582\") " pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:29 crc kubenswrapper[4766]: I1002 11:42:29.808876 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f40dd706-bf49-4f80-9962-b5a90d930582-catalog-content\") pod \"redhat-marketplace-47wxz\" (UID: \"f40dd706-bf49-4f80-9962-b5a90d930582\") " pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:29 crc kubenswrapper[4766]: I1002 11:42:29.808945 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4txr\" (UniqueName: \"kubernetes.io/projected/f40dd706-bf49-4f80-9962-b5a90d930582-kube-api-access-k4txr\") pod \"redhat-marketplace-47wxz\" (UID: \"f40dd706-bf49-4f80-9962-b5a90d930582\") " pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:29 crc kubenswrapper[4766]: I1002 11:42:29.808972 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f40dd706-bf49-4f80-9962-b5a90d930582-utilities\") pod \"redhat-marketplace-47wxz\" (UID: \"f40dd706-bf49-4f80-9962-b5a90d930582\") " pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:29 crc kubenswrapper[4766]: I1002 11:42:29.809448 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f40dd706-bf49-4f80-9962-b5a90d930582-catalog-content\") pod \"redhat-marketplace-47wxz\" (UID: \"f40dd706-bf49-4f80-9962-b5a90d930582\") " pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:29 crc kubenswrapper[4766]: 
I1002 11:42:29.809482 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f40dd706-bf49-4f80-9962-b5a90d930582-utilities\") pod \"redhat-marketplace-47wxz\" (UID: \"f40dd706-bf49-4f80-9962-b5a90d930582\") " pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:29 crc kubenswrapper[4766]: I1002 11:42:29.831224 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4txr\" (UniqueName: \"kubernetes.io/projected/f40dd706-bf49-4f80-9962-b5a90d930582-kube-api-access-k4txr\") pod \"redhat-marketplace-47wxz\" (UID: \"f40dd706-bf49-4f80-9962-b5a90d930582\") " pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:29 crc kubenswrapper[4766]: I1002 11:42:29.929957 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:30 crc kubenswrapper[4766]: I1002 11:42:30.343653 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-47wxz"] Oct 02 11:42:31 crc kubenswrapper[4766]: I1002 11:42:31.283908 4766 generic.go:334] "Generic (PLEG): container finished" podID="f40dd706-bf49-4f80-9962-b5a90d930582" containerID="69902430ede64db0081dbb1d413bbb609c39d13948ae40faa49abab488f1f8a7" exitCode=0 Oct 02 11:42:31 crc kubenswrapper[4766]: I1002 11:42:31.284078 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47wxz" event={"ID":"f40dd706-bf49-4f80-9962-b5a90d930582","Type":"ContainerDied","Data":"69902430ede64db0081dbb1d413bbb609c39d13948ae40faa49abab488f1f8a7"} Oct 02 11:42:31 crc kubenswrapper[4766]: I1002 11:42:31.284257 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47wxz" event={"ID":"f40dd706-bf49-4f80-9962-b5a90d930582","Type":"ContainerStarted","Data":"499b169742c859ec10e4c8c7c7d6e96dabedb577aa752fee2e7da49f202934af"} Oct 02 11:42:31 crc kubenswrapper[4766]: I1002 11:42:31.994309 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8phfk"] Oct 02 11:42:31 crc kubenswrapper[4766]: I1002 11:42:31.996325 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:32 crc kubenswrapper[4766]: I1002 11:42:32.011856 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8phfk"] Oct 02 11:42:32 crc kubenswrapper[4766]: I1002 11:42:32.058315 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4c7q\" (UniqueName: \"kubernetes.io/projected/598c560f-cd64-46db-8b8b-48b698038890-kube-api-access-z4c7q\") pod \"certified-operators-8phfk\" (UID: \"598c560f-cd64-46db-8b8b-48b698038890\") " pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:32 crc kubenswrapper[4766]: I1002 11:42:32.058436 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/598c560f-cd64-46db-8b8b-48b698038890-catalog-content\") pod \"certified-operators-8phfk\" (UID: \"598c560f-cd64-46db-8b8b-48b698038890\") " pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:32 crc kubenswrapper[4766]: I1002 11:42:32.058485 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/598c560f-cd64-46db-8b8b-48b698038890-utilities\") pod \"certified-operators-8phfk\" (UID: \"598c560f-cd64-46db-8b8b-48b698038890\") " pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:32 crc kubenswrapper[4766]: I1002 11:42:32.159593 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/598c560f-cd64-46db-8b8b-48b698038890-utilities\") pod \"certified-operators-8phfk\" (UID: \"598c560f-cd64-46db-8b8b-48b698038890\") " pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:32 crc kubenswrapper[4766]: I1002 11:42:32.160080 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4c7q\" (UniqueName: \"kubernetes.io/projected/598c560f-cd64-46db-8b8b-48b698038890-kube-api-access-z4c7q\") pod \"certified-operators-8phfk\" (UID: \"598c560f-cd64-46db-8b8b-48b698038890\") " pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:32 crc kubenswrapper[4766]: I1002 11:42:32.160139 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/598c560f-cd64-46db-8b8b-48b698038890-utilities\") pod \"certified-operators-8phfk\" (UID: \"598c560f-cd64-46db-8b8b-48b698038890\") " pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:32 crc kubenswrapper[4766]: I1002 11:42:32.160406 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/598c560f-cd64-46db-8b8b-48b698038890-catalog-content\") pod \"certified-operators-8phfk\" (UID: \"598c560f-cd64-46db-8b8b-48b698038890\") " pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:32 crc kubenswrapper[4766]: I1002 11:42:32.160741 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/598c560f-cd64-46db-8b8b-48b698038890-catalog-content\") pod \"certified-operators-8phfk\" (UID: \"598c560f-cd64-46db-8b8b-48b698038890\") " pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:32 crc kubenswrapper[4766]: I1002 11:42:32.181942 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-z4c7q\" (UniqueName: \"kubernetes.io/projected/598c560f-cd64-46db-8b8b-48b698038890-kube-api-access-z4c7q\") pod \"certified-operators-8phfk\" (UID: \"598c560f-cd64-46db-8b8b-48b698038890\") " pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:32 crc kubenswrapper[4766]: I1002 11:42:32.303415 4766 generic.go:334] "Generic (PLEG): container finished" podID="f40dd706-bf49-4f80-9962-b5a90d930582" containerID="cbab6310242ad0e68dcdbfb24d74c3122fdf2a6572383a811a8913ee4492aedd" exitCode=0 Oct 02 11:42:32 crc kubenswrapper[4766]: I1002 11:42:32.303461 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47wxz" event={"ID":"f40dd706-bf49-4f80-9962-b5a90d930582","Type":"ContainerDied","Data":"cbab6310242ad0e68dcdbfb24d74c3122fdf2a6572383a811a8913ee4492aedd"} Oct 02 11:42:32 crc kubenswrapper[4766]: I1002 11:42:32.326428 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:32 crc kubenswrapper[4766]: I1002 11:42:32.814030 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8phfk"] Oct 02 11:42:33 crc kubenswrapper[4766]: I1002 11:42:33.310812 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47wxz" event={"ID":"f40dd706-bf49-4f80-9962-b5a90d930582","Type":"ContainerStarted","Data":"9f94ab57e182a7d159608fd5fcabdd628de9e76f0c094a2dbd59850edfbaaef6"} Oct 02 11:42:33 crc kubenswrapper[4766]: I1002 11:42:33.314929 4766 generic.go:334] "Generic (PLEG): container finished" podID="598c560f-cd64-46db-8b8b-48b698038890" containerID="fdc2d66ec146c4a39cadefff6c1a302a83fe52137d822e26a731f587a4698071" exitCode=0 Oct 02 11:42:33 crc kubenswrapper[4766]: I1002 11:42:33.314975 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8phfk" event={"ID":"598c560f-cd64-46db-8b8b-48b698038890","Type":"ContainerDied","Data":"fdc2d66ec146c4a39cadefff6c1a302a83fe52137d822e26a731f587a4698071"} Oct 02 11:42:33 crc kubenswrapper[4766]: I1002 11:42:33.315011 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8phfk" event={"ID":"598c560f-cd64-46db-8b8b-48b698038890","Type":"ContainerStarted","Data":"8889afde0e97b257c91a3537c42440020709c4282742b3d6b398dbe65b8b0600"} Oct 02 11:42:33 crc kubenswrapper[4766]: I1002 11:42:33.334095 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-47wxz" podStartSLOduration=2.677501826 podStartE2EDuration="4.334076685s" podCreationTimestamp="2025-10-02 11:42:29 +0000 UTC" firstStartedPulling="2025-10-02 11:42:31.285724675 +0000 UTC m=+3066.228595619" lastFinishedPulling="2025-10-02 11:42:32.942299534 +0000 UTC m=+3067.885170478" observedRunningTime="2025-10-02 11:42:33.333198046 +0000 UTC m=+3068.276069010" watchObservedRunningTime="2025-10-02 11:42:33.334076685 +0000 UTC m=+3068.276947629" Oct 02 11:42:34 crc kubenswrapper[4766]: I1002 11:42:34.323692 4766 generic.go:334] "Generic (PLEG): container finished" podID="598c560f-cd64-46db-8b8b-48b698038890" containerID="2985cfef38d655502917b42f9b9b5eebb8003fb63b3b91d7a54ad16d4de5564f" exitCode=0 Oct 02 11:42:34 crc kubenswrapper[4766]: I1002 11:42:34.323784 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8phfk" 
event={"ID":"598c560f-cd64-46db-8b8b-48b698038890","Type":"ContainerDied","Data":"2985cfef38d655502917b42f9b9b5eebb8003fb63b3b91d7a54ad16d4de5564f"} Oct 02 11:42:35 crc kubenswrapper[4766]: I1002 11:42:35.332883 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8phfk" event={"ID":"598c560f-cd64-46db-8b8b-48b698038890","Type":"ContainerStarted","Data":"3ba01d2171ef341f6f8da584dafbbd78ddf937eca4c3217af892449d25a4de86"} Oct 02 11:42:35 crc kubenswrapper[4766]: I1002 11:42:35.353464 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8phfk" podStartSLOduration=2.949536916 podStartE2EDuration="4.353448397s" podCreationTimestamp="2025-10-02 11:42:31 +0000 UTC" firstStartedPulling="2025-10-02 11:42:33.316187392 +0000 UTC m=+3068.259058336" lastFinishedPulling="2025-10-02 11:42:34.720098873 +0000 UTC m=+3069.662969817" observedRunningTime="2025-10-02 11:42:35.349691056 +0000 UTC m=+3070.292562010" watchObservedRunningTime="2025-10-02 11:42:35.353448397 +0000 UTC m=+3070.296319341" Oct 02 11:42:39 crc kubenswrapper[4766]: I1002 11:42:39.880975 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:42:39 crc kubenswrapper[4766]: E1002 11:42:39.881842 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:42:39 crc kubenswrapper[4766]: I1002 11:42:39.931331 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:39 crc kubenswrapper[4766]: I1002 11:42:39.931453 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:39 crc kubenswrapper[4766]: I1002 11:42:39.970994 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:40 crc kubenswrapper[4766]: I1002 11:42:40.411789 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:40 crc kubenswrapper[4766]: I1002 11:42:40.460474 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-47wxz"] Oct 02 11:42:42 crc kubenswrapper[4766]: I1002 11:42:42.326813 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:42 crc kubenswrapper[4766]: I1002 11:42:42.326861 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:42 crc kubenswrapper[4766]: I1002 11:42:42.371409 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:42 crc kubenswrapper[4766]: I1002 11:42:42.378134 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-47wxz" podUID="f40dd706-bf49-4f80-9962-b5a90d930582" containerName="registry-server" 
containerID="cri-o://9f94ab57e182a7d159608fd5fcabdd628de9e76f0c094a2dbd59850edfbaaef6" gracePeriod=2 Oct 02 11:42:42 crc kubenswrapper[4766]: I1002 11:42:42.424477 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:42 crc kubenswrapper[4766]: I1002 11:42:42.743851 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:42 crc kubenswrapper[4766]: I1002 11:42:42.834353 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f40dd706-bf49-4f80-9962-b5a90d930582-utilities\") pod \"f40dd706-bf49-4f80-9962-b5a90d930582\" (UID: \"f40dd706-bf49-4f80-9962-b5a90d930582\") " Oct 02 11:42:42 crc kubenswrapper[4766]: I1002 11:42:42.834553 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f40dd706-bf49-4f80-9962-b5a90d930582-catalog-content\") pod \"f40dd706-bf49-4f80-9962-b5a90d930582\" (UID: \"f40dd706-bf49-4f80-9962-b5a90d930582\") " Oct 02 11:42:42 crc kubenswrapper[4766]: I1002 11:42:42.834641 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4txr\" (UniqueName: \"kubernetes.io/projected/f40dd706-bf49-4f80-9962-b5a90d930582-kube-api-access-k4txr\") pod \"f40dd706-bf49-4f80-9962-b5a90d930582\" (UID: \"f40dd706-bf49-4f80-9962-b5a90d930582\") " Oct 02 11:42:42 crc kubenswrapper[4766]: I1002 11:42:42.835648 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f40dd706-bf49-4f80-9962-b5a90d930582-utilities" (OuterVolumeSpecName: "utilities") pod "f40dd706-bf49-4f80-9962-b5a90d930582" (UID: "f40dd706-bf49-4f80-9962-b5a90d930582"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:42:42 crc kubenswrapper[4766]: I1002 11:42:42.840603 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f40dd706-bf49-4f80-9962-b5a90d930582-kube-api-access-k4txr" (OuterVolumeSpecName: "kube-api-access-k4txr") pod "f40dd706-bf49-4f80-9962-b5a90d930582" (UID: "f40dd706-bf49-4f80-9962-b5a90d930582"). InnerVolumeSpecName "kube-api-access-k4txr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:42:42 crc kubenswrapper[4766]: I1002 11:42:42.850098 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f40dd706-bf49-4f80-9962-b5a90d930582-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f40dd706-bf49-4f80-9962-b5a90d930582" (UID: "f40dd706-bf49-4f80-9962-b5a90d930582"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:42:42 crc kubenswrapper[4766]: I1002 11:42:42.937135 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f40dd706-bf49-4f80-9962-b5a90d930582-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:42:42 crc kubenswrapper[4766]: I1002 11:42:42.937559 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f40dd706-bf49-4f80-9962-b5a90d930582-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:42:42 crc kubenswrapper[4766]: I1002 11:42:42.937578 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4txr\" (UniqueName: \"kubernetes.io/projected/f40dd706-bf49-4f80-9962-b5a90d930582-kube-api-access-k4txr\") on node \"crc\" DevicePath \"\"" Oct 02 11:42:43 crc kubenswrapper[4766]: I1002 11:42:43.387820 4766 generic.go:334] "Generic (PLEG): container finished" podID="f40dd706-bf49-4f80-9962-b5a90d930582" containerID="9f94ab57e182a7d159608fd5fcabdd628de9e76f0c094a2dbd59850edfbaaef6" exitCode=0 Oct 02 11:42:43 crc kubenswrapper[4766]: I1002 11:42:43.387878 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-47wxz" Oct 02 11:42:43 crc kubenswrapper[4766]: I1002 11:42:43.387926 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47wxz" event={"ID":"f40dd706-bf49-4f80-9962-b5a90d930582","Type":"ContainerDied","Data":"9f94ab57e182a7d159608fd5fcabdd628de9e76f0c094a2dbd59850edfbaaef6"} Oct 02 11:42:43 crc kubenswrapper[4766]: I1002 11:42:43.387977 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-47wxz" event={"ID":"f40dd706-bf49-4f80-9962-b5a90d930582","Type":"ContainerDied","Data":"499b169742c859ec10e4c8c7c7d6e96dabedb577aa752fee2e7da49f202934af"} Oct 02 11:42:43 crc kubenswrapper[4766]: I1002 11:42:43.388014 4766 scope.go:117] "RemoveContainer" containerID="9f94ab57e182a7d159608fd5fcabdd628de9e76f0c094a2dbd59850edfbaaef6" Oct 02 11:42:43 crc kubenswrapper[4766]: I1002 11:42:43.419478 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-47wxz"] Oct 02 11:42:43 crc kubenswrapper[4766]: I1002 11:42:43.423776 4766 scope.go:117] "RemoveContainer" containerID="cbab6310242ad0e68dcdbfb24d74c3122fdf2a6572383a811a8913ee4492aedd" Oct 02 11:42:43 crc kubenswrapper[4766]: I1002 11:42:43.428171 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-47wxz"] Oct 02 11:42:43 crc kubenswrapper[4766]: I1002 11:42:43.440425 4766 scope.go:117] "RemoveContainer" containerID="69902430ede64db0081dbb1d413bbb609c39d13948ae40faa49abab488f1f8a7" Oct 02 11:42:43 crc kubenswrapper[4766]: I1002 11:42:43.466284 4766 scope.go:117] "RemoveContainer" containerID="9f94ab57e182a7d159608fd5fcabdd628de9e76f0c094a2dbd59850edfbaaef6" Oct 02 11:42:43 crc kubenswrapper[4766]: E1002 11:42:43.466889 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f94ab57e182a7d159608fd5fcabdd628de9e76f0c094a2dbd59850edfbaaef6\": container with ID starting with 9f94ab57e182a7d159608fd5fcabdd628de9e76f0c094a2dbd59850edfbaaef6 not found: ID does not exist" containerID="9f94ab57e182a7d159608fd5fcabdd628de9e76f0c094a2dbd59850edfbaaef6" Oct 02 11:42:43 crc kubenswrapper[4766]: I1002 11:42:43.466930 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f94ab57e182a7d159608fd5fcabdd628de9e76f0c094a2dbd59850edfbaaef6"} err="failed to get container status \"9f94ab57e182a7d159608fd5fcabdd628de9e76f0c094a2dbd59850edfbaaef6\": rpc error: code = NotFound desc = could not find container \"9f94ab57e182a7d159608fd5fcabdd628de9e76f0c094a2dbd59850edfbaaef6\": container with ID starting with 9f94ab57e182a7d159608fd5fcabdd628de9e76f0c094a2dbd59850edfbaaef6 not found: ID does not exist" Oct 02 11:42:43 crc kubenswrapper[4766]: I1002 11:42:43.466957 4766 scope.go:117] "RemoveContainer" containerID="cbab6310242ad0e68dcdbfb24d74c3122fdf2a6572383a811a8913ee4492aedd" Oct 02 11:42:43 crc kubenswrapper[4766]: E1002 11:42:43.467285 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbab6310242ad0e68dcdbfb24d74c3122fdf2a6572383a811a8913ee4492aedd\": container with ID starting with cbab6310242ad0e68dcdbfb24d74c3122fdf2a6572383a811a8913ee4492aedd not found: ID does not exist" containerID="cbab6310242ad0e68dcdbfb24d74c3122fdf2a6572383a811a8913ee4492aedd" Oct 02 11:42:43 crc kubenswrapper[4766]: I1002 11:42:43.467307 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbab6310242ad0e68dcdbfb24d74c3122fdf2a6572383a811a8913ee4492aedd"} err="failed to get container status \"cbab6310242ad0e68dcdbfb24d74c3122fdf2a6572383a811a8913ee4492aedd\": rpc error: code = NotFound desc = could not find container \"cbab6310242ad0e68dcdbfb24d74c3122fdf2a6572383a811a8913ee4492aedd\": container with ID starting with cbab6310242ad0e68dcdbfb24d74c3122fdf2a6572383a811a8913ee4492aedd not found: ID does not exist" Oct 02 11:42:43 crc kubenswrapper[4766]: I1002 11:42:43.467322 4766 scope.go:117] "RemoveContainer" containerID="69902430ede64db0081dbb1d413bbb609c39d13948ae40faa49abab488f1f8a7" Oct 02 11:42:43 crc kubenswrapper[4766]: E1002 11:42:43.467656 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69902430ede64db0081dbb1d413bbb609c39d13948ae40faa49abab488f1f8a7\": container with ID starting with 69902430ede64db0081dbb1d413bbb609c39d13948ae40faa49abab488f1f8a7 not found: ID does not exist" containerID="69902430ede64db0081dbb1d413bbb609c39d13948ae40faa49abab488f1f8a7" Oct 02 11:42:43 crc kubenswrapper[4766]: I1002 11:42:43.467746 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69902430ede64db0081dbb1d413bbb609c39d13948ae40faa49abab488f1f8a7"} err="failed to get container status \"69902430ede64db0081dbb1d413bbb609c39d13948ae40faa49abab488f1f8a7\": rpc error: code = NotFound desc = could not find container \"69902430ede64db0081dbb1d413bbb609c39d13948ae40faa49abab488f1f8a7\": container with ID starting with 69902430ede64db0081dbb1d413bbb609c39d13948ae40faa49abab488f1f8a7 not found: ID does not exist" Oct 02 11:42:43 crc kubenswrapper[4766]: I1002 11:42:43.889884 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f40dd706-bf49-4f80-9962-b5a90d930582" path="/var/lib/kubelet/pods/f40dd706-bf49-4f80-9962-b5a90d930582/volumes" Oct 02 11:42:44 crc kubenswrapper[4766]: I1002 11:42:44.403697 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8phfk"] Oct 02 11:42:44 crc kubenswrapper[4766]: I1002 11:42:44.403959 4766 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/certified-operators-8phfk" podUID="598c560f-cd64-46db-8b8b-48b698038890" containerName="registry-server" containerID="cri-o://3ba01d2171ef341f6f8da584dafbbd78ddf937eca4c3217af892449d25a4de86" gracePeriod=2 Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.052361 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.061841 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/598c560f-cd64-46db-8b8b-48b698038890-utilities\") pod \"598c560f-cd64-46db-8b8b-48b698038890\" (UID: \"598c560f-cd64-46db-8b8b-48b698038890\") " Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.061944 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/598c560f-cd64-46db-8b8b-48b698038890-catalog-content\") pod \"598c560f-cd64-46db-8b8b-48b698038890\" (UID: \"598c560f-cd64-46db-8b8b-48b698038890\") " Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.062069 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4c7q\" (UniqueName: \"kubernetes.io/projected/598c560f-cd64-46db-8b8b-48b698038890-kube-api-access-z4c7q\") pod \"598c560f-cd64-46db-8b8b-48b698038890\" (UID: \"598c560f-cd64-46db-8b8b-48b698038890\") " Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.065964 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/598c560f-cd64-46db-8b8b-48b698038890-utilities" (OuterVolumeSpecName: "utilities") pod "598c560f-cd64-46db-8b8b-48b698038890" (UID: "598c560f-cd64-46db-8b8b-48b698038890"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.077052 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598c560f-cd64-46db-8b8b-48b698038890-kube-api-access-z4c7q" (OuterVolumeSpecName: "kube-api-access-z4c7q") pod "598c560f-cd64-46db-8b8b-48b698038890" (UID: "598c560f-cd64-46db-8b8b-48b698038890"). InnerVolumeSpecName "kube-api-access-z4c7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.122172 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/598c560f-cd64-46db-8b8b-48b698038890-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "598c560f-cd64-46db-8b8b-48b698038890" (UID: "598c560f-cd64-46db-8b8b-48b698038890"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.164199 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4c7q\" (UniqueName: \"kubernetes.io/projected/598c560f-cd64-46db-8b8b-48b698038890-kube-api-access-z4c7q\") on node \"crc\" DevicePath \"\"" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.164244 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/598c560f-cd64-46db-8b8b-48b698038890-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.164255 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/598c560f-cd64-46db-8b8b-48b698038890-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.409880 4766 generic.go:334] "Generic (PLEG): container finished" podID="598c560f-cd64-46db-8b8b-48b698038890" containerID="3ba01d2171ef341f6f8da584dafbbd78ddf937eca4c3217af892449d25a4de86" exitCode=0 Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.409926 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8phfk" event={"ID":"598c560f-cd64-46db-8b8b-48b698038890","Type":"ContainerDied","Data":"3ba01d2171ef341f6f8da584dafbbd78ddf937eca4c3217af892449d25a4de86"} Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.409943 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8phfk" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.409965 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8phfk" event={"ID":"598c560f-cd64-46db-8b8b-48b698038890","Type":"ContainerDied","Data":"8889afde0e97b257c91a3537c42440020709c4282742b3d6b398dbe65b8b0600"} Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.409984 4766 scope.go:117] "RemoveContainer" containerID="3ba01d2171ef341f6f8da584dafbbd78ddf937eca4c3217af892449d25a4de86" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.431405 4766 scope.go:117] "RemoveContainer" containerID="2985cfef38d655502917b42f9b9b5eebb8003fb63b3b91d7a54ad16d4de5564f" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.440894 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8phfk"] Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.446357 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8phfk"] Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.464568 4766 scope.go:117] "RemoveContainer" containerID="fdc2d66ec146c4a39cadefff6c1a302a83fe52137d822e26a731f587a4698071" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.478203 4766 scope.go:117] "RemoveContainer" containerID="3ba01d2171ef341f6f8da584dafbbd78ddf937eca4c3217af892449d25a4de86" Oct 02 11:42:45 crc kubenswrapper[4766]: E1002 11:42:45.478708 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ba01d2171ef341f6f8da584dafbbd78ddf937eca4c3217af892449d25a4de86\": container with ID starting with 3ba01d2171ef341f6f8da584dafbbd78ddf937eca4c3217af892449d25a4de86 not found: ID does not exist" containerID="3ba01d2171ef341f6f8da584dafbbd78ddf937eca4c3217af892449d25a4de86" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.478797 
4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba01d2171ef341f6f8da584dafbbd78ddf937eca4c3217af892449d25a4de86"} err="failed to get container status \"3ba01d2171ef341f6f8da584dafbbd78ddf937eca4c3217af892449d25a4de86\": rpc error: code = NotFound desc = could not find container \"3ba01d2171ef341f6f8da584dafbbd78ddf937eca4c3217af892449d25a4de86\": container with ID starting with 3ba01d2171ef341f6f8da584dafbbd78ddf937eca4c3217af892449d25a4de86 not found: ID does not exist" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.478871 4766 scope.go:117] "RemoveContainer" containerID="2985cfef38d655502917b42f9b9b5eebb8003fb63b3b91d7a54ad16d4de5564f" Oct 02 11:42:45 crc kubenswrapper[4766]: E1002 11:42:45.479163 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2985cfef38d655502917b42f9b9b5eebb8003fb63b3b91d7a54ad16d4de5564f\": container with ID starting with 2985cfef38d655502917b42f9b9b5eebb8003fb63b3b91d7a54ad16d4de5564f not found: ID does not exist" containerID="2985cfef38d655502917b42f9b9b5eebb8003fb63b3b91d7a54ad16d4de5564f" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.479240 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2985cfef38d655502917b42f9b9b5eebb8003fb63b3b91d7a54ad16d4de5564f"} err="failed to get container status \"2985cfef38d655502917b42f9b9b5eebb8003fb63b3b91d7a54ad16d4de5564f\": rpc error: code = NotFound desc = could not find container \"2985cfef38d655502917b42f9b9b5eebb8003fb63b3b91d7a54ad16d4de5564f\": container with ID starting with 2985cfef38d655502917b42f9b9b5eebb8003fb63b3b91d7a54ad16d4de5564f not found: ID does not exist" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.479312 4766 scope.go:117] "RemoveContainer" containerID="fdc2d66ec146c4a39cadefff6c1a302a83fe52137d822e26a731f587a4698071" Oct 02 11:42:45 crc kubenswrapper[4766]: E1002 11:42:45.479844 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdc2d66ec146c4a39cadefff6c1a302a83fe52137d822e26a731f587a4698071\": container with ID starting with fdc2d66ec146c4a39cadefff6c1a302a83fe52137d822e26a731f587a4698071 not found: ID does not exist" containerID="fdc2d66ec146c4a39cadefff6c1a302a83fe52137d822e26a731f587a4698071" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.480087 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdc2d66ec146c4a39cadefff6c1a302a83fe52137d822e26a731f587a4698071"} err="failed to get container status \"fdc2d66ec146c4a39cadefff6c1a302a83fe52137d822e26a731f587a4698071\": rpc error: code = NotFound desc = could not find container \"fdc2d66ec146c4a39cadefff6c1a302a83fe52137d822e26a731f587a4698071\": container with ID starting with fdc2d66ec146c4a39cadefff6c1a302a83fe52137d822e26a731f587a4698071 not found: ID does not exist" Oct 02 11:42:45 crc kubenswrapper[4766]: I1002 11:42:45.893864 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="598c560f-cd64-46db-8b8b-48b698038890" path="/var/lib/kubelet/pods/598c560f-cd64-46db-8b8b-48b698038890/volumes" Oct 02 11:42:54 crc kubenswrapper[4766]: I1002 11:42:54.881970 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:42:54 crc kubenswrapper[4766]: E1002 11:42:54.882901 4766 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:43:07 crc kubenswrapper[4766]: I1002 11:43:07.882319 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:43:07 crc kubenswrapper[4766]: E1002 11:43:07.883910 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:43:18 crc kubenswrapper[4766]: I1002 11:43:18.880999 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:43:18 crc kubenswrapper[4766]: E1002 11:43:18.881891 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:43:33 crc kubenswrapper[4766]: I1002 11:43:33.881096 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:43:33 crc kubenswrapper[4766]: E1002 11:43:33.881853 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:43:45 crc kubenswrapper[4766]: I1002 11:43:45.884767 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:43:45 crc kubenswrapper[4766]: E1002 11:43:45.885561 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:43:56 crc kubenswrapper[4766]: I1002 11:43:56.881647 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:43:56 crc kubenswrapper[4766]: E1002 11:43:56.882357 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:44:08 crc kubenswrapper[4766]: I1002 11:44:08.881832 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:44:08 crc kubenswrapper[4766]: E1002 11:44:08.882596 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:44:19 crc kubenswrapper[4766]: I1002 11:44:19.881923 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:44:19 crc kubenswrapper[4766]: E1002 11:44:19.883166 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:44:33 crc kubenswrapper[4766]: I1002 11:44:33.881325 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:44:33 crc kubenswrapper[4766]: E1002 11:44:33.882331 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:44:44 crc kubenswrapper[4766]: I1002 11:44:44.881248 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:44:44 crc kubenswrapper[4766]: E1002 11:44:44.881938 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:44:56 crc kubenswrapper[4766]: I1002 11:44:56.881448 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:44:57 crc kubenswrapper[4766]: I1002 11:44:57.346137 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"e7876e2092fb7d7edb7c63bf74430daa8e39f342309027fd5fb71f100757fa15"} Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.175521 4766 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp"] Oct 02 11:45:00 crc kubenswrapper[4766]: E1002 11:45:00.176413 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f40dd706-bf49-4f80-9962-b5a90d930582" containerName="extract-utilities" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.176527 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f40dd706-bf49-4f80-9962-b5a90d930582" containerName="extract-utilities" Oct 02 11:45:00 crc kubenswrapper[4766]: E1002 11:45:00.176556 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f40dd706-bf49-4f80-9962-b5a90d930582" containerName="registry-server" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.176565 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f40dd706-bf49-4f80-9962-b5a90d930582" containerName="registry-server" Oct 02 11:45:00 crc kubenswrapper[4766]: E1002 11:45:00.176582 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598c560f-cd64-46db-8b8b-48b698038890" containerName="extract-utilities" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.176591 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="598c560f-cd64-46db-8b8b-48b698038890" containerName="extract-utilities" Oct 02 11:45:00 crc kubenswrapper[4766]: E1002 11:45:00.176602 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598c560f-cd64-46db-8b8b-48b698038890" containerName="registry-server" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.176610 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="598c560f-cd64-46db-8b8b-48b698038890" containerName="registry-server" Oct 02 11:45:00 crc kubenswrapper[4766]: E1002 11:45:00.176623 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598c560f-cd64-46db-8b8b-48b698038890" containerName="extract-content" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.176629 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="598c560f-cd64-46db-8b8b-48b698038890" containerName="extract-content" Oct 02 11:45:00 crc kubenswrapper[4766]: E1002 11:45:00.176654 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f40dd706-bf49-4f80-9962-b5a90d930582" containerName="extract-content" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.176661 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f40dd706-bf49-4f80-9962-b5a90d930582" containerName="extract-content" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.176828 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f40dd706-bf49-4f80-9962-b5a90d930582" containerName="registry-server" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.176842 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="598c560f-cd64-46db-8b8b-48b698038890" containerName="registry-server" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.177420 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.184397 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.185188 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.190194 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp"] Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.263354 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dd6eb36-2135-475c-9b70-610546403d3c-secret-volume\") pod \"collect-profiles-29323425-z7vrp\" (UID: \"1dd6eb36-2135-475c-9b70-610546403d3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.263416 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dd6eb36-2135-475c-9b70-610546403d3c-config-volume\") pod \"collect-profiles-29323425-z7vrp\" (UID: \"1dd6eb36-2135-475c-9b70-610546403d3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.263464 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rf5f\" (UniqueName: \"kubernetes.io/projected/1dd6eb36-2135-475c-9b70-610546403d3c-kube-api-access-4rf5f\") pod \"collect-profiles-29323425-z7vrp\" (UID: \"1dd6eb36-2135-475c-9b70-610546403d3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.364331 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rf5f\" (UniqueName: \"kubernetes.io/projected/1dd6eb36-2135-475c-9b70-610546403d3c-kube-api-access-4rf5f\") pod \"collect-profiles-29323425-z7vrp\" (UID: \"1dd6eb36-2135-475c-9b70-610546403d3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.364417 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dd6eb36-2135-475c-9b70-610546403d3c-secret-volume\") pod \"collect-profiles-29323425-z7vrp\" (UID: \"1dd6eb36-2135-475c-9b70-610546403d3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.364451 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dd6eb36-2135-475c-9b70-610546403d3c-config-volume\") pod \"collect-profiles-29323425-z7vrp\" (UID: \"1dd6eb36-2135-475c-9b70-610546403d3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.365642 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dd6eb36-2135-475c-9b70-610546403d3c-config-volume\") pod 
\"collect-profiles-29323425-z7vrp\" (UID: \"1dd6eb36-2135-475c-9b70-610546403d3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.372677 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dd6eb36-2135-475c-9b70-610546403d3c-secret-volume\") pod \"collect-profiles-29323425-z7vrp\" (UID: \"1dd6eb36-2135-475c-9b70-610546403d3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.385834 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rf5f\" (UniqueName: \"kubernetes.io/projected/1dd6eb36-2135-475c-9b70-610546403d3c-kube-api-access-4rf5f\") pod \"collect-profiles-29323425-z7vrp\" (UID: \"1dd6eb36-2135-475c-9b70-610546403d3c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.502641 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp" Oct 02 11:45:00 crc kubenswrapper[4766]: I1002 11:45:00.943169 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp"] Oct 02 11:45:00 crc kubenswrapper[4766]: W1002 11:45:00.946237 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dd6eb36_2135_475c_9b70_610546403d3c.slice/crio-f489f0b3bb36393c2204001d385d2d8655689cc441432fc0c4abb2f74c098675 WatchSource:0}: Error finding container f489f0b3bb36393c2204001d385d2d8655689cc441432fc0c4abb2f74c098675: Status 404 returned error can't find the container with id f489f0b3bb36393c2204001d385d2d8655689cc441432fc0c4abb2f74c098675 Oct 02 11:45:01 crc kubenswrapper[4766]: I1002 11:45:01.376463 4766 generic.go:334] "Generic (PLEG): container finished" podID="1dd6eb36-2135-475c-9b70-610546403d3c" containerID="e2c38d23bfeef4e060c7affdc4da706d2f328cbf308777ca9b947ee92af07a66" exitCode=0 Oct 02 11:45:01 crc kubenswrapper[4766]: I1002 11:45:01.376610 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp" event={"ID":"1dd6eb36-2135-475c-9b70-610546403d3c","Type":"ContainerDied","Data":"e2c38d23bfeef4e060c7affdc4da706d2f328cbf308777ca9b947ee92af07a66"} Oct 02 11:45:01 crc kubenswrapper[4766]: I1002 11:45:01.376911 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp" event={"ID":"1dd6eb36-2135-475c-9b70-610546403d3c","Type":"ContainerStarted","Data":"f489f0b3bb36393c2204001d385d2d8655689cc441432fc0c4abb2f74c098675"} Oct 02 11:45:02 crc kubenswrapper[4766]: I1002 11:45:02.688030 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp" Oct 02 11:45:02 crc kubenswrapper[4766]: I1002 11:45:02.797361 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dd6eb36-2135-475c-9b70-610546403d3c-config-volume\") pod \"1dd6eb36-2135-475c-9b70-610546403d3c\" (UID: \"1dd6eb36-2135-475c-9b70-610546403d3c\") " Oct 02 11:45:02 crc kubenswrapper[4766]: I1002 11:45:02.797417 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rf5f\" (UniqueName: \"kubernetes.io/projected/1dd6eb36-2135-475c-9b70-610546403d3c-kube-api-access-4rf5f\") pod \"1dd6eb36-2135-475c-9b70-610546403d3c\" (UID: \"1dd6eb36-2135-475c-9b70-610546403d3c\") " Oct 02 11:45:02 crc kubenswrapper[4766]: I1002 11:45:02.797466 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dd6eb36-2135-475c-9b70-610546403d3c-secret-volume\") pod \"1dd6eb36-2135-475c-9b70-610546403d3c\" (UID: \"1dd6eb36-2135-475c-9b70-610546403d3c\") " Oct 02 11:45:02 crc kubenswrapper[4766]: I1002 11:45:02.799186 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dd6eb36-2135-475c-9b70-610546403d3c-config-volume" (OuterVolumeSpecName: "config-volume") pod "1dd6eb36-2135-475c-9b70-610546403d3c" (UID: "1dd6eb36-2135-475c-9b70-610546403d3c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 11:45:02 crc kubenswrapper[4766]: I1002 11:45:02.811769 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd6eb36-2135-475c-9b70-610546403d3c-kube-api-access-4rf5f" (OuterVolumeSpecName: "kube-api-access-4rf5f") pod "1dd6eb36-2135-475c-9b70-610546403d3c" (UID: "1dd6eb36-2135-475c-9b70-610546403d3c"). InnerVolumeSpecName "kube-api-access-4rf5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:45:02 crc kubenswrapper[4766]: I1002 11:45:02.811782 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dd6eb36-2135-475c-9b70-610546403d3c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1dd6eb36-2135-475c-9b70-610546403d3c" (UID: "1dd6eb36-2135-475c-9b70-610546403d3c"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 11:45:02 crc kubenswrapper[4766]: I1002 11:45:02.899616 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dd6eb36-2135-475c-9b70-610546403d3c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:02 crc kubenswrapper[4766]: I1002 11:45:02.899648 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rf5f\" (UniqueName: \"kubernetes.io/projected/1dd6eb36-2135-475c-9b70-610546403d3c-kube-api-access-4rf5f\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:02 crc kubenswrapper[4766]: I1002 11:45:02.899660 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1dd6eb36-2135-475c-9b70-610546403d3c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 11:45:03 crc kubenswrapper[4766]: I1002 11:45:03.391832 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp" event={"ID":"1dd6eb36-2135-475c-9b70-610546403d3c","Type":"ContainerDied","Data":"f489f0b3bb36393c2204001d385d2d8655689cc441432fc0c4abb2f74c098675"} Oct 02 11:45:03 crc kubenswrapper[4766]: I1002 11:45:03.391868 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f489f0b3bb36393c2204001d385d2d8655689cc441432fc0c4abb2f74c098675" Oct 02 11:45:03 crc kubenswrapper[4766]: I1002 11:45:03.391910 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp" Oct 02 11:45:03 crc kubenswrapper[4766]: I1002 11:45:03.777566 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l"] Oct 02 11:45:03 crc kubenswrapper[4766]: I1002 11:45:03.783432 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323380-rvn2l"] Oct 02 11:45:03 crc kubenswrapper[4766]: I1002 11:45:03.894495 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="281a6f64-2f0a-4020-9b9d-af55767e345d" path="/var/lib/kubelet/pods/281a6f64-2f0a-4020-9b9d-af55767e345d/volumes" Oct 02 11:45:34 crc kubenswrapper[4766]: I1002 11:45:34.058111 4766 scope.go:117] "RemoveContainer" containerID="3e43d0f68fbfaa2ef85508b4393f1f16a73976e3c105f57a2cbaf66524ebe833" Oct 02 11:47:24 crc kubenswrapper[4766]: I1002 11:47:24.432074 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:47:24 crc kubenswrapper[4766]: I1002 11:47:24.432630 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:47:54 crc kubenswrapper[4766]: I1002 11:47:54.432394 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Oct 02 11:47:54 crc kubenswrapper[4766]: I1002 11:47:54.432934 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:48:24 crc kubenswrapper[4766]: I1002 11:48:24.434200 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:48:24 crc kubenswrapper[4766]: I1002 11:48:24.434895 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:48:24 crc kubenswrapper[4766]: I1002 11:48:24.434961 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 11:48:24 crc kubenswrapper[4766]: I1002 11:48:24.435689 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e7876e2092fb7d7edb7c63bf74430daa8e39f342309027fd5fb71f100757fa15"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:48:24 crc kubenswrapper[4766]: I1002 11:48:24.435752 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://e7876e2092fb7d7edb7c63bf74430daa8e39f342309027fd5fb71f100757fa15" gracePeriod=600 Oct 02 11:48:24 crc kubenswrapper[4766]: I1002 11:48:24.805494 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="e7876e2092fb7d7edb7c63bf74430daa8e39f342309027fd5fb71f100757fa15" exitCode=0 Oct 02 11:48:24 crc kubenswrapper[4766]: I1002 11:48:24.805624 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"e7876e2092fb7d7edb7c63bf74430daa8e39f342309027fd5fb71f100757fa15"} Oct 02 11:48:24 crc kubenswrapper[4766]: I1002 11:48:24.805879 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba"} Oct 02 11:48:24 crc kubenswrapper[4766]: I1002 11:48:24.805903 4766 scope.go:117] "RemoveContainer" containerID="a2d55c28d7804067621d0ec280f2dddfe8e69fe0433db3dd40d8f553acf584c4" Oct 02 11:50:24 crc kubenswrapper[4766]: I1002 11:50:24.432303 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:50:24 crc kubenswrapper[4766]: I1002 11:50:24.433797 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:50:48 crc kubenswrapper[4766]: I1002 11:50:48.801721 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nk4nt"] Oct 02 11:50:48 crc kubenswrapper[4766]: E1002 11:50:48.804257 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd6eb36-2135-475c-9b70-610546403d3c" containerName="collect-profiles" Oct 02 11:50:48 crc kubenswrapper[4766]: I1002 11:50:48.804272 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd6eb36-2135-475c-9b70-610546403d3c" containerName="collect-profiles" Oct 02 11:50:48 crc kubenswrapper[4766]: I1002 11:50:48.804436 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd6eb36-2135-475c-9b70-610546403d3c" containerName="collect-profiles" Oct 02 11:50:48 crc kubenswrapper[4766]: I1002 11:50:48.805495 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:50:48 crc kubenswrapper[4766]: I1002 11:50:48.808146 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nk4nt"] Oct 02 11:50:48 crc kubenswrapper[4766]: I1002 11:50:48.878113 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e6194e4-e0c2-4946-b727-3fc3310f117c-catalog-content\") pod \"community-operators-nk4nt\" (UID: \"2e6194e4-e0c2-4946-b727-3fc3310f117c\") " pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:50:48 crc kubenswrapper[4766]: I1002 11:50:48.878437 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxp5d\" (UniqueName: \"kubernetes.io/projected/2e6194e4-e0c2-4946-b727-3fc3310f117c-kube-api-access-sxp5d\") pod \"community-operators-nk4nt\" (UID: \"2e6194e4-e0c2-4946-b727-3fc3310f117c\") " pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:50:48 crc kubenswrapper[4766]: I1002 11:50:48.878467 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e6194e4-e0c2-4946-b727-3fc3310f117c-utilities\") pod \"community-operators-nk4nt\" (UID: \"2e6194e4-e0c2-4946-b727-3fc3310f117c\") " pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:50:48 crc kubenswrapper[4766]: I1002 11:50:48.980053 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e6194e4-e0c2-4946-b727-3fc3310f117c-utilities\") pod \"community-operators-nk4nt\" (UID: \"2e6194e4-e0c2-4946-b727-3fc3310f117c\") " pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:50:48 crc kubenswrapper[4766]: I1002 11:50:48.980161 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e6194e4-e0c2-4946-b727-3fc3310f117c-catalog-content\") pod 
\"community-operators-nk4nt\" (UID: \"2e6194e4-e0c2-4946-b727-3fc3310f117c\") " pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:50:48 crc kubenswrapper[4766]: I1002 11:50:48.980214 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxp5d\" (UniqueName: \"kubernetes.io/projected/2e6194e4-e0c2-4946-b727-3fc3310f117c-kube-api-access-sxp5d\") pod \"community-operators-nk4nt\" (UID: \"2e6194e4-e0c2-4946-b727-3fc3310f117c\") " pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:50:48 crc kubenswrapper[4766]: I1002 11:50:48.980814 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e6194e4-e0c2-4946-b727-3fc3310f117c-utilities\") pod \"community-operators-nk4nt\" (UID: \"2e6194e4-e0c2-4946-b727-3fc3310f117c\") " pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:50:48 crc kubenswrapper[4766]: I1002 11:50:48.980880 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e6194e4-e0c2-4946-b727-3fc3310f117c-catalog-content\") pod \"community-operators-nk4nt\" (UID: \"2e6194e4-e0c2-4946-b727-3fc3310f117c\") " pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:50:49 crc kubenswrapper[4766]: I1002 11:50:49.000950 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxp5d\" (UniqueName: \"kubernetes.io/projected/2e6194e4-e0c2-4946-b727-3fc3310f117c-kube-api-access-sxp5d\") pod \"community-operators-nk4nt\" (UID: \"2e6194e4-e0c2-4946-b727-3fc3310f117c\") " pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:50:49 crc kubenswrapper[4766]: I1002 11:50:49.127395 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:50:49 crc kubenswrapper[4766]: I1002 11:50:49.663452 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nk4nt"] Oct 02 11:50:49 crc kubenswrapper[4766]: I1002 11:50:49.871640 4766 generic.go:334] "Generic (PLEG): container finished" podID="2e6194e4-e0c2-4946-b727-3fc3310f117c" containerID="22371b7a736fe2b8d3b3c2d6930922305bcfe6eb4cb4471d285a5cf95862c33f" exitCode=0 Oct 02 11:50:49 crc kubenswrapper[4766]: I1002 11:50:49.871837 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nk4nt" event={"ID":"2e6194e4-e0c2-4946-b727-3fc3310f117c","Type":"ContainerDied","Data":"22371b7a736fe2b8d3b3c2d6930922305bcfe6eb4cb4471d285a5cf95862c33f"} Oct 02 11:50:49 crc kubenswrapper[4766]: I1002 11:50:49.872071 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nk4nt" event={"ID":"2e6194e4-e0c2-4946-b727-3fc3310f117c","Type":"ContainerStarted","Data":"187b741f43599bbb4c9bffed36c8035b2e288358344db32e4d042305921527c7"} Oct 02 11:50:49 crc kubenswrapper[4766]: I1002 11:50:49.873579 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 11:50:51 crc kubenswrapper[4766]: I1002 11:50:51.888497 4766 generic.go:334] "Generic (PLEG): container finished" podID="2e6194e4-e0c2-4946-b727-3fc3310f117c" containerID="2660e5b50887531b03c10102130147f61d69ffabf11c4d537b02358db62ddfda" exitCode=0 Oct 02 11:50:51 crc kubenswrapper[4766]: I1002 11:50:51.890207 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nk4nt" event={"ID":"2e6194e4-e0c2-4946-b727-3fc3310f117c","Type":"ContainerDied","Data":"2660e5b50887531b03c10102130147f61d69ffabf11c4d537b02358db62ddfda"} Oct 02 11:50:52 crc kubenswrapper[4766]: I1002 11:50:52.899064 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nk4nt" event={"ID":"2e6194e4-e0c2-4946-b727-3fc3310f117c","Type":"ContainerStarted","Data":"9cc9ef3fa71e7e441bde7edfa8b19da672c973c23e0c8f661f90b96a5be77f66"} Oct 02 11:50:52 crc kubenswrapper[4766]: I1002 11:50:52.921096 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nk4nt" podStartSLOduration=2.5265579479999998 podStartE2EDuration="4.921068468s" podCreationTimestamp="2025-10-02 11:50:48 +0000 UTC" firstStartedPulling="2025-10-02 11:50:49.873280846 +0000 UTC m=+3564.816151790" lastFinishedPulling="2025-10-02 11:50:52.267791366 +0000 UTC m=+3567.210662310" observedRunningTime="2025-10-02 11:50:52.917220345 +0000 UTC m=+3567.860091299" watchObservedRunningTime="2025-10-02 11:50:52.921068468 +0000 UTC m=+3567.863939432" Oct 02 11:50:54 crc kubenswrapper[4766]: I1002 11:50:54.431939 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:50:54 crc kubenswrapper[4766]: I1002 11:50:54.432204 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:50:59 crc kubenswrapper[4766]: I1002 11:50:59.128724 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:50:59 crc kubenswrapper[4766]: I1002 11:50:59.129117 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:50:59 crc kubenswrapper[4766]: I1002 11:50:59.176944 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:50:59 crc kubenswrapper[4766]: I1002 11:50:59.990891 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:51:00 crc kubenswrapper[4766]: I1002 11:51:00.034657 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nk4nt"] Oct 02 11:51:01 crc kubenswrapper[4766]: I1002 11:51:01.961737 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nk4nt" podUID="2e6194e4-e0c2-4946-b727-3fc3310f117c" containerName="registry-server" containerID="cri-o://9cc9ef3fa71e7e441bde7edfa8b19da672c973c23e0c8f661f90b96a5be77f66" gracePeriod=2 Oct 02 11:51:02 crc kubenswrapper[4766]: I1002 11:51:02.355064 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:51:02 crc kubenswrapper[4766]: I1002 11:51:02.485727 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e6194e4-e0c2-4946-b727-3fc3310f117c-utilities\") pod \"2e6194e4-e0c2-4946-b727-3fc3310f117c\" (UID: \"2e6194e4-e0c2-4946-b727-3fc3310f117c\") " Oct 02 11:51:02 crc kubenswrapper[4766]: I1002 11:51:02.485855 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e6194e4-e0c2-4946-b727-3fc3310f117c-catalog-content\") pod \"2e6194e4-e0c2-4946-b727-3fc3310f117c\" (UID: \"2e6194e4-e0c2-4946-b727-3fc3310f117c\") " Oct 02 11:51:02 crc kubenswrapper[4766]: I1002 11:51:02.485950 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxp5d\" (UniqueName: \"kubernetes.io/projected/2e6194e4-e0c2-4946-b727-3fc3310f117c-kube-api-access-sxp5d\") pod \"2e6194e4-e0c2-4946-b727-3fc3310f117c\" (UID: \"2e6194e4-e0c2-4946-b727-3fc3310f117c\") " Oct 02 11:51:02 crc kubenswrapper[4766]: I1002 11:51:02.487608 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e6194e4-e0c2-4946-b727-3fc3310f117c-utilities" (OuterVolumeSpecName: "utilities") pod "2e6194e4-e0c2-4946-b727-3fc3310f117c" (UID: "2e6194e4-e0c2-4946-b727-3fc3310f117c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:51:02 crc kubenswrapper[4766]: I1002 11:51:02.497867 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e6194e4-e0c2-4946-b727-3fc3310f117c-kube-api-access-sxp5d" (OuterVolumeSpecName: "kube-api-access-sxp5d") pod "2e6194e4-e0c2-4946-b727-3fc3310f117c" (UID: "2e6194e4-e0c2-4946-b727-3fc3310f117c"). InnerVolumeSpecName "kube-api-access-sxp5d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:51:02 crc kubenswrapper[4766]: I1002 11:51:02.534858 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e6194e4-e0c2-4946-b727-3fc3310f117c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2e6194e4-e0c2-4946-b727-3fc3310f117c" (UID: "2e6194e4-e0c2-4946-b727-3fc3310f117c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:51:02 crc kubenswrapper[4766]: I1002 11:51:02.589226 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2e6194e4-e0c2-4946-b727-3fc3310f117c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:51:02 crc kubenswrapper[4766]: I1002 11:51:02.589285 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxp5d\" (UniqueName: \"kubernetes.io/projected/2e6194e4-e0c2-4946-b727-3fc3310f117c-kube-api-access-sxp5d\") on node \"crc\" DevicePath \"\"" Oct 02 11:51:02 crc kubenswrapper[4766]: I1002 11:51:02.589307 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2e6194e4-e0c2-4946-b727-3fc3310f117c-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:51:02 crc kubenswrapper[4766]: I1002 11:51:02.968963 4766 generic.go:334] "Generic (PLEG): container finished" podID="2e6194e4-e0c2-4946-b727-3fc3310f117c" containerID="9cc9ef3fa71e7e441bde7edfa8b19da672c973c23e0c8f661f90b96a5be77f66" exitCode=0 Oct 02 11:51:02 crc kubenswrapper[4766]: I1002 11:51:02.969005 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nk4nt" event={"ID":"2e6194e4-e0c2-4946-b727-3fc3310f117c","Type":"ContainerDied","Data":"9cc9ef3fa71e7e441bde7edfa8b19da672c973c23e0c8f661f90b96a5be77f66"} Oct 02 11:51:02 crc kubenswrapper[4766]: I1002 11:51:02.969022 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nk4nt" Oct 02 11:51:02 crc kubenswrapper[4766]: I1002 11:51:02.969041 4766 scope.go:117] "RemoveContainer" containerID="9cc9ef3fa71e7e441bde7edfa8b19da672c973c23e0c8f661f90b96a5be77f66" Oct 02 11:51:02 crc kubenswrapper[4766]: I1002 11:51:02.969029 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nk4nt" event={"ID":"2e6194e4-e0c2-4946-b727-3fc3310f117c","Type":"ContainerDied","Data":"187b741f43599bbb4c9bffed36c8035b2e288358344db32e4d042305921527c7"} Oct 02 11:51:02 crc kubenswrapper[4766]: I1002 11:51:02.986325 4766 scope.go:117] "RemoveContainer" containerID="2660e5b50887531b03c10102130147f61d69ffabf11c4d537b02358db62ddfda" Oct 02 11:51:03 crc kubenswrapper[4766]: I1002 11:51:03.000746 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nk4nt"] Oct 02 11:51:03 crc kubenswrapper[4766]: I1002 11:51:03.005759 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nk4nt"] Oct 02 11:51:03 crc kubenswrapper[4766]: I1002 11:51:03.020291 4766 scope.go:117] "RemoveContainer" containerID="22371b7a736fe2b8d3b3c2d6930922305bcfe6eb4cb4471d285a5cf95862c33f" Oct 02 11:51:03 crc kubenswrapper[4766]: I1002 11:51:03.037563 4766 scope.go:117] "RemoveContainer" containerID="9cc9ef3fa71e7e441bde7edfa8b19da672c973c23e0c8f661f90b96a5be77f66" Oct 02 11:51:03 crc kubenswrapper[4766]: E1002 11:51:03.038374 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cc9ef3fa71e7e441bde7edfa8b19da672c973c23e0c8f661f90b96a5be77f66\": container with ID starting with 9cc9ef3fa71e7e441bde7edfa8b19da672c973c23e0c8f661f90b96a5be77f66 not found: ID does not exist" containerID="9cc9ef3fa71e7e441bde7edfa8b19da672c973c23e0c8f661f90b96a5be77f66" Oct 02 11:51:03 crc kubenswrapper[4766]: I1002 11:51:03.038433 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc9ef3fa71e7e441bde7edfa8b19da672c973c23e0c8f661f90b96a5be77f66"} err="failed to get container status \"9cc9ef3fa71e7e441bde7edfa8b19da672c973c23e0c8f661f90b96a5be77f66\": rpc error: code = NotFound desc = could not find container \"9cc9ef3fa71e7e441bde7edfa8b19da672c973c23e0c8f661f90b96a5be77f66\": container with ID starting with 9cc9ef3fa71e7e441bde7edfa8b19da672c973c23e0c8f661f90b96a5be77f66 not found: ID does not exist" Oct 02 11:51:03 crc kubenswrapper[4766]: I1002 11:51:03.038471 4766 scope.go:117] "RemoveContainer" containerID="2660e5b50887531b03c10102130147f61d69ffabf11c4d537b02358db62ddfda" Oct 02 11:51:03 crc kubenswrapper[4766]: E1002 11:51:03.038894 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2660e5b50887531b03c10102130147f61d69ffabf11c4d537b02358db62ddfda\": container with ID starting with 2660e5b50887531b03c10102130147f61d69ffabf11c4d537b02358db62ddfda not found: ID does not exist" containerID="2660e5b50887531b03c10102130147f61d69ffabf11c4d537b02358db62ddfda" Oct 02 11:51:03 crc kubenswrapper[4766]: I1002 11:51:03.038933 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2660e5b50887531b03c10102130147f61d69ffabf11c4d537b02358db62ddfda"} err="failed to get container status \"2660e5b50887531b03c10102130147f61d69ffabf11c4d537b02358db62ddfda\": rpc error: code = NotFound desc = could not find 
container \"2660e5b50887531b03c10102130147f61d69ffabf11c4d537b02358db62ddfda\": container with ID starting with 2660e5b50887531b03c10102130147f61d69ffabf11c4d537b02358db62ddfda not found: ID does not exist" Oct 02 11:51:03 crc kubenswrapper[4766]: I1002 11:51:03.038951 4766 scope.go:117] "RemoveContainer" containerID="22371b7a736fe2b8d3b3c2d6930922305bcfe6eb4cb4471d285a5cf95862c33f" Oct 02 11:51:03 crc kubenswrapper[4766]: E1002 11:51:03.039467 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22371b7a736fe2b8d3b3c2d6930922305bcfe6eb4cb4471d285a5cf95862c33f\": container with ID starting with 22371b7a736fe2b8d3b3c2d6930922305bcfe6eb4cb4471d285a5cf95862c33f not found: ID does not exist" containerID="22371b7a736fe2b8d3b3c2d6930922305bcfe6eb4cb4471d285a5cf95862c33f" Oct 02 11:51:03 crc kubenswrapper[4766]: I1002 11:51:03.039514 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22371b7a736fe2b8d3b3c2d6930922305bcfe6eb4cb4471d285a5cf95862c33f"} err="failed to get container status \"22371b7a736fe2b8d3b3c2d6930922305bcfe6eb4cb4471d285a5cf95862c33f\": rpc error: code = NotFound desc = could not find container \"22371b7a736fe2b8d3b3c2d6930922305bcfe6eb4cb4471d285a5cf95862c33f\": container with ID starting with 22371b7a736fe2b8d3b3c2d6930922305bcfe6eb4cb4471d285a5cf95862c33f not found: ID does not exist" Oct 02 11:51:03 crc kubenswrapper[4766]: I1002 11:51:03.889949 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e6194e4-e0c2-4946-b727-3fc3310f117c" path="/var/lib/kubelet/pods/2e6194e4-e0c2-4946-b727-3fc3310f117c/volumes" Oct 02 11:51:24 crc kubenswrapper[4766]: I1002 11:51:24.432657 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 11:51:24 crc kubenswrapper[4766]: I1002 11:51:24.433557 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 11:51:24 crc kubenswrapper[4766]: I1002 11:51:24.433638 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 11:51:24 crc kubenswrapper[4766]: I1002 11:51:24.434775 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 11:51:24 crc kubenswrapper[4766]: I1002 11:51:24.434885 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" gracePeriod=600 Oct 02 11:51:24 crc kubenswrapper[4766]: E1002 11:51:24.570828 4766 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:51:25 crc kubenswrapper[4766]: I1002 11:51:25.135001 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" exitCode=0 Oct 02 11:51:25 crc kubenswrapper[4766]: I1002 11:51:25.135048 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba"} Oct 02 11:51:25 crc kubenswrapper[4766]: I1002 11:51:25.135112 4766 scope.go:117] "RemoveContainer" containerID="e7876e2092fb7d7edb7c63bf74430daa8e39f342309027fd5fb71f100757fa15" Oct 02 11:51:25 crc kubenswrapper[4766]: I1002 11:51:25.136329 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" Oct 02 11:51:25 crc kubenswrapper[4766]: E1002 11:51:25.136825 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:51:37 crc kubenswrapper[4766]: I1002 11:51:37.882402 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" Oct 02 11:51:37 crc kubenswrapper[4766]: E1002 11:51:37.883230 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:51:51 crc kubenswrapper[4766]: I1002 11:51:51.881810 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" Oct 02 11:51:51 crc kubenswrapper[4766]: E1002 11:51:51.882654 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:52:06 crc kubenswrapper[4766]: I1002 11:52:06.881921 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" Oct 02 11:52:06 crc kubenswrapper[4766]: E1002 11:52:06.882798 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:52:17 crc kubenswrapper[4766]: I1002 11:52:17.881346 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" Oct 02 11:52:17 crc kubenswrapper[4766]: E1002 11:52:17.882071 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:52:29 crc kubenswrapper[4766]: I1002 11:52:29.881157 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" Oct 02 11:52:29 crc kubenswrapper[4766]: E1002 11:52:29.881930 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:52:42 crc kubenswrapper[4766]: I1002 11:52:42.880852 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" Oct 02 11:52:42 crc kubenswrapper[4766]: E1002 11:52:42.881640 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:52:54 crc kubenswrapper[4766]: I1002 11:52:54.881426 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" Oct 02 11:52:54 crc kubenswrapper[4766]: E1002 11:52:54.882257 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:53:09 crc kubenswrapper[4766]: I1002 11:53:09.881400 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" Oct 02 11:53:09 crc kubenswrapper[4766]: E1002 11:53:09.882271 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:53:21 crc kubenswrapper[4766]: I1002 11:53:21.881801 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" Oct 02 11:53:21 crc kubenswrapper[4766]: E1002 11:53:21.882678 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.219267 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bx5rn"] Oct 02 11:53:32 crc kubenswrapper[4766]: E1002 11:53:32.220145 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6194e4-e0c2-4946-b727-3fc3310f117c" containerName="extract-utilities" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.220161 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6194e4-e0c2-4946-b727-3fc3310f117c" containerName="extract-utilities" Oct 02 11:53:32 crc kubenswrapper[4766]: E1002 11:53:32.220189 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6194e4-e0c2-4946-b727-3fc3310f117c" containerName="extract-content" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.220197 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6194e4-e0c2-4946-b727-3fc3310f117c" containerName="extract-content" Oct 02 11:53:32 crc kubenswrapper[4766]: E1002 11:53:32.220222 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e6194e4-e0c2-4946-b727-3fc3310f117c" containerName="registry-server" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.220230 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e6194e4-e0c2-4946-b727-3fc3310f117c" containerName="registry-server" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.220395 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e6194e4-e0c2-4946-b727-3fc3310f117c" containerName="registry-server" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.221842 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.269070 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bx5rn"] Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.323154 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh8dr\" (UniqueName: \"kubernetes.io/projected/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-kube-api-access-lh8dr\") pod \"redhat-marketplace-bx5rn\" (UID: \"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c\") " pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.323241 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-utilities\") pod \"redhat-marketplace-bx5rn\" (UID: \"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c\") " pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.323273 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-catalog-content\") pod \"redhat-marketplace-bx5rn\" (UID: \"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c\") " pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.424269 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-utilities\") pod \"redhat-marketplace-bx5rn\" (UID: \"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c\") " pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.424343 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-catalog-content\") pod \"redhat-marketplace-bx5rn\" (UID: \"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c\") " pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.424445 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh8dr\" (UniqueName: \"kubernetes.io/projected/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-kube-api-access-lh8dr\") pod \"redhat-marketplace-bx5rn\" (UID: \"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c\") " pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.425473 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-utilities\") pod \"redhat-marketplace-bx5rn\" (UID: \"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c\") " pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.425481 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-catalog-content\") pod \"redhat-marketplace-bx5rn\" (UID: \"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c\") " pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.451148 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-lh8dr\" (UniqueName: \"kubernetes.io/projected/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-kube-api-access-lh8dr\") pod \"redhat-marketplace-bx5rn\" (UID: \"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c\") " pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.561863 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:32 crc kubenswrapper[4766]: I1002 11:53:32.978413 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bx5rn"] Oct 02 11:53:33 crc kubenswrapper[4766]: I1002 11:53:33.142426 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx5rn" event={"ID":"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c","Type":"ContainerStarted","Data":"1e700965f540c686b89daea77952ca90bd72f9e63b5c68db31e23f9a5615fb6b"} Oct 02 11:53:34 crc kubenswrapper[4766]: I1002 11:53:34.149437 4766 generic.go:334] "Generic (PLEG): container finished" podID="e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c" containerID="7a15aa2346958ab6d80917d3519b7d792388efc36f5828acc09c2d49c07a248f" exitCode=0 Oct 02 11:53:34 crc kubenswrapper[4766]: I1002 11:53:34.149483 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx5rn" event={"ID":"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c","Type":"ContainerDied","Data":"7a15aa2346958ab6d80917d3519b7d792388efc36f5828acc09c2d49c07a248f"} Oct 02 11:53:35 crc kubenswrapper[4766]: I1002 11:53:35.162225 4766 generic.go:334] "Generic (PLEG): container finished" podID="e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c" containerID="4f9c5198ab3149d59da07deac7586e95590e2db725223cdd8027e139a0047780" exitCode=0 Oct 02 11:53:35 crc kubenswrapper[4766]: I1002 11:53:35.162356 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx5rn" event={"ID":"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c","Type":"ContainerDied","Data":"4f9c5198ab3149d59da07deac7586e95590e2db725223cdd8027e139a0047780"} Oct 02 11:53:35 crc kubenswrapper[4766]: I1002 11:53:35.885480 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" Oct 02 11:53:35 crc kubenswrapper[4766]: E1002 11:53:35.885765 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:53:36 crc kubenswrapper[4766]: I1002 11:53:36.171856 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx5rn" event={"ID":"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c","Type":"ContainerStarted","Data":"f540a5413e492d4d871f11aa95e0f40eb1f0245b36c8842eba93b7a4040fee05"} Oct 02 11:53:36 crc kubenswrapper[4766]: I1002 11:53:36.191650 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bx5rn" podStartSLOduration=2.608807429 podStartE2EDuration="4.191631336s" podCreationTimestamp="2025-10-02 11:53:32 +0000 UTC" firstStartedPulling="2025-10-02 11:53:34.15152883 +0000 UTC m=+3729.094399774" lastFinishedPulling="2025-10-02 
11:53:35.734352737 +0000 UTC m=+3730.677223681" observedRunningTime="2025-10-02 11:53:36.189883849 +0000 UTC m=+3731.132754793" watchObservedRunningTime="2025-10-02 11:53:36.191631336 +0000 UTC m=+3731.134502280" Oct 02 11:53:42 crc kubenswrapper[4766]: I1002 11:53:42.562408 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:42 crc kubenswrapper[4766]: I1002 11:53:42.563754 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:42 crc kubenswrapper[4766]: I1002 11:53:42.607161 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:43 crc kubenswrapper[4766]: I1002 11:53:43.272728 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:43 crc kubenswrapper[4766]: I1002 11:53:43.326983 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bx5rn"] Oct 02 11:53:45 crc kubenswrapper[4766]: I1002 11:53:45.235992 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bx5rn" podUID="e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c" containerName="registry-server" containerID="cri-o://f540a5413e492d4d871f11aa95e0f40eb1f0245b36c8842eba93b7a4040fee05" gracePeriod=2 Oct 02 11:53:45 crc kubenswrapper[4766]: I1002 11:53:45.642307 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:45 crc kubenswrapper[4766]: I1002 11:53:45.708190 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh8dr\" (UniqueName: \"kubernetes.io/projected/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-kube-api-access-lh8dr\") pod \"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c\" (UID: \"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c\") " Oct 02 11:53:45 crc kubenswrapper[4766]: I1002 11:53:45.708396 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-utilities\") pod \"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c\" (UID: \"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c\") " Oct 02 11:53:45 crc kubenswrapper[4766]: I1002 11:53:45.708533 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-catalog-content\") pod \"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c\" (UID: \"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c\") " Oct 02 11:53:45 crc kubenswrapper[4766]: I1002 11:53:45.709588 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-utilities" (OuterVolumeSpecName: "utilities") pod "e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c" (UID: "e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:53:45 crc kubenswrapper[4766]: I1002 11:53:45.716250 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-kube-api-access-lh8dr" (OuterVolumeSpecName: "kube-api-access-lh8dr") pod "e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c" (UID: "e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c"). InnerVolumeSpecName "kube-api-access-lh8dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:53:45 crc kubenswrapper[4766]: I1002 11:53:45.721165 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c" (UID: "e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:53:45 crc kubenswrapper[4766]: I1002 11:53:45.810588 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:53:45 crc kubenswrapper[4766]: I1002 11:53:45.810625 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh8dr\" (UniqueName: \"kubernetes.io/projected/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-kube-api-access-lh8dr\") on node \"crc\" DevicePath \"\"" Oct 02 11:53:45 crc kubenswrapper[4766]: I1002 11:53:45.810636 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:53:46 crc kubenswrapper[4766]: I1002 11:53:46.247159 4766 generic.go:334] "Generic (PLEG): container finished" podID="e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c" containerID="f540a5413e492d4d871f11aa95e0f40eb1f0245b36c8842eba93b7a4040fee05" exitCode=0 Oct 02 11:53:46 crc kubenswrapper[4766]: I1002 11:53:46.247213 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bx5rn" Oct 02 11:53:46 crc kubenswrapper[4766]: I1002 11:53:46.247222 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx5rn" event={"ID":"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c","Type":"ContainerDied","Data":"f540a5413e492d4d871f11aa95e0f40eb1f0245b36c8842eba93b7a4040fee05"} Oct 02 11:53:46 crc kubenswrapper[4766]: I1002 11:53:46.247269 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx5rn" event={"ID":"e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c","Type":"ContainerDied","Data":"1e700965f540c686b89daea77952ca90bd72f9e63b5c68db31e23f9a5615fb6b"} Oct 02 11:53:46 crc kubenswrapper[4766]: I1002 11:53:46.247296 4766 scope.go:117] "RemoveContainer" containerID="f540a5413e492d4d871f11aa95e0f40eb1f0245b36c8842eba93b7a4040fee05" Oct 02 11:53:46 crc kubenswrapper[4766]: I1002 11:53:46.270461 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bx5rn"] Oct 02 11:53:46 crc kubenswrapper[4766]: I1002 11:53:46.274287 4766 scope.go:117] "RemoveContainer" containerID="4f9c5198ab3149d59da07deac7586e95590e2db725223cdd8027e139a0047780" Oct 02 11:53:46 crc kubenswrapper[4766]: I1002 11:53:46.279891 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bx5rn"] Oct 02 11:53:46 crc kubenswrapper[4766]: I1002 11:53:46.293429 4766 scope.go:117] "RemoveContainer" containerID="7a15aa2346958ab6d80917d3519b7d792388efc36f5828acc09c2d49c07a248f" Oct 02 11:53:46 crc kubenswrapper[4766]: I1002 11:53:46.312712 4766 scope.go:117] "RemoveContainer" containerID="f540a5413e492d4d871f11aa95e0f40eb1f0245b36c8842eba93b7a4040fee05" Oct 02 11:53:46 crc kubenswrapper[4766]: E1002 11:53:46.313266 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f540a5413e492d4d871f11aa95e0f40eb1f0245b36c8842eba93b7a4040fee05\": container with ID starting with f540a5413e492d4d871f11aa95e0f40eb1f0245b36c8842eba93b7a4040fee05 not found: ID does not exist" containerID="f540a5413e492d4d871f11aa95e0f40eb1f0245b36c8842eba93b7a4040fee05" Oct 02 11:53:46 crc kubenswrapper[4766]: I1002 11:53:46.313309 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f540a5413e492d4d871f11aa95e0f40eb1f0245b36c8842eba93b7a4040fee05"} err="failed to get container status \"f540a5413e492d4d871f11aa95e0f40eb1f0245b36c8842eba93b7a4040fee05\": rpc error: code = NotFound desc = could not find container \"f540a5413e492d4d871f11aa95e0f40eb1f0245b36c8842eba93b7a4040fee05\": container with ID starting with f540a5413e492d4d871f11aa95e0f40eb1f0245b36c8842eba93b7a4040fee05 not found: ID does not exist" Oct 02 11:53:46 crc kubenswrapper[4766]: I1002 11:53:46.313334 4766 scope.go:117] "RemoveContainer" containerID="4f9c5198ab3149d59da07deac7586e95590e2db725223cdd8027e139a0047780" Oct 02 11:53:46 crc kubenswrapper[4766]: E1002 11:53:46.313953 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f9c5198ab3149d59da07deac7586e95590e2db725223cdd8027e139a0047780\": container with ID starting with 4f9c5198ab3149d59da07deac7586e95590e2db725223cdd8027e139a0047780 not found: ID does not exist" containerID="4f9c5198ab3149d59da07deac7586e95590e2db725223cdd8027e139a0047780" Oct 02 11:53:46 crc kubenswrapper[4766]: I1002 11:53:46.314061 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f9c5198ab3149d59da07deac7586e95590e2db725223cdd8027e139a0047780"} err="failed to get container status \"4f9c5198ab3149d59da07deac7586e95590e2db725223cdd8027e139a0047780\": rpc error: code = NotFound desc = could not find container \"4f9c5198ab3149d59da07deac7586e95590e2db725223cdd8027e139a0047780\": container with ID starting with 4f9c5198ab3149d59da07deac7586e95590e2db725223cdd8027e139a0047780 not found: ID does not exist" Oct 02 11:53:46 crc kubenswrapper[4766]: I1002 11:53:46.314161 4766 scope.go:117] "RemoveContainer" containerID="7a15aa2346958ab6d80917d3519b7d792388efc36f5828acc09c2d49c07a248f" Oct 02 11:53:46 crc kubenswrapper[4766]: E1002 11:53:46.314528 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a15aa2346958ab6d80917d3519b7d792388efc36f5828acc09c2d49c07a248f\": container with ID starting with 7a15aa2346958ab6d80917d3519b7d792388efc36f5828acc09c2d49c07a248f not found: ID does not exist" containerID="7a15aa2346958ab6d80917d3519b7d792388efc36f5828acc09c2d49c07a248f" Oct 02 11:53:46 crc kubenswrapper[4766]: I1002 11:53:46.314563 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a15aa2346958ab6d80917d3519b7d792388efc36f5828acc09c2d49c07a248f"} err="failed to get container status \"7a15aa2346958ab6d80917d3519b7d792388efc36f5828acc09c2d49c07a248f\": rpc error: code = NotFound desc = could not find container \"7a15aa2346958ab6d80917d3519b7d792388efc36f5828acc09c2d49c07a248f\": container with ID starting with 7a15aa2346958ab6d80917d3519b7d792388efc36f5828acc09c2d49c07a248f not found: ID does not exist" Oct 02 11:53:47 crc kubenswrapper[4766]: I1002 11:53:47.881034 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" Oct 02 11:53:47 crc kubenswrapper[4766]: E1002 11:53:47.881722 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:53:47 crc kubenswrapper[4766]: I1002 11:53:47.889009 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c" path="/var/lib/kubelet/pods/e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c/volumes" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.284370 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qjcq5"] Oct 02 11:54:00 crc kubenswrapper[4766]: E1002 11:54:00.285269 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c" containerName="registry-server" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.285286 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c" containerName="registry-server" Oct 02 11:54:00 crc kubenswrapper[4766]: E1002 11:54:00.285313 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c" containerName="extract-utilities" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.285324 4766 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c" containerName="extract-utilities" Oct 02 11:54:00 crc kubenswrapper[4766]: E1002 11:54:00.285352 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c" containerName="extract-content" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.285360 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c" containerName="extract-content" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.285613 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="e86cb15b-cff4-40b7-9d8c-2d4e5ff1253c" containerName="registry-server" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.286794 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.299049 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qjcq5"] Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.414401 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djtjr\" (UniqueName: \"kubernetes.io/projected/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-kube-api-access-djtjr\") pod \"certified-operators-qjcq5\" (UID: \"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a\") " pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.414489 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-catalog-content\") pod \"certified-operators-qjcq5\" (UID: \"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a\") " pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.414600 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-utilities\") pod \"certified-operators-qjcq5\" (UID: \"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a\") " pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.516197 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-catalog-content\") pod \"certified-operators-qjcq5\" (UID: \"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a\") " pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.516257 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-utilities\") pod \"certified-operators-qjcq5\" (UID: \"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a\") " pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.516346 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djtjr\" (UniqueName: \"kubernetes.io/projected/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-kube-api-access-djtjr\") pod \"certified-operators-qjcq5\" (UID: \"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a\") " pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 
11:54:00.517004 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-catalog-content\") pod \"certified-operators-qjcq5\" (UID: \"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a\") " pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.517152 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-utilities\") pod \"certified-operators-qjcq5\" (UID: \"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a\") " pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.535959 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djtjr\" (UniqueName: \"kubernetes.io/projected/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-kube-api-access-djtjr\") pod \"certified-operators-qjcq5\" (UID: \"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a\") " pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.604084 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:00 crc kubenswrapper[4766]: I1002 11:54:00.881133 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" Oct 02 11:54:00 crc kubenswrapper[4766]: E1002 11:54:00.881710 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:54:01 crc kubenswrapper[4766]: I1002 11:54:01.081282 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qjcq5"] Oct 02 11:54:01 crc kubenswrapper[4766]: I1002 11:54:01.373871 4766 generic.go:334] "Generic (PLEG): container finished" podID="6b1f739c-6128-4da8-a7e9-12ab8fa19f3a" containerID="bd8aec079d8023f622d084999511b1e2325d4bf230b558f247caad39222d2170" exitCode=0 Oct 02 11:54:01 crc kubenswrapper[4766]: I1002 11:54:01.373918 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjcq5" event={"ID":"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a","Type":"ContainerDied","Data":"bd8aec079d8023f622d084999511b1e2325d4bf230b558f247caad39222d2170"} Oct 02 11:54:01 crc kubenswrapper[4766]: I1002 11:54:01.373979 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjcq5" event={"ID":"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a","Type":"ContainerStarted","Data":"0a66a08a6b2fab7e6aaeaf5876ac48a1152ee1ce5c96b3491adfd18d1402224b"} Oct 02 11:54:02 crc kubenswrapper[4766]: I1002 11:54:02.400616 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjcq5" event={"ID":"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a","Type":"ContainerStarted","Data":"0af43d58467b46b1e77dd959a1719d64fc12f884990a2205d136825e018d4631"} Oct 02 11:54:03 crc kubenswrapper[4766]: I1002 11:54:03.410337 4766 generic.go:334] "Generic (PLEG): container finished" podID="6b1f739c-6128-4da8-a7e9-12ab8fa19f3a" 
containerID="0af43d58467b46b1e77dd959a1719d64fc12f884990a2205d136825e018d4631" exitCode=0 Oct 02 11:54:03 crc kubenswrapper[4766]: I1002 11:54:03.410570 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjcq5" event={"ID":"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a","Type":"ContainerDied","Data":"0af43d58467b46b1e77dd959a1719d64fc12f884990a2205d136825e018d4631"} Oct 02 11:54:05 crc kubenswrapper[4766]: I1002 11:54:05.430670 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjcq5" event={"ID":"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a","Type":"ContainerStarted","Data":"b7edab5513a42ad7ebef5b61139fd013c0138d671ef36506f6308d112db4fed3"} Oct 02 11:54:05 crc kubenswrapper[4766]: I1002 11:54:05.446565 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qjcq5" podStartSLOduration=2.402069425 podStartE2EDuration="5.446543821s" podCreationTimestamp="2025-10-02 11:54:00 +0000 UTC" firstStartedPulling="2025-10-02 11:54:01.375570286 +0000 UTC m=+3756.318441230" lastFinishedPulling="2025-10-02 11:54:04.420044682 +0000 UTC m=+3759.362915626" observedRunningTime="2025-10-02 11:54:05.445529588 +0000 UTC m=+3760.388400532" watchObservedRunningTime="2025-10-02 11:54:05.446543821 +0000 UTC m=+3760.389414775" Oct 02 11:54:10 crc kubenswrapper[4766]: I1002 11:54:10.604400 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:10 crc kubenswrapper[4766]: I1002 11:54:10.605020 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:10 crc kubenswrapper[4766]: I1002 11:54:10.642448 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:11 crc kubenswrapper[4766]: I1002 11:54:11.507133 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:11 crc kubenswrapper[4766]: I1002 11:54:11.590484 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjcq5"] Oct 02 11:54:13 crc kubenswrapper[4766]: I1002 11:54:13.484786 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qjcq5" podUID="6b1f739c-6128-4da8-a7e9-12ab8fa19f3a" containerName="registry-server" containerID="cri-o://b7edab5513a42ad7ebef5b61139fd013c0138d671ef36506f6308d112db4fed3" gracePeriod=2 Oct 02 11:54:13 crc kubenswrapper[4766]: I1002 11:54:13.897118 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.002046 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-catalog-content\") pod \"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a\" (UID: \"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a\") " Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.002115 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-utilities\") pod \"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a\" (UID: \"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a\") " Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.002186 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djtjr\" (UniqueName: \"kubernetes.io/projected/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-kube-api-access-djtjr\") pod \"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a\" (UID: \"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a\") " Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.002786 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-utilities" (OuterVolumeSpecName: "utilities") pod "6b1f739c-6128-4da8-a7e9-12ab8fa19f3a" (UID: "6b1f739c-6128-4da8-a7e9-12ab8fa19f3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.006716 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-kube-api-access-djtjr" (OuterVolumeSpecName: "kube-api-access-djtjr") pod "6b1f739c-6128-4da8-a7e9-12ab8fa19f3a" (UID: "6b1f739c-6128-4da8-a7e9-12ab8fa19f3a"). InnerVolumeSpecName "kube-api-access-djtjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.047515 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b1f739c-6128-4da8-a7e9-12ab8fa19f3a" (UID: "6b1f739c-6128-4da8-a7e9-12ab8fa19f3a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.103659 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.103703 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.103723 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djtjr\" (UniqueName: \"kubernetes.io/projected/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a-kube-api-access-djtjr\") on node \"crc\" DevicePath \"\"" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.501147 4766 generic.go:334] "Generic (PLEG): container finished" podID="6b1f739c-6128-4da8-a7e9-12ab8fa19f3a" containerID="b7edab5513a42ad7ebef5b61139fd013c0138d671ef36506f6308d112db4fed3" exitCode=0 Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.501215 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjcq5" event={"ID":"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a","Type":"ContainerDied","Data":"b7edab5513a42ad7ebef5b61139fd013c0138d671ef36506f6308d112db4fed3"} Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.501305 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qjcq5" event={"ID":"6b1f739c-6128-4da8-a7e9-12ab8fa19f3a","Type":"ContainerDied","Data":"0a66a08a6b2fab7e6aaeaf5876ac48a1152ee1ce5c96b3491adfd18d1402224b"} Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.501347 4766 scope.go:117] "RemoveContainer" containerID="b7edab5513a42ad7ebef5b61139fd013c0138d671ef36506f6308d112db4fed3" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.501865 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qjcq5" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.533975 4766 scope.go:117] "RemoveContainer" containerID="0af43d58467b46b1e77dd959a1719d64fc12f884990a2205d136825e018d4631" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.559401 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qjcq5"] Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.568450 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qjcq5"] Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.584750 4766 scope.go:117] "RemoveContainer" containerID="bd8aec079d8023f622d084999511b1e2325d4bf230b558f247caad39222d2170" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.605427 4766 scope.go:117] "RemoveContainer" containerID="b7edab5513a42ad7ebef5b61139fd013c0138d671ef36506f6308d112db4fed3" Oct 02 11:54:14 crc kubenswrapper[4766]: E1002 11:54:14.605815 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7edab5513a42ad7ebef5b61139fd013c0138d671ef36506f6308d112db4fed3\": container with ID starting with b7edab5513a42ad7ebef5b61139fd013c0138d671ef36506f6308d112db4fed3 not found: ID does not exist" containerID="b7edab5513a42ad7ebef5b61139fd013c0138d671ef36506f6308d112db4fed3" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.605845 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7edab5513a42ad7ebef5b61139fd013c0138d671ef36506f6308d112db4fed3"} err="failed to get container status \"b7edab5513a42ad7ebef5b61139fd013c0138d671ef36506f6308d112db4fed3\": rpc error: code = NotFound desc = could not find container \"b7edab5513a42ad7ebef5b61139fd013c0138d671ef36506f6308d112db4fed3\": container with ID starting with b7edab5513a42ad7ebef5b61139fd013c0138d671ef36506f6308d112db4fed3 not found: ID does not exist" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.605868 4766 scope.go:117] "RemoveContainer" containerID="0af43d58467b46b1e77dd959a1719d64fc12f884990a2205d136825e018d4631" Oct 02 11:54:14 crc kubenswrapper[4766]: E1002 11:54:14.606257 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0af43d58467b46b1e77dd959a1719d64fc12f884990a2205d136825e018d4631\": container with ID starting with 0af43d58467b46b1e77dd959a1719d64fc12f884990a2205d136825e018d4631 not found: ID does not exist" containerID="0af43d58467b46b1e77dd959a1719d64fc12f884990a2205d136825e018d4631" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.606356 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0af43d58467b46b1e77dd959a1719d64fc12f884990a2205d136825e018d4631"} err="failed to get container status \"0af43d58467b46b1e77dd959a1719d64fc12f884990a2205d136825e018d4631\": rpc error: code = NotFound desc = could not find container \"0af43d58467b46b1e77dd959a1719d64fc12f884990a2205d136825e018d4631\": container with ID starting with 0af43d58467b46b1e77dd959a1719d64fc12f884990a2205d136825e018d4631 not found: ID does not exist" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.606438 4766 scope.go:117] "RemoveContainer" containerID="bd8aec079d8023f622d084999511b1e2325d4bf230b558f247caad39222d2170" Oct 02 11:54:14 crc kubenswrapper[4766]: E1002 11:54:14.606791 4766 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"bd8aec079d8023f622d084999511b1e2325d4bf230b558f247caad39222d2170\": container with ID starting with bd8aec079d8023f622d084999511b1e2325d4bf230b558f247caad39222d2170 not found: ID does not exist" containerID="bd8aec079d8023f622d084999511b1e2325d4bf230b558f247caad39222d2170" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.606819 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8aec079d8023f622d084999511b1e2325d4bf230b558f247caad39222d2170"} err="failed to get container status \"bd8aec079d8023f622d084999511b1e2325d4bf230b558f247caad39222d2170\": rpc error: code = NotFound desc = could not find container \"bd8aec079d8023f622d084999511b1e2325d4bf230b558f247caad39222d2170\": container with ID starting with bd8aec079d8023f622d084999511b1e2325d4bf230b558f247caad39222d2170 not found: ID does not exist" Oct 02 11:54:14 crc kubenswrapper[4766]: I1002 11:54:14.881491 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" Oct 02 11:54:14 crc kubenswrapper[4766]: E1002 11:54:14.881903 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:54:15 crc kubenswrapper[4766]: I1002 11:54:15.891861 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b1f739c-6128-4da8-a7e9-12ab8fa19f3a" path="/var/lib/kubelet/pods/6b1f739c-6128-4da8-a7e9-12ab8fa19f3a/volumes" Oct 02 11:54:29 crc kubenswrapper[4766]: I1002 11:54:29.881595 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba" Oct 02 11:54:29 crc kubenswrapper[4766]: E1002 11:54:29.882656 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.261818 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-svwzq"] Oct 02 11:54:39 crc kubenswrapper[4766]: E1002 11:54:39.262567 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1f739c-6128-4da8-a7e9-12ab8fa19f3a" containerName="extract-utilities" Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.262580 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1f739c-6128-4da8-a7e9-12ab8fa19f3a" containerName="extract-utilities" Oct 02 11:54:39 crc kubenswrapper[4766]: E1002 11:54:39.262589 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1f739c-6128-4da8-a7e9-12ab8fa19f3a" containerName="extract-content" Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.262595 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1f739c-6128-4da8-a7e9-12ab8fa19f3a" containerName="extract-content" Oct 02 11:54:39 crc kubenswrapper[4766]: E1002 
11:54:39.262609 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b1f739c-6128-4da8-a7e9-12ab8fa19f3a" containerName="registry-server"
Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.262615 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b1f739c-6128-4da8-a7e9-12ab8fa19f3a" containerName="registry-server"
Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.262773 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b1f739c-6128-4da8-a7e9-12ab8fa19f3a" containerName="registry-server"
Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.263768 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.270874 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-svwzq"]
Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.358584 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-utilities\") pod \"redhat-operators-svwzq\" (UID: \"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b\") " pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.358656 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-catalog-content\") pod \"redhat-operators-svwzq\" (UID: \"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b\") " pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.358710 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbwr7\" (UniqueName: \"kubernetes.io/projected/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-kube-api-access-pbwr7\") pod \"redhat-operators-svwzq\" (UID: \"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b\") " pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.460335 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-utilities\") pod \"redhat-operators-svwzq\" (UID: \"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b\") " pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.460401 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-catalog-content\") pod \"redhat-operators-svwzq\" (UID: \"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b\") " pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.460433 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbwr7\" (UniqueName: \"kubernetes.io/projected/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-kube-api-access-pbwr7\") pod \"redhat-operators-svwzq\" (UID: \"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b\") " pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.461087 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-utilities\") pod \"redhat-operators-svwzq\" (UID: \"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b\") " pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.461087 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-catalog-content\") pod \"redhat-operators-svwzq\" (UID: \"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b\") " pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.481789 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbwr7\" (UniqueName: \"kubernetes.io/projected/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-kube-api-access-pbwr7\") pod \"redhat-operators-svwzq\" (UID: \"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b\") " pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:39 crc kubenswrapper[4766]: I1002 11:54:39.596916 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:40 crc kubenswrapper[4766]: I1002 11:54:40.056972 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-svwzq"]
Oct 02 11:54:40 crc kubenswrapper[4766]: I1002 11:54:40.728379 4766 generic.go:334] "Generic (PLEG): container finished" podID="7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b" containerID="4e3e5a52eef67611fbd4b54b45a53d9e44dfbfcd4927dff2fa9d6972fe5fee89" exitCode=0
Oct 02 11:54:40 crc kubenswrapper[4766]: I1002 11:54:40.728536 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svwzq" event={"ID":"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b","Type":"ContainerDied","Data":"4e3e5a52eef67611fbd4b54b45a53d9e44dfbfcd4927dff2fa9d6972fe5fee89"}
Oct 02 11:54:40 crc kubenswrapper[4766]: I1002 11:54:40.728856 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svwzq" event={"ID":"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b","Type":"ContainerStarted","Data":"5c516f1dd477e9b807e50ddec918fdf1048202a53bb168d228691af5d274c956"}
Oct 02 11:54:42 crc kubenswrapper[4766]: I1002 11:54:42.744682 4766 generic.go:334] "Generic (PLEG): container finished" podID="7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b" containerID="03d5d099d46d818ecad272b7851bcb6cb1230e71119e96be621683f128bd5773" exitCode=0
Oct 02 11:54:42 crc kubenswrapper[4766]: I1002 11:54:42.744744 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svwzq" event={"ID":"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b","Type":"ContainerDied","Data":"03d5d099d46d818ecad272b7851bcb6cb1230e71119e96be621683f128bd5773"}
Oct 02 11:54:43 crc kubenswrapper[4766]: I1002 11:54:43.880994 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba"
Oct 02 11:54:43 crc kubenswrapper[4766]: E1002 11:54:43.881877 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 11:54:44 crc kubenswrapper[4766]: I1002 11:54:44.765624 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svwzq" event={"ID":"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b","Type":"ContainerStarted","Data":"e7bd54bff1b2b2d904e0f5721e36c5df1140e801dd3659bb15b5a34aa96c1427"}
Oct 02 11:54:44 crc kubenswrapper[4766]: I1002 11:54:44.795723 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-svwzq" podStartSLOduration=3.137700347 podStartE2EDuration="5.795704441s" podCreationTimestamp="2025-10-02 11:54:39 +0000 UTC" firstStartedPulling="2025-10-02 11:54:40.73080467 +0000 UTC m=+3795.673675634" lastFinishedPulling="2025-10-02 11:54:43.388808774 +0000 UTC m=+3798.331679728" observedRunningTime="2025-10-02 11:54:44.79133603 +0000 UTC m=+3799.734207004" watchObservedRunningTime="2025-10-02 11:54:44.795704441 +0000 UTC m=+3799.738575385"
Oct 02 11:54:49 crc kubenswrapper[4766]: I1002 11:54:49.597547 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:49 crc kubenswrapper[4766]: I1002 11:54:49.597835 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:49 crc kubenswrapper[4766]: I1002 11:54:49.643639 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:49 crc kubenswrapper[4766]: I1002 11:54:49.856942 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:49 crc kubenswrapper[4766]: I1002 11:54:49.920864 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-svwzq"]
Oct 02 11:54:51 crc kubenswrapper[4766]: I1002 11:54:51.830710 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-svwzq" podUID="7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b" containerName="registry-server" containerID="cri-o://e7bd54bff1b2b2d904e0f5721e36c5df1140e801dd3659bb15b5a34aa96c1427" gracePeriod=2
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.286711 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.458319 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-utilities\") pod \"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b\" (UID: \"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b\") "
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.458476 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbwr7\" (UniqueName: \"kubernetes.io/projected/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-kube-api-access-pbwr7\") pod \"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b\" (UID: \"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b\") "
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.458528 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-catalog-content\") pod \"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b\" (UID: \"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b\") "
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.461860 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-utilities" (OuterVolumeSpecName: "utilities") pod "7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b" (UID: "7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.464411 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-kube-api-access-pbwr7" (OuterVolumeSpecName: "kube-api-access-pbwr7") pod "7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b" (UID: "7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b"). InnerVolumeSpecName "kube-api-access-pbwr7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.560300 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.560335 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbwr7\" (UniqueName: \"kubernetes.io/projected/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-kube-api-access-pbwr7\") on node \"crc\" DevicePath \"\""
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.560984 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b" (UID: "7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.661543 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.838564 4766 generic.go:334] "Generic (PLEG): container finished" podID="7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b" containerID="e7bd54bff1b2b2d904e0f5721e36c5df1140e801dd3659bb15b5a34aa96c1427" exitCode=0
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.838693 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svwzq" event={"ID":"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b","Type":"ContainerDied","Data":"e7bd54bff1b2b2d904e0f5721e36c5df1140e801dd3659bb15b5a34aa96c1427"}
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.838768 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-svwzq"
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.839661 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svwzq" event={"ID":"7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b","Type":"ContainerDied","Data":"5c516f1dd477e9b807e50ddec918fdf1048202a53bb168d228691af5d274c956"}
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.839694 4766 scope.go:117] "RemoveContainer" containerID="e7bd54bff1b2b2d904e0f5721e36c5df1140e801dd3659bb15b5a34aa96c1427"
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.855316 4766 scope.go:117] "RemoveContainer" containerID="03d5d099d46d818ecad272b7851bcb6cb1230e71119e96be621683f128bd5773"
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.867532 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-svwzq"]
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.880106 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-svwzq"]
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.903751 4766 scope.go:117] "RemoveContainer" containerID="4e3e5a52eef67611fbd4b54b45a53d9e44dfbfcd4927dff2fa9d6972fe5fee89"
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.921137 4766 scope.go:117] "RemoveContainer" containerID="e7bd54bff1b2b2d904e0f5721e36c5df1140e801dd3659bb15b5a34aa96c1427"
Oct 02 11:54:52 crc kubenswrapper[4766]: E1002 11:54:52.921590 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7bd54bff1b2b2d904e0f5721e36c5df1140e801dd3659bb15b5a34aa96c1427\": container with ID starting with e7bd54bff1b2b2d904e0f5721e36c5df1140e801dd3659bb15b5a34aa96c1427 not found: ID does not exist" containerID="e7bd54bff1b2b2d904e0f5721e36c5df1140e801dd3659bb15b5a34aa96c1427"
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.921630 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7bd54bff1b2b2d904e0f5721e36c5df1140e801dd3659bb15b5a34aa96c1427"} err="failed to get container status \"e7bd54bff1b2b2d904e0f5721e36c5df1140e801dd3659bb15b5a34aa96c1427\": rpc error: code = NotFound desc = could not find container \"e7bd54bff1b2b2d904e0f5721e36c5df1140e801dd3659bb15b5a34aa96c1427\": container with ID starting with e7bd54bff1b2b2d904e0f5721e36c5df1140e801dd3659bb15b5a34aa96c1427 not found: ID does not exist"
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.921658 4766 scope.go:117] "RemoveContainer" containerID="03d5d099d46d818ecad272b7851bcb6cb1230e71119e96be621683f128bd5773"
Oct 02 11:54:52 crc kubenswrapper[4766]: E1002 11:54:52.921999 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d5d099d46d818ecad272b7851bcb6cb1230e71119e96be621683f128bd5773\": container with ID starting with 03d5d099d46d818ecad272b7851bcb6cb1230e71119e96be621683f128bd5773 not found: ID does not exist" containerID="03d5d099d46d818ecad272b7851bcb6cb1230e71119e96be621683f128bd5773"
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.922054 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d5d099d46d818ecad272b7851bcb6cb1230e71119e96be621683f128bd5773"} err="failed to get container status \"03d5d099d46d818ecad272b7851bcb6cb1230e71119e96be621683f128bd5773\": rpc error: code = NotFound desc = could not find container \"03d5d099d46d818ecad272b7851bcb6cb1230e71119e96be621683f128bd5773\": container with ID starting with 03d5d099d46d818ecad272b7851bcb6cb1230e71119e96be621683f128bd5773 not found: ID does not exist"
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.922089 4766 scope.go:117] "RemoveContainer" containerID="4e3e5a52eef67611fbd4b54b45a53d9e44dfbfcd4927dff2fa9d6972fe5fee89"
Oct 02 11:54:52 crc kubenswrapper[4766]: E1002 11:54:52.922396 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3e5a52eef67611fbd4b54b45a53d9e44dfbfcd4927dff2fa9d6972fe5fee89\": container with ID starting with 4e3e5a52eef67611fbd4b54b45a53d9e44dfbfcd4927dff2fa9d6972fe5fee89 not found: ID does not exist" containerID="4e3e5a52eef67611fbd4b54b45a53d9e44dfbfcd4927dff2fa9d6972fe5fee89"
Oct 02 11:54:52 crc kubenswrapper[4766]: I1002 11:54:52.922428 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3e5a52eef67611fbd4b54b45a53d9e44dfbfcd4927dff2fa9d6972fe5fee89"} err="failed to get container status \"4e3e5a52eef67611fbd4b54b45a53d9e44dfbfcd4927dff2fa9d6972fe5fee89\": rpc error: code = NotFound desc = could not find container \"4e3e5a52eef67611fbd4b54b45a53d9e44dfbfcd4927dff2fa9d6972fe5fee89\": container with ID starting with 4e3e5a52eef67611fbd4b54b45a53d9e44dfbfcd4927dff2fa9d6972fe5fee89 not found: ID does not exist"
Oct 02 11:54:53 crc kubenswrapper[4766]: I1002 11:54:53.891471 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b" path="/var/lib/kubelet/pods/7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b/volumes"
Oct 02 11:54:56 crc kubenswrapper[4766]: I1002 11:54:56.881670 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba"
Oct 02 11:54:56 crc kubenswrapper[4766]: E1002 11:54:56.881863 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 11:55:10 crc kubenswrapper[4766]: I1002 11:55:10.881528 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba"
Oct 02 11:55:10 crc kubenswrapper[4766]: E1002 11:55:10.882163 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 11:55:21 crc kubenswrapper[4766]: I1002 11:55:21.882267 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba"
Oct 02 11:55:21 crc kubenswrapper[4766]: E1002 11:55:21.883270 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 11:55:32 crc kubenswrapper[4766]: I1002 11:55:32.881894 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba"
Oct 02 11:55:32 crc kubenswrapper[4766]: E1002 11:55:32.883658 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 11:55:46 crc kubenswrapper[4766]: I1002 11:55:46.881568 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba"
Oct 02 11:55:46 crc kubenswrapper[4766]: E1002 11:55:46.882465 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 11:56:01 crc kubenswrapper[4766]: I1002 11:56:01.881398 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba"
Oct 02 11:56:01 crc kubenswrapper[4766]: E1002 11:56:01.882358 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 11:56:15 crc kubenswrapper[4766]: I1002 11:56:15.892043 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba"
Oct 02 11:56:15 crc kubenswrapper[4766]: E1002 11:56:15.893557 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 11:56:29 crc kubenswrapper[4766]: I1002 11:56:29.882158 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba"
Oct 02 11:56:30 crc kubenswrapper[4766]: I1002 11:56:30.581144 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"1985a4a59e3bf633e1e013e6bc422dd2a6b792a2cf12a1a6a5b7903df0efb469"}
Oct 02 11:58:54 crc kubenswrapper[4766]: I1002 11:58:54.432430 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:58:54 crc kubenswrapper[4766]: I1002 11:58:54.433246 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 11:59:24 crc kubenswrapper[4766]: I1002 11:59:24.432680 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:59:24 crc kubenswrapper[4766]: I1002 11:59:24.433272 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 11:59:54 crc kubenswrapper[4766]: I1002 11:59:54.432794 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 11:59:54 crc kubenswrapper[4766]: I1002 11:59:54.434178 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 11:59:54 crc kubenswrapper[4766]: I1002 11:59:54.434303 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx"
Oct 02 11:59:54 crc kubenswrapper[4766]: I1002 11:59:54.435946 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1985a4a59e3bf633e1e013e6bc422dd2a6b792a2cf12a1a6a5b7903df0efb469"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 11:59:54 crc kubenswrapper[4766]: I1002 11:59:54.436069 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://1985a4a59e3bf633e1e013e6bc422dd2a6b792a2cf12a1a6a5b7903df0efb469" gracePeriod=600
Oct 02 11:59:55 crc kubenswrapper[4766]: I1002 11:59:55.205605 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="1985a4a59e3bf633e1e013e6bc422dd2a6b792a2cf12a1a6a5b7903df0efb469" exitCode=0
Oct 02 11:59:55 crc kubenswrapper[4766]: I1002 11:59:55.205716 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"1985a4a59e3bf633e1e013e6bc422dd2a6b792a2cf12a1a6a5b7903df0efb469"}
Oct 02 11:59:55 crc kubenswrapper[4766]: I1002 11:59:55.206639 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f"}
Oct 02 11:59:55 crc kubenswrapper[4766]: I1002 11:59:55.206719 4766 scope.go:117] "RemoveContainer" containerID="0e253c7382fabafce4bca073cf17d119e679ca59e7f249133d12c99efa9935ba"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.153163 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"]
Oct 02 12:00:00 crc kubenswrapper[4766]: E1002 12:00:00.154004 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b" containerName="extract-content"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.154018 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b" containerName="extract-content"
Oct 02 12:00:00 crc kubenswrapper[4766]: E1002 12:00:00.154036 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b" containerName="registry-server"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.154043 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b" containerName="registry-server"
Oct 02 12:00:00 crc kubenswrapper[4766]: E1002 12:00:00.154057 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b" containerName="extract-utilities"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.154062 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b" containerName="extract-utilities"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.154208 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7431e3de-b2ed-4b43-ab45-c7dad2ea1d1b" containerName="registry-server"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.154659 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.156782 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.157138 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.167319 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"]
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.268782 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50cd8c1e-373f-4ca6-b413-678459c490f1-config-volume\") pod \"collect-profiles-29323440-8bn8n\" (UID: \"50cd8c1e-373f-4ca6-b413-678459c490f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.269171 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50cd8c1e-373f-4ca6-b413-678459c490f1-secret-volume\") pod \"collect-profiles-29323440-8bn8n\" (UID: \"50cd8c1e-373f-4ca6-b413-678459c490f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.269251 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d54t5\" (UniqueName: \"kubernetes.io/projected/50cd8c1e-373f-4ca6-b413-678459c490f1-kube-api-access-d54t5\") pod \"collect-profiles-29323440-8bn8n\" (UID: \"50cd8c1e-373f-4ca6-b413-678459c490f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.370248 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50cd8c1e-373f-4ca6-b413-678459c490f1-secret-volume\") pod \"collect-profiles-29323440-8bn8n\" (UID: \"50cd8c1e-373f-4ca6-b413-678459c490f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.370370 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d54t5\" (UniqueName: \"kubernetes.io/projected/50cd8c1e-373f-4ca6-b413-678459c490f1-kube-api-access-d54t5\") pod \"collect-profiles-29323440-8bn8n\" (UID: \"50cd8c1e-373f-4ca6-b413-678459c490f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.370404 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50cd8c1e-373f-4ca6-b413-678459c490f1-config-volume\") pod \"collect-profiles-29323440-8bn8n\" (UID: \"50cd8c1e-373f-4ca6-b413-678459c490f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.371553 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50cd8c1e-373f-4ca6-b413-678459c490f1-config-volume\") pod \"collect-profiles-29323440-8bn8n\" (UID: \"50cd8c1e-373f-4ca6-b413-678459c490f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.378494 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50cd8c1e-373f-4ca6-b413-678459c490f1-secret-volume\") pod \"collect-profiles-29323440-8bn8n\" (UID: \"50cd8c1e-373f-4ca6-b413-678459c490f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.387280 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d54t5\" (UniqueName: \"kubernetes.io/projected/50cd8c1e-373f-4ca6-b413-678459c490f1-kube-api-access-d54t5\") pod \"collect-profiles-29323440-8bn8n\" (UID: \"50cd8c1e-373f-4ca6-b413-678459c490f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.477278 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"
Oct 02 12:00:00 crc kubenswrapper[4766]: I1002 12:00:00.914690 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"]
Oct 02 12:00:00 crc kubenswrapper[4766]: W1002 12:00:00.922731 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50cd8c1e_373f_4ca6_b413_678459c490f1.slice/crio-9bd9f44b5d26629a92d15ee8f7bf0aa63ce07ae8dcfebc5bdd44d28b00fa6a38 WatchSource:0}: Error finding container 9bd9f44b5d26629a92d15ee8f7bf0aa63ce07ae8dcfebc5bdd44d28b00fa6a38: Status 404 returned error can't find the container with id 9bd9f44b5d26629a92d15ee8f7bf0aa63ce07ae8dcfebc5bdd44d28b00fa6a38
Oct 02 12:00:01 crc kubenswrapper[4766]: I1002 12:00:01.258715 4766 generic.go:334] "Generic (PLEG): container finished" podID="50cd8c1e-373f-4ca6-b413-678459c490f1" containerID="5e4b4196a9a0f10f8ef7aea7e40b04b02466fa4eafd1f40b8f1c86e33154c90f" exitCode=0
Oct 02 12:00:01 crc kubenswrapper[4766]: I1002 12:00:01.258771 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n" event={"ID":"50cd8c1e-373f-4ca6-b413-678459c490f1","Type":"ContainerDied","Data":"5e4b4196a9a0f10f8ef7aea7e40b04b02466fa4eafd1f40b8f1c86e33154c90f"}
Oct 02 12:00:01 crc kubenswrapper[4766]: I1002 12:00:01.258801 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n" event={"ID":"50cd8c1e-373f-4ca6-b413-678459c490f1","Type":"ContainerStarted","Data":"9bd9f44b5d26629a92d15ee8f7bf0aa63ce07ae8dcfebc5bdd44d28b00fa6a38"}
Oct 02 12:00:02 crc kubenswrapper[4766]: I1002 12:00:02.544782 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"
Oct 02 12:00:02 crc kubenswrapper[4766]: I1002 12:00:02.616514 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d54t5\" (UniqueName: \"kubernetes.io/projected/50cd8c1e-373f-4ca6-b413-678459c490f1-kube-api-access-d54t5\") pod \"50cd8c1e-373f-4ca6-b413-678459c490f1\" (UID: \"50cd8c1e-373f-4ca6-b413-678459c490f1\") "
Oct 02 12:00:02 crc kubenswrapper[4766]: I1002 12:00:02.616574 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50cd8c1e-373f-4ca6-b413-678459c490f1-config-volume\") pod \"50cd8c1e-373f-4ca6-b413-678459c490f1\" (UID: \"50cd8c1e-373f-4ca6-b413-678459c490f1\") "
Oct 02 12:00:02 crc kubenswrapper[4766]: I1002 12:00:02.616604 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50cd8c1e-373f-4ca6-b413-678459c490f1-secret-volume\") pod \"50cd8c1e-373f-4ca6-b413-678459c490f1\" (UID: \"50cd8c1e-373f-4ca6-b413-678459c490f1\") "
Oct 02 12:00:02 crc kubenswrapper[4766]: I1002 12:00:02.617872 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50cd8c1e-373f-4ca6-b413-678459c490f1-config-volume" (OuterVolumeSpecName: "config-volume") pod "50cd8c1e-373f-4ca6-b413-678459c490f1" (UID: "50cd8c1e-373f-4ca6-b413-678459c490f1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:00:02 crc kubenswrapper[4766]: I1002 12:00:02.622945 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50cd8c1e-373f-4ca6-b413-678459c490f1-kube-api-access-d54t5" (OuterVolumeSpecName: "kube-api-access-d54t5") pod "50cd8c1e-373f-4ca6-b413-678459c490f1" (UID: "50cd8c1e-373f-4ca6-b413-678459c490f1"). InnerVolumeSpecName "kube-api-access-d54t5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:00:02 crc kubenswrapper[4766]: I1002 12:00:02.623176 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50cd8c1e-373f-4ca6-b413-678459c490f1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "50cd8c1e-373f-4ca6-b413-678459c490f1" (UID: "50cd8c1e-373f-4ca6-b413-678459c490f1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:00:02 crc kubenswrapper[4766]: I1002 12:00:02.717817 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d54t5\" (UniqueName: \"kubernetes.io/projected/50cd8c1e-373f-4ca6-b413-678459c490f1-kube-api-access-d54t5\") on node \"crc\" DevicePath \"\""
Oct 02 12:00:02 crc kubenswrapper[4766]: I1002 12:00:02.717854 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50cd8c1e-373f-4ca6-b413-678459c490f1-config-volume\") on node \"crc\" DevicePath \"\""
Oct 02 12:00:02 crc kubenswrapper[4766]: I1002 12:00:02.717864 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50cd8c1e-373f-4ca6-b413-678459c490f1-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 02 12:00:03 crc kubenswrapper[4766]: I1002 12:00:03.271604 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n" event={"ID":"50cd8c1e-373f-4ca6-b413-678459c490f1","Type":"ContainerDied","Data":"9bd9f44b5d26629a92d15ee8f7bf0aa63ce07ae8dcfebc5bdd44d28b00fa6a38"}
Oct 02 12:00:03 crc kubenswrapper[4766]: I1002 12:00:03.271640 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bd9f44b5d26629a92d15ee8f7bf0aa63ce07ae8dcfebc5bdd44d28b00fa6a38"
Oct 02 12:00:03 crc kubenswrapper[4766]: I1002 12:00:03.271691 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"
Oct 02 12:00:03 crc kubenswrapper[4766]: I1002 12:00:03.626175 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr"]
Oct 02 12:00:03 crc kubenswrapper[4766]: I1002 12:00:03.630807 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323395-b4hvr"]
Oct 02 12:00:03 crc kubenswrapper[4766]: I1002 12:00:03.898243 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27b54e2f-1607-453e-8a7a-cd9d111e7d24" path="/var/lib/kubelet/pods/27b54e2f-1607-453e-8a7a-cd9d111e7d24/volumes"
Oct 02 12:00:34 crc kubenswrapper[4766]: I1002 12:00:34.358367 4766 scope.go:117] "RemoveContainer" containerID="90183a3a44b60d7ca321c396aec23d59060319163bf7c3bc6b2f96bc5fbbbc4f"
Oct 02 12:00:54 crc kubenswrapper[4766]: I1002 12:00:54.797621 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mdvrr"]
Oct 02 12:00:54 crc kubenswrapper[4766]: E1002 12:00:54.798681 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50cd8c1e-373f-4ca6-b413-678459c490f1" containerName="collect-profiles"
Oct 02 12:00:54 crc kubenswrapper[4766]: I1002 12:00:54.798695 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="50cd8c1e-373f-4ca6-b413-678459c490f1" containerName="collect-profiles"
Oct 02 12:00:54 crc kubenswrapper[4766]: I1002 12:00:54.798849 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="50cd8c1e-373f-4ca6-b413-678459c490f1" containerName="collect-profiles"
Oct 02 12:00:54 crc kubenswrapper[4766]: I1002 12:00:54.799922 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:00:54 crc kubenswrapper[4766]: I1002 12:00:54.822371 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mdvrr"]
Oct 02 12:00:54 crc kubenswrapper[4766]: I1002 12:00:54.979247 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a6dffc7-de32-4888-aa3e-8753d03e8016-utilities\") pod \"community-operators-mdvrr\" (UID: \"4a6dffc7-de32-4888-aa3e-8753d03e8016\") " pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:00:54 crc kubenswrapper[4766]: I1002 12:00:54.979580 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pfng\" (UniqueName: \"kubernetes.io/projected/4a6dffc7-de32-4888-aa3e-8753d03e8016-kube-api-access-9pfng\") pod \"community-operators-mdvrr\" (UID: \"4a6dffc7-de32-4888-aa3e-8753d03e8016\") " pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:00:54 crc kubenswrapper[4766]: I1002 12:00:54.979753 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a6dffc7-de32-4888-aa3e-8753d03e8016-catalog-content\") pod \"community-operators-mdvrr\" (UID: \"4a6dffc7-de32-4888-aa3e-8753d03e8016\") " pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:00:55 crc kubenswrapper[4766]: I1002 12:00:55.081985 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pfng\" (UniqueName: \"kubernetes.io/projected/4a6dffc7-de32-4888-aa3e-8753d03e8016-kube-api-access-9pfng\") pod \"community-operators-mdvrr\" (UID: \"4a6dffc7-de32-4888-aa3e-8753d03e8016\") " pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:00:55 crc kubenswrapper[4766]: I1002 12:00:55.082091 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a6dffc7-de32-4888-aa3e-8753d03e8016-catalog-content\") pod \"community-operators-mdvrr\" (UID: \"4a6dffc7-de32-4888-aa3e-8753d03e8016\") " pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:00:55 crc kubenswrapper[4766]: I1002 12:00:55.082129 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a6dffc7-de32-4888-aa3e-8753d03e8016-utilities\") pod \"community-operators-mdvrr\" (UID: \"4a6dffc7-de32-4888-aa3e-8753d03e8016\") " pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:00:55 crc kubenswrapper[4766]: I1002 12:00:55.082846 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a6dffc7-de32-4888-aa3e-8753d03e8016-catalog-content\") pod \"community-operators-mdvrr\" (UID: \"4a6dffc7-de32-4888-aa3e-8753d03e8016\") " pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:00:55 crc kubenswrapper[4766]: I1002 12:00:55.082934 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a6dffc7-de32-4888-aa3e-8753d03e8016-utilities\") pod \"community-operators-mdvrr\" (UID: \"4a6dffc7-de32-4888-aa3e-8753d03e8016\") " pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:00:55 crc kubenswrapper[4766]: I1002 12:00:55.108103 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pfng\" (UniqueName: \"kubernetes.io/projected/4a6dffc7-de32-4888-aa3e-8753d03e8016-kube-api-access-9pfng\") pod \"community-operators-mdvrr\" (UID: \"4a6dffc7-de32-4888-aa3e-8753d03e8016\") " pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:00:55 crc kubenswrapper[4766]: I1002 12:00:55.127409 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:00:55 crc kubenswrapper[4766]: I1002 12:00:55.693470 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mdvrr"]
Oct 02 12:00:55 crc kubenswrapper[4766]: I1002 12:00:55.747398 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mdvrr" event={"ID":"4a6dffc7-de32-4888-aa3e-8753d03e8016","Type":"ContainerStarted","Data":"2a0c5829c0099a6b81ac8feafe13305ca3a0a6ca00f67266b928745c5ac90529"}
Oct 02 12:00:56 crc kubenswrapper[4766]: I1002 12:00:56.761151 4766 generic.go:334] "Generic (PLEG): container finished" podID="4a6dffc7-de32-4888-aa3e-8753d03e8016" containerID="3ddddc1883f5759d62fc15003f5c6922fe892dfad20b14c5b10a2a10a0eceb90" exitCode=0
Oct 02 12:00:56 crc kubenswrapper[4766]: I1002 12:00:56.761231 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mdvrr" event={"ID":"4a6dffc7-de32-4888-aa3e-8753d03e8016","Type":"ContainerDied","Data":"3ddddc1883f5759d62fc15003f5c6922fe892dfad20b14c5b10a2a10a0eceb90"}
Oct 02 12:00:56 crc kubenswrapper[4766]: I1002 12:00:56.763450 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 12:00:58 crc kubenswrapper[4766]: I1002 12:00:58.780327 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mdvrr" event={"ID":"4a6dffc7-de32-4888-aa3e-8753d03e8016","Type":"ContainerStarted","Data":"f9666fca3127e550d89eb4546d908bbff9a4698b2806d7e5eebba36d36366964"}
Oct 02 12:00:59 crc kubenswrapper[4766]: I1002 12:00:59.801463 4766 generic.go:334] "Generic (PLEG): container finished" podID="4a6dffc7-de32-4888-aa3e-8753d03e8016" containerID="f9666fca3127e550d89eb4546d908bbff9a4698b2806d7e5eebba36d36366964" exitCode=0
Oct 02 12:00:59 crc kubenswrapper[4766]: I1002 12:00:59.801582 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mdvrr" event={"ID":"4a6dffc7-de32-4888-aa3e-8753d03e8016","Type":"ContainerDied","Data":"f9666fca3127e550d89eb4546d908bbff9a4698b2806d7e5eebba36d36366964"}
Oct 02 12:01:00 crc kubenswrapper[4766]: I1002 12:01:00.822375 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mdvrr" event={"ID":"4a6dffc7-de32-4888-aa3e-8753d03e8016","Type":"ContainerStarted","Data":"d5c6ac42231c3fa60f69243fddf1769b1ae694ffeb173f37f50ecd5f291edc91"}
Oct 02 12:01:00 crc kubenswrapper[4766]: I1002 12:01:00.845880 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mdvrr" podStartSLOduration=3.1867820829999998 podStartE2EDuration="6.845859938s" podCreationTimestamp="2025-10-02 12:00:54 +0000 UTC" firstStartedPulling="2025-10-02 12:00:56.763232977 +0000 UTC m=+4171.706103921" lastFinishedPulling="2025-10-02 12:01:00.422310822 +0000 UTC m=+4175.365181776" observedRunningTime="2025-10-02 12:01:00.840919589 +0000 UTC m=+4175.783790543" watchObservedRunningTime="2025-10-02 12:01:00.845859938 +0000 UTC m=+4175.788730882"
Oct 02 12:01:05 crc kubenswrapper[4766]: I1002 12:01:05.128408 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:01:05 crc kubenswrapper[4766]: I1002 12:01:05.128956 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:01:05 crc kubenswrapper[4766]: I1002 12:01:05.177215 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:01:05 crc kubenswrapper[4766]: I1002 12:01:05.897282 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:01:05 crc kubenswrapper[4766]: I1002 12:01:05.940920 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mdvrr"]
Oct 02 12:01:07 crc kubenswrapper[4766]: I1002 12:01:07.880742 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mdvrr" podUID="4a6dffc7-de32-4888-aa3e-8753d03e8016" containerName="registry-server" containerID="cri-o://d5c6ac42231c3fa60f69243fddf1769b1ae694ffeb173f37f50ecd5f291edc91" gracePeriod=2
Oct 02 12:01:08 crc kubenswrapper[4766]: I1002 12:01:08.892341 4766 generic.go:334] "Generic (PLEG): container finished" podID="4a6dffc7-de32-4888-aa3e-8753d03e8016" containerID="d5c6ac42231c3fa60f69243fddf1769b1ae694ffeb173f37f50ecd5f291edc91" exitCode=0
Oct 02 12:01:08 crc kubenswrapper[4766]: I1002 12:01:08.892425 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mdvrr" event={"ID":"4a6dffc7-de32-4888-aa3e-8753d03e8016","Type":"ContainerDied","Data":"d5c6ac42231c3fa60f69243fddf1769b1ae694ffeb173f37f50ecd5f291edc91"}
Oct 02 12:01:08 crc kubenswrapper[4766]: I1002 12:01:08.938932 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:01:09 crc kubenswrapper[4766]: I1002 12:01:09.116658 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a6dffc7-de32-4888-aa3e-8753d03e8016-catalog-content\") pod \"4a6dffc7-de32-4888-aa3e-8753d03e8016\" (UID: \"4a6dffc7-de32-4888-aa3e-8753d03e8016\") "
Oct 02 12:01:09 crc kubenswrapper[4766]: I1002 12:01:09.116752 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a6dffc7-de32-4888-aa3e-8753d03e8016-utilities\") pod \"4a6dffc7-de32-4888-aa3e-8753d03e8016\" (UID: \"4a6dffc7-de32-4888-aa3e-8753d03e8016\") "
Oct 02 12:01:09 crc kubenswrapper[4766]: I1002 12:01:09.116786 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pfng\" (UniqueName: \"kubernetes.io/projected/4a6dffc7-de32-4888-aa3e-8753d03e8016-kube-api-access-9pfng\") pod \"4a6dffc7-de32-4888-aa3e-8753d03e8016\" (UID: \"4a6dffc7-de32-4888-aa3e-8753d03e8016\") "
Oct 02 12:01:09 crc kubenswrapper[4766]: I1002 12:01:09.117697 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a6dffc7-de32-4888-aa3e-8753d03e8016-utilities" (OuterVolumeSpecName: "utilities") pod "4a6dffc7-de32-4888-aa3e-8753d03e8016" (UID: "4a6dffc7-de32-4888-aa3e-8753d03e8016"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:01:09 crc kubenswrapper[4766]: I1002 12:01:09.123214 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a6dffc7-de32-4888-aa3e-8753d03e8016-kube-api-access-9pfng" (OuterVolumeSpecName: "kube-api-access-9pfng") pod "4a6dffc7-de32-4888-aa3e-8753d03e8016" (UID: "4a6dffc7-de32-4888-aa3e-8753d03e8016"). InnerVolumeSpecName "kube-api-access-9pfng". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:01:09 crc kubenswrapper[4766]: I1002 12:01:09.168688 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a6dffc7-de32-4888-aa3e-8753d03e8016-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a6dffc7-de32-4888-aa3e-8753d03e8016" (UID: "4a6dffc7-de32-4888-aa3e-8753d03e8016"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:01:09 crc kubenswrapper[4766]: I1002 12:01:09.218558 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a6dffc7-de32-4888-aa3e-8753d03e8016-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 12:01:09 crc kubenswrapper[4766]: I1002 12:01:09.218666 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pfng\" (UniqueName: \"kubernetes.io/projected/4a6dffc7-de32-4888-aa3e-8753d03e8016-kube-api-access-9pfng\") on node \"crc\" DevicePath \"\""
Oct 02 12:01:09 crc kubenswrapper[4766]: I1002 12:01:09.218689 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a6dffc7-de32-4888-aa3e-8753d03e8016-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 12:01:09 crc kubenswrapper[4766]: I1002 12:01:09.905048 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mdvrr" event={"ID":"4a6dffc7-de32-4888-aa3e-8753d03e8016","Type":"ContainerDied","Data":"2a0c5829c0099a6b81ac8feafe13305ca3a0a6ca00f67266b928745c5ac90529"}
Oct 02 12:01:09 crc kubenswrapper[4766]: I1002 12:01:09.905137 4766 scope.go:117] "RemoveContainer" containerID="d5c6ac42231c3fa60f69243fddf1769b1ae694ffeb173f37f50ecd5f291edc91"
Oct 02 12:01:09 crc kubenswrapper[4766]: I1002 12:01:09.905144 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mdvrr"
Oct 02 12:01:09 crc kubenswrapper[4766]: I1002 12:01:09.942285 4766 scope.go:117] "RemoveContainer" containerID="f9666fca3127e550d89eb4546d908bbff9a4698b2806d7e5eebba36d36366964"
Oct 02 12:01:09 crc kubenswrapper[4766]: I1002 12:01:09.964611 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mdvrr"]
Oct 02 12:01:09 crc kubenswrapper[4766]: I1002 12:01:09.970529 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mdvrr"]
Oct 02 12:01:10 crc kubenswrapper[4766]: I1002 12:01:10.008674 4766 scope.go:117] "RemoveContainer" containerID="3ddddc1883f5759d62fc15003f5c6922fe892dfad20b14c5b10a2a10a0eceb90"
Oct 02 12:01:11 crc kubenswrapper[4766]: I1002 12:01:11.892796 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a6dffc7-de32-4888-aa3e-8753d03e8016" path="/var/lib/kubelet/pods/4a6dffc7-de32-4888-aa3e-8753d03e8016/volumes"
Oct 02 12:01:54 crc kubenswrapper[4766]: I1002 12:01:54.432450 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:01:54 crc kubenswrapper[4766]: I1002 12:01:54.433058 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:02:24 crc kubenswrapper[4766]: I1002 12:02:24.432041 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:02:24 crc kubenswrapper[4766]: I1002 12:02:24.432923 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:02:54 crc kubenswrapper[4766]: I1002 12:02:54.432224 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:02:54 crc kubenswrapper[4766]: I1002 12:02:54.432981 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:02:54 crc kubenswrapper[4766]: I1002 12:02:54.433058 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx"
Oct 02 12:02:54 crc kubenswrapper[4766]: I1002 12:02:54.433870 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 12:02:54 crc kubenswrapper[4766]: I1002 12:02:54.433939 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" gracePeriod=600
Oct 02 12:02:54 crc kubenswrapper[4766]: E1002 12:02:54.559122 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 12:02:54 crc kubenswrapper[4766]: I1002 12:02:54.797343 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" exitCode=0
Oct 02 12:02:54 crc kubenswrapper[4766]: I1002 12:02:54.797409 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f"}
Oct 02 12:02:54 crc kubenswrapper[4766]: I1002 12:02:54.797459 4766 scope.go:117] "RemoveContainer" containerID="1985a4a59e3bf633e1e013e6bc422dd2a6b792a2cf12a1a6a5b7903df0efb469"
Oct 02 12:02:54 crc kubenswrapper[4766]: I1002 12:02:54.797841 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f"
Oct 02 12:02:54 crc kubenswrapper[4766]: E1002 12:02:54.798068 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 12:03:05 crc kubenswrapper[4766]: I1002 12:03:05.889248 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f"
Oct 02 12:03:05 crc kubenswrapper[4766]: E1002 12:03:05.890963 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 12:03:17 crc kubenswrapper[4766]: I1002 12:03:17.881677 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f"
Oct 02 12:03:17 crc kubenswrapper[4766]: E1002 12:03:17.882808 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 12:03:29 crc kubenswrapper[4766]: I1002 12:03:29.881776 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f"
Oct 02 12:03:29 crc kubenswrapper[4766]: E1002 12:03:29.883101 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 12:03:42 crc kubenswrapper[4766]: I1002 12:03:42.882079 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f"
Oct 02 12:03:42 crc kubenswrapper[4766]: E1002 12:03:42.883293 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 12:03:57 crc kubenswrapper[4766]: I1002 12:03:57.882734 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f"
Oct 02 12:03:57 crc kubenswrapper[4766]: E1002 12:03:57.884059 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.514854 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n8m77"]
Oct 02 12:04:03 crc kubenswrapper[4766]: E1002 12:04:03.517912 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6dffc7-de32-4888-aa3e-8753d03e8016" containerName="registry-server"
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.517931 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6dffc7-de32-4888-aa3e-8753d03e8016" containerName="registry-server"
Oct 02 12:04:03 crc kubenswrapper[4766]: E1002 12:04:03.517944 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6dffc7-de32-4888-aa3e-8753d03e8016" containerName="extract-content"
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.517950 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6dffc7-de32-4888-aa3e-8753d03e8016" containerName="extract-content"
Oct 02 12:04:03 crc kubenswrapper[4766]: E1002 12:04:03.517975 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a6dffc7-de32-4888-aa3e-8753d03e8016" containerName="extract-utilities"
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.517981 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a6dffc7-de32-4888-aa3e-8753d03e8016" containerName="extract-utilities"
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.518126 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a6dffc7-de32-4888-aa3e-8753d03e8016" containerName="registry-server"
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.519307 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n8m77"
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.529328 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n8m77"]
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.672452 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3132350-14b2-4817-9ce6-5f81c68a36e9-catalog-content\") pod \"redhat-marketplace-n8m77\" (UID: \"f3132350-14b2-4817-9ce6-5f81c68a36e9\") " pod="openshift-marketplace/redhat-marketplace-n8m77"
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.672534 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdflm\" (UniqueName: \"kubernetes.io/projected/f3132350-14b2-4817-9ce6-5f81c68a36e9-kube-api-access-xdflm\") pod \"redhat-marketplace-n8m77\" (UID: \"f3132350-14b2-4817-9ce6-5f81c68a36e9\") " pod="openshift-marketplace/redhat-marketplace-n8m77"
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.672676 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3132350-14b2-4817-9ce6-5f81c68a36e9-utilities\") pod \"redhat-marketplace-n8m77\" (UID: \"f3132350-14b2-4817-9ce6-5f81c68a36e9\") " pod="openshift-marketplace/redhat-marketplace-n8m77"
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.774250 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdflm\" (UniqueName: \"kubernetes.io/projected/f3132350-14b2-4817-9ce6-5f81c68a36e9-kube-api-access-xdflm\") pod \"redhat-marketplace-n8m77\" (UID: \"f3132350-14b2-4817-9ce6-5f81c68a36e9\") " pod="openshift-marketplace/redhat-marketplace-n8m77"
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.774387 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3132350-14b2-4817-9ce6-5f81c68a36e9-utilities\") pod \"redhat-marketplace-n8m77\" (UID: \"f3132350-14b2-4817-9ce6-5f81c68a36e9\") " pod="openshift-marketplace/redhat-marketplace-n8m77"
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.774438 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3132350-14b2-4817-9ce6-5f81c68a36e9-catalog-content\") pod \"redhat-marketplace-n8m77\" (UID: \"f3132350-14b2-4817-9ce6-5f81c68a36e9\") " pod="openshift-marketplace/redhat-marketplace-n8m77"
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.775813 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3132350-14b2-4817-9ce6-5f81c68a36e9-utilities\") pod \"redhat-marketplace-n8m77\" (UID: \"f3132350-14b2-4817-9ce6-5f81c68a36e9\") " pod="openshift-marketplace/redhat-marketplace-n8m77"
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.776013 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3132350-14b2-4817-9ce6-5f81c68a36e9-catalog-content\") pod \"redhat-marketplace-n8m77\" (UID: \"f3132350-14b2-4817-9ce6-5f81c68a36e9\") " pod="openshift-marketplace/redhat-marketplace-n8m77"
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.797759 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdflm\" (UniqueName: \"kubernetes.io/projected/f3132350-14b2-4817-9ce6-5f81c68a36e9-kube-api-access-xdflm\") pod \"redhat-marketplace-n8m77\" (UID: \"f3132350-14b2-4817-9ce6-5f81c68a36e9\") " pod="openshift-marketplace/redhat-marketplace-n8m77"
Oct 02 12:04:03 crc kubenswrapper[4766]: I1002 12:04:03.851733 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n8m77"
Oct 02 12:04:04 crc kubenswrapper[4766]: I1002 12:04:04.365828 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n8m77"]
Oct 02 12:04:04 crc kubenswrapper[4766]: I1002 12:04:04.448828 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8m77" event={"ID":"f3132350-14b2-4817-9ce6-5f81c68a36e9","Type":"ContainerStarted","Data":"ecdd45f88edd4883ee02aeefd8b2364728d2141d7be66cf097be01066572f6eb"}
Oct 02 12:04:05 crc kubenswrapper[4766]: I1002 12:04:05.461026 4766 generic.go:334] "Generic (PLEG): container finished" podID="f3132350-14b2-4817-9ce6-5f81c68a36e9" containerID="b073b06b5dbe93db170d8aaeeb6be74b9ee89e64fc64395f5b8b95070339d759" exitCode=0
Oct 02 12:04:05 crc kubenswrapper[4766]: I1002 12:04:05.461110 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8m77" event={"ID":"f3132350-14b2-4817-9ce6-5f81c68a36e9","Type":"ContainerDied","Data":"b073b06b5dbe93db170d8aaeeb6be74b9ee89e64fc64395f5b8b95070339d759"}
Oct 02 12:04:10 crc kubenswrapper[4766]: I1002 12:04:10.503488 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8m77" event={"ID":"f3132350-14b2-4817-9ce6-5f81c68a36e9","Type":"ContainerStarted","Data":"49dc31fd890b92de20920330b7bab9aa9766ef46f0ff6a51fd44e1f5adf188d0"}
Oct 02 12:04:11 crc kubenswrapper[4766]: I1002 12:04:11.513691 4766 generic.go:334] "Generic (PLEG): container finished" podID="f3132350-14b2-4817-9ce6-5f81c68a36e9" containerID="49dc31fd890b92de20920330b7bab9aa9766ef46f0ff6a51fd44e1f5adf188d0" exitCode=0
Oct 02 12:04:11 crc kubenswrapper[4766]: I1002 12:04:11.513753 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8m77" event={"ID":"f3132350-14b2-4817-9ce6-5f81c68a36e9","Type":"ContainerDied","Data":"49dc31fd890b92de20920330b7bab9aa9766ef46f0ff6a51fd44e1f5adf188d0"}
Oct 02 12:04:11 crc kubenswrapper[4766]: I1002 12:04:11.881999 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f"
Oct 02 12:04:11 crc kubenswrapper[4766]: E1002 12:04:11.882349 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon
pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:04:12 crc kubenswrapper[4766]: I1002 12:04:12.542944 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8m77" event={"ID":"f3132350-14b2-4817-9ce6-5f81c68a36e9","Type":"ContainerStarted","Data":"30b96a3732c241302895484d3093dc07a9b940518ddd6dff8b7cfc7f6ac385bc"} Oct 02 12:04:12 crc kubenswrapper[4766]: I1002 12:04:12.569305 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n8m77" podStartSLOduration=2.988109965 podStartE2EDuration="9.56928273s" podCreationTimestamp="2025-10-02 12:04:03 +0000 UTC" firstStartedPulling="2025-10-02 12:04:05.463372239 +0000 UTC m=+4360.406243183" lastFinishedPulling="2025-10-02 12:04:12.044544984 +0000 UTC m=+4366.987415948" observedRunningTime="2025-10-02 12:04:12.565990735 +0000 UTC m=+4367.508861689" watchObservedRunningTime="2025-10-02 12:04:12.56928273 +0000 UTC m=+4367.512153674" Oct 02 12:04:13 crc kubenswrapper[4766]: I1002 12:04:13.734887 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6g65t"] Oct 02 12:04:13 crc kubenswrapper[4766]: I1002 12:04:13.736858 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6g65t" Oct 02 12:04:13 crc kubenswrapper[4766]: I1002 12:04:13.750255 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6g65t"] Oct 02 12:04:13 crc kubenswrapper[4766]: I1002 12:04:13.847962 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b802bf3-ceaa-40df-8054-78ce904f36e8-utilities\") pod \"certified-operators-6g65t\" (UID: \"1b802bf3-ceaa-40df-8054-78ce904f36e8\") " pod="openshift-marketplace/certified-operators-6g65t" Oct 02 12:04:13 crc kubenswrapper[4766]: I1002 12:04:13.848008 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b802bf3-ceaa-40df-8054-78ce904f36e8-catalog-content\") pod \"certified-operators-6g65t\" (UID: \"1b802bf3-ceaa-40df-8054-78ce904f36e8\") " pod="openshift-marketplace/certified-operators-6g65t" Oct 02 12:04:13 crc kubenswrapper[4766]: I1002 12:04:13.848054 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpg9w\" (UniqueName: \"kubernetes.io/projected/1b802bf3-ceaa-40df-8054-78ce904f36e8-kube-api-access-kpg9w\") pod \"certified-operators-6g65t\" (UID: \"1b802bf3-ceaa-40df-8054-78ce904f36e8\") " pod="openshift-marketplace/certified-operators-6g65t" Oct 02 12:04:13 crc kubenswrapper[4766]: I1002 12:04:13.852584 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n8m77" Oct 02 12:04:13 crc kubenswrapper[4766]: I1002 12:04:13.852642 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n8m77" Oct 02 12:04:13 crc kubenswrapper[4766]: I1002 12:04:13.899899 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n8m77" Oct 02 12:04:13 crc 
kubenswrapper[4766]: I1002 12:04:13.950358 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpg9w\" (UniqueName: \"kubernetes.io/projected/1b802bf3-ceaa-40df-8054-78ce904f36e8-kube-api-access-kpg9w\") pod \"certified-operators-6g65t\" (UID: \"1b802bf3-ceaa-40df-8054-78ce904f36e8\") " pod="openshift-marketplace/certified-operators-6g65t" Oct 02 12:04:13 crc kubenswrapper[4766]: I1002 12:04:13.950483 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b802bf3-ceaa-40df-8054-78ce904f36e8-utilities\") pod \"certified-operators-6g65t\" (UID: \"1b802bf3-ceaa-40df-8054-78ce904f36e8\") " pod="openshift-marketplace/certified-operators-6g65t" Oct 02 12:04:13 crc kubenswrapper[4766]: I1002 12:04:13.950524 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b802bf3-ceaa-40df-8054-78ce904f36e8-catalog-content\") pod \"certified-operators-6g65t\" (UID: \"1b802bf3-ceaa-40df-8054-78ce904f36e8\") " pod="openshift-marketplace/certified-operators-6g65t" Oct 02 12:04:13 crc kubenswrapper[4766]: I1002 12:04:13.951190 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b802bf3-ceaa-40df-8054-78ce904f36e8-utilities\") pod \"certified-operators-6g65t\" (UID: \"1b802bf3-ceaa-40df-8054-78ce904f36e8\") " pod="openshift-marketplace/certified-operators-6g65t" Oct 02 12:04:13 crc kubenswrapper[4766]: I1002 12:04:13.951566 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b802bf3-ceaa-40df-8054-78ce904f36e8-catalog-content\") pod \"certified-operators-6g65t\" (UID: \"1b802bf3-ceaa-40df-8054-78ce904f36e8\") " pod="openshift-marketplace/certified-operators-6g65t" Oct 02 12:04:13 crc kubenswrapper[4766]: I1002 12:04:13.986672 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpg9w\" (UniqueName: \"kubernetes.io/projected/1b802bf3-ceaa-40df-8054-78ce904f36e8-kube-api-access-kpg9w\") pod \"certified-operators-6g65t\" (UID: \"1b802bf3-ceaa-40df-8054-78ce904f36e8\") " pod="openshift-marketplace/certified-operators-6g65t" Oct 02 12:04:14 crc kubenswrapper[4766]: I1002 12:04:14.054770 4766 util.go:30] "No sandbox for pod can be found. 
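Need to start a new one" pod="openshift-marketplace/certified-operators-6g65t"

The reconciler_common.go and operation_generator.go entries above repeat the same three-step flow for every volume of certified-operators-6g65t: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded, for both empty-dir volumes (utilities, catalog-content) and the projected service-account token (kube-api-access-kpg9w). A toy desired-state-versus-actual-state loop in that spirit; this is a rough sketch only, not the kubelet's volume manager, and all identifiers are illustrative:

package main

import "fmt"

type volume struct{ name, plugin string }

// reconcile walks the desired volumes and mounts whatever is not yet in the
// actual state, printing the same three phases that appear in the log above.
func reconcile(desired []volume, mounted map[string]bool) {
	for _, v := range desired {
		if mounted[v.name] {
			continue // already mounted, nothing to do
		}
		fmt.Printf("VerifyControllerAttachedVolume started for volume %q (%s)\n", v.name, v.plugin)
		fmt.Printf("MountVolume started for volume %q\n", v.name)
		mounted[v.name] = true // a real SetUp would invoke the volume plugin here
		fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", v.name)
	}
}

func main() {
	desired := []volume{
		{"utilities", "kubernetes.io/empty-dir"},
		{"catalog-content", "kubernetes.io/empty-dir"},
		{"kube-api-access-kpg9w", "kubernetes.io/projected"},
	}
	reconcile(desired, map[string]bool{})
}
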
Oct 02 12:04:14 crc kubenswrapper[4766]: I1002 12:04:14.568818 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6g65t"] Oct 02 12:04:14 crc kubenswrapper[4766]: W1002 12:04:14.572089 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b802bf3_ceaa_40df_8054_78ce904f36e8.slice/crio-f880c5ada1a4087c5bac221ede0abac7efb5773dc50505c897b2dcc53ccf4bd6 WatchSource:0}: Error finding container f880c5ada1a4087c5bac221ede0abac7efb5773dc50505c897b2dcc53ccf4bd6: Status 404 returned error can't find the container with id f880c5ada1a4087c5bac221ede0abac7efb5773dc50505c897b2dcc53ccf4bd6 Oct 02 12:04:15 crc kubenswrapper[4766]: I1002 12:04:15.569330 4766 generic.go:334] "Generic (PLEG): container finished" podID="1b802bf3-ceaa-40df-8054-78ce904f36e8" containerID="af7275235b01d0da2f19917cd37462c5cd14e16787b995873040fe288988ac63" exitCode=0 Oct 02 12:04:15 crc kubenswrapper[4766]: I1002 12:04:15.569471 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6g65t" event={"ID":"1b802bf3-ceaa-40df-8054-78ce904f36e8","Type":"ContainerDied","Data":"af7275235b01d0da2f19917cd37462c5cd14e16787b995873040fe288988ac63"} Oct 02 12:04:15 crc kubenswrapper[4766]: I1002 12:04:15.569896 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6g65t" event={"ID":"1b802bf3-ceaa-40df-8054-78ce904f36e8","Type":"ContainerStarted","Data":"f880c5ada1a4087c5bac221ede0abac7efb5773dc50505c897b2dcc53ccf4bd6"} Oct 02 12:04:17 crc kubenswrapper[4766]: I1002 12:04:17.597740 4766 generic.go:334] "Generic (PLEG): container finished" podID="1b802bf3-ceaa-40df-8054-78ce904f36e8" containerID="862cdce1581ad4422fe73a0d65c32639712f8e1ce1cab67b4a241fe5f4eade76" exitCode=0 Oct 02 12:04:17 crc kubenswrapper[4766]: I1002 12:04:17.597814 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6g65t" event={"ID":"1b802bf3-ceaa-40df-8054-78ce904f36e8","Type":"ContainerDied","Data":"862cdce1581ad4422fe73a0d65c32639712f8e1ce1cab67b4a241fe5f4eade76"} Oct 02 12:04:19 crc kubenswrapper[4766]: I1002 12:04:19.620868 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6g65t" event={"ID":"1b802bf3-ceaa-40df-8054-78ce904f36e8","Type":"ContainerStarted","Data":"af1c50744db02408d933fc783e8cadd5ec739b543fe4058743dd169153aa8872"} Oct 02 12:04:19 crc kubenswrapper[4766]: I1002 12:04:19.645988 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6g65t" podStartSLOduration=3.7268481810000003 podStartE2EDuration="6.645958317s" podCreationTimestamp="2025-10-02 12:04:13 +0000 UTC" firstStartedPulling="2025-10-02 12:04:15.573133399 +0000 UTC m=+4370.516004343" lastFinishedPulling="2025-10-02 12:04:18.492243495 +0000 UTC m=+4373.435114479" observedRunningTime="2025-10-02 12:04:19.64043095 +0000 UTC m=+4374.583301894" watchObservedRunningTime="2025-10-02 12:04:19.645958317 +0000 UTC m=+4374.588829281" Oct 02 12:04:22 crc kubenswrapper[4766]: I1002 12:04:22.881960 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:04:22 crc kubenswrapper[4766]: E1002 12:04:22.882436 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:04:23 crc kubenswrapper[4766]: I1002 12:04:23.911842 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n8m77" Oct 02 12:04:24 crc kubenswrapper[4766]: I1002 12:04:24.055277 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6g65t" Oct 02 12:04:24 crc kubenswrapper[4766]: I1002 12:04:24.055378 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6g65t" Oct 02 12:04:24 crc kubenswrapper[4766]: I1002 12:04:24.104630 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6g65t" Oct 02 12:04:24 crc kubenswrapper[4766]: I1002 12:04:24.716555 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6g65t" Oct 02 12:04:26 crc kubenswrapper[4766]: I1002 12:04:26.927817 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n8m77"] Oct 02 12:04:26 crc kubenswrapper[4766]: I1002 12:04:26.928078 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n8m77" podUID="f3132350-14b2-4817-9ce6-5f81c68a36e9" containerName="registry-server" containerID="cri-o://30b96a3732c241302895484d3093dc07a9b940518ddd6dff8b7cfc7f6ac385bc" gracePeriod=2 Oct 02 12:04:27 crc kubenswrapper[4766]: I1002 12:04:27.693185 4766 generic.go:334] "Generic (PLEG): container finished" podID="f3132350-14b2-4817-9ce6-5f81c68a36e9" containerID="30b96a3732c241302895484d3093dc07a9b940518ddd6dff8b7cfc7f6ac385bc" exitCode=0 Oct 02 12:04:27 crc kubenswrapper[4766]: I1002 12:04:27.693219 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8m77" event={"ID":"f3132350-14b2-4817-9ce6-5f81c68a36e9","Type":"ContainerDied","Data":"30b96a3732c241302895484d3093dc07a9b940518ddd6dff8b7cfc7f6ac385bc"} Oct 02 12:04:27 crc kubenswrapper[4766]: I1002 12:04:27.903921 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n8m77" Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.093217 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdflm\" (UniqueName: \"kubernetes.io/projected/f3132350-14b2-4817-9ce6-5f81c68a36e9-kube-api-access-xdflm\") pod \"f3132350-14b2-4817-9ce6-5f81c68a36e9\" (UID: \"f3132350-14b2-4817-9ce6-5f81c68a36e9\") " Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.093657 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3132350-14b2-4817-9ce6-5f81c68a36e9-utilities\") pod \"f3132350-14b2-4817-9ce6-5f81c68a36e9\" (UID: \"f3132350-14b2-4817-9ce6-5f81c68a36e9\") " Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.093703 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3132350-14b2-4817-9ce6-5f81c68a36e9-catalog-content\") pod \"f3132350-14b2-4817-9ce6-5f81c68a36e9\" (UID: \"f3132350-14b2-4817-9ce6-5f81c68a36e9\") " Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.094444 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3132350-14b2-4817-9ce6-5f81c68a36e9-utilities" (OuterVolumeSpecName: "utilities") pod "f3132350-14b2-4817-9ce6-5f81c68a36e9" (UID: "f3132350-14b2-4817-9ce6-5f81c68a36e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.100460 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3132350-14b2-4817-9ce6-5f81c68a36e9-kube-api-access-xdflm" (OuterVolumeSpecName: "kube-api-access-xdflm") pod "f3132350-14b2-4817-9ce6-5f81c68a36e9" (UID: "f3132350-14b2-4817-9ce6-5f81c68a36e9"). InnerVolumeSpecName "kube-api-access-xdflm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.107292 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3132350-14b2-4817-9ce6-5f81c68a36e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3132350-14b2-4817-9ce6-5f81c68a36e9" (UID: "f3132350-14b2-4817-9ce6-5f81c68a36e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.195283 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdflm\" (UniqueName: \"kubernetes.io/projected/f3132350-14b2-4817-9ce6-5f81c68a36e9-kube-api-access-xdflm\") on node \"crc\" DevicePath \"\"" Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.195320 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3132350-14b2-4817-9ce6-5f81c68a36e9-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.195330 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3132350-14b2-4817-9ce6-5f81c68a36e9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.705229 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n8m77" event={"ID":"f3132350-14b2-4817-9ce6-5f81c68a36e9","Type":"ContainerDied","Data":"ecdd45f88edd4883ee02aeefd8b2364728d2141d7be66cf097be01066572f6eb"} Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.705314 4766 scope.go:117] "RemoveContainer" containerID="30b96a3732c241302895484d3093dc07a9b940518ddd6dff8b7cfc7f6ac385bc" Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.705321 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n8m77" Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.729349 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6g65t"] Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.729652 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6g65t" podUID="1b802bf3-ceaa-40df-8054-78ce904f36e8" containerName="registry-server" containerID="cri-o://af1c50744db02408d933fc783e8cadd5ec739b543fe4058743dd169153aa8872" gracePeriod=2 Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.741744 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n8m77"] Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.742816 4766 scope.go:117] "RemoveContainer" containerID="49dc31fd890b92de20920330b7bab9aa9766ef46f0ff6a51fd44e1f5adf188d0" Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.746184 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n8m77"] Oct 02 12:04:28 crc kubenswrapper[4766]: I1002 12:04:28.772516 4766 scope.go:117] "RemoveContainer" containerID="b073b06b5dbe93db170d8aaeeb6be74b9ee89e64fc64395f5b8b95070339d759" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.239309 4766 util.go:48] "No ready sandbox for pod can be found. 
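Need to start a new one" pod="openshift-marketplace/certified-operators-6g65t"

Above, the API DELETE for certified-operators-6g65t becomes "Killing container with a grace period ... gracePeriod=2": the runtime delivers SIGTERM first, and only escalates to SIGKILL if the container has not exited when the grace period runs out; here the registry-server exits in time (exitCode=0). A schematic of that escalation, with signal delivery stubbed out; the timings and container name below are illustrative, not taken from kubelet source:

package main

import (
	"fmt"
	"time"
)

// killWithGrace sends the polite signal first, then escalates if the
// container outlives the grace period ("gracePeriod=2" in the log above).
func killWithGrace(containerID string, grace time.Duration, exited <-chan struct{}) {
	fmt.Printf("SIGTERM -> %s (grace %s)\n", containerID, grace)
	select {
	case <-exited:
		fmt.Println("container exited within the grace period")
	case <-time.After(grace):
		fmt.Printf("SIGKILL -> %s\n", containerID)
	}
}

func main() {
	exited := make(chan struct{})
	go func() {
		time.Sleep(500 * time.Millisecond) // simulate a prompt, clean exit
		close(exited)
	}()
	killWithGrace("cri-o://registry-server-example", 2*time.Second, exited)
}
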
Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.414412 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b802bf3-ceaa-40df-8054-78ce904f36e8-catalog-content\") pod \"1b802bf3-ceaa-40df-8054-78ce904f36e8\" (UID: \"1b802bf3-ceaa-40df-8054-78ce904f36e8\") " Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.414517 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b802bf3-ceaa-40df-8054-78ce904f36e8-utilities\") pod \"1b802bf3-ceaa-40df-8054-78ce904f36e8\" (UID: \"1b802bf3-ceaa-40df-8054-78ce904f36e8\") " Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.414580 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpg9w\" (UniqueName: \"kubernetes.io/projected/1b802bf3-ceaa-40df-8054-78ce904f36e8-kube-api-access-kpg9w\") pod \"1b802bf3-ceaa-40df-8054-78ce904f36e8\" (UID: \"1b802bf3-ceaa-40df-8054-78ce904f36e8\") " Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.415823 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b802bf3-ceaa-40df-8054-78ce904f36e8-utilities" (OuterVolumeSpecName: "utilities") pod "1b802bf3-ceaa-40df-8054-78ce904f36e8" (UID: "1b802bf3-ceaa-40df-8054-78ce904f36e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.421761 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b802bf3-ceaa-40df-8054-78ce904f36e8-kube-api-access-kpg9w" (OuterVolumeSpecName: "kube-api-access-kpg9w") pod "1b802bf3-ceaa-40df-8054-78ce904f36e8" (UID: "1b802bf3-ceaa-40df-8054-78ce904f36e8"). InnerVolumeSpecName "kube-api-access-kpg9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.467184 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b802bf3-ceaa-40df-8054-78ce904f36e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b802bf3-ceaa-40df-8054-78ce904f36e8" (UID: "1b802bf3-ceaa-40df-8054-78ce904f36e8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.516075 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpg9w\" (UniqueName: \"kubernetes.io/projected/1b802bf3-ceaa-40df-8054-78ce904f36e8-kube-api-access-kpg9w\") on node \"crc\" DevicePath \"\"" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.516111 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b802bf3-ceaa-40df-8054-78ce904f36e8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.516120 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b802bf3-ceaa-40df-8054-78ce904f36e8-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.716126 4766 generic.go:334] "Generic (PLEG): container finished" podID="1b802bf3-ceaa-40df-8054-78ce904f36e8" containerID="af1c50744db02408d933fc783e8cadd5ec739b543fe4058743dd169153aa8872" exitCode=0 Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.716230 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6g65t" event={"ID":"1b802bf3-ceaa-40df-8054-78ce904f36e8","Type":"ContainerDied","Data":"af1c50744db02408d933fc783e8cadd5ec739b543fe4058743dd169153aa8872"} Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.716297 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6g65t" event={"ID":"1b802bf3-ceaa-40df-8054-78ce904f36e8","Type":"ContainerDied","Data":"f880c5ada1a4087c5bac221ede0abac7efb5773dc50505c897b2dcc53ccf4bd6"} Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.716196 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6g65t" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.716320 4766 scope.go:117] "RemoveContainer" containerID="af1c50744db02408d933fc783e8cadd5ec739b543fe4058743dd169153aa8872" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.741702 4766 scope.go:117] "RemoveContainer" containerID="862cdce1581ad4422fe73a0d65c32639712f8e1ce1cab67b4a241fe5f4eade76" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.751123 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6g65t"] Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.758136 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6g65t"] Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.792427 4766 scope.go:117] "RemoveContainer" containerID="af7275235b01d0da2f19917cd37462c5cd14e16787b995873040fe288988ac63" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.811542 4766 scope.go:117] "RemoveContainer" containerID="af1c50744db02408d933fc783e8cadd5ec739b543fe4058743dd169153aa8872" Oct 02 12:04:29 crc kubenswrapper[4766]: E1002 12:04:29.821549 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af1c50744db02408d933fc783e8cadd5ec739b543fe4058743dd169153aa8872\": container with ID starting with af1c50744db02408d933fc783e8cadd5ec739b543fe4058743dd169153aa8872 not found: ID does not exist" containerID="af1c50744db02408d933fc783e8cadd5ec739b543fe4058743dd169153aa8872" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.821596 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af1c50744db02408d933fc783e8cadd5ec739b543fe4058743dd169153aa8872"} err="failed to get container status \"af1c50744db02408d933fc783e8cadd5ec739b543fe4058743dd169153aa8872\": rpc error: code = NotFound desc = could not find container \"af1c50744db02408d933fc783e8cadd5ec739b543fe4058743dd169153aa8872\": container with ID starting with af1c50744db02408d933fc783e8cadd5ec739b543fe4058743dd169153aa8872 not found: ID does not exist" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.821631 4766 scope.go:117] "RemoveContainer" containerID="862cdce1581ad4422fe73a0d65c32639712f8e1ce1cab67b4a241fe5f4eade76" Oct 02 12:04:29 crc kubenswrapper[4766]: E1002 12:04:29.821996 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"862cdce1581ad4422fe73a0d65c32639712f8e1ce1cab67b4a241fe5f4eade76\": container with ID starting with 862cdce1581ad4422fe73a0d65c32639712f8e1ce1cab67b4a241fe5f4eade76 not found: ID does not exist" containerID="862cdce1581ad4422fe73a0d65c32639712f8e1ce1cab67b4a241fe5f4eade76" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.822061 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"862cdce1581ad4422fe73a0d65c32639712f8e1ce1cab67b4a241fe5f4eade76"} err="failed to get container status \"862cdce1581ad4422fe73a0d65c32639712f8e1ce1cab67b4a241fe5f4eade76\": rpc error: code = NotFound desc = could not find container \"862cdce1581ad4422fe73a0d65c32639712f8e1ce1cab67b4a241fe5f4eade76\": container with ID starting with 862cdce1581ad4422fe73a0d65c32639712f8e1ce1cab67b4a241fe5f4eade76 not found: ID does not exist" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.822080 4766 scope.go:117] "RemoveContainer" 
containerID="af7275235b01d0da2f19917cd37462c5cd14e16787b995873040fe288988ac63" Oct 02 12:04:29 crc kubenswrapper[4766]: E1002 12:04:29.822462 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af7275235b01d0da2f19917cd37462c5cd14e16787b995873040fe288988ac63\": container with ID starting with af7275235b01d0da2f19917cd37462c5cd14e16787b995873040fe288988ac63 not found: ID does not exist" containerID="af7275235b01d0da2f19917cd37462c5cd14e16787b995873040fe288988ac63" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.822489 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af7275235b01d0da2f19917cd37462c5cd14e16787b995873040fe288988ac63"} err="failed to get container status \"af7275235b01d0da2f19917cd37462c5cd14e16787b995873040fe288988ac63\": rpc error: code = NotFound desc = could not find container \"af7275235b01d0da2f19917cd37462c5cd14e16787b995873040fe288988ac63\": container with ID starting with af7275235b01d0da2f19917cd37462c5cd14e16787b995873040fe288988ac63 not found: ID does not exist" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.895799 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b802bf3-ceaa-40df-8054-78ce904f36e8" path="/var/lib/kubelet/pods/1b802bf3-ceaa-40df-8054-78ce904f36e8/volumes" Oct 02 12:04:29 crc kubenswrapper[4766]: I1002 12:04:29.896539 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3132350-14b2-4817-9ce6-5f81c68a36e9" path="/var/lib/kubelet/pods/f3132350-14b2-4817-9ce6-5f81c68a36e9/volumes" Oct 02 12:04:36 crc kubenswrapper[4766]: I1002 12:04:36.881466 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:04:36 crc kubenswrapper[4766]: E1002 12:04:36.882195 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:04:47 crc kubenswrapper[4766]: I1002 12:04:47.881987 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:04:47 crc kubenswrapper[4766]: E1002 12:04:47.883215 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:05:00 crc kubenswrapper[4766]: I1002 12:05:00.881043 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:05:00 crc kubenswrapper[4766]: E1002 12:05:00.883737 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" 
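pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"

The machine-config-daemon errors that recur through this whole window are the kubelet's crash-loop backoff at its cap: each sync attempt is refused with "back-off 5m0s restarting failed container", and, as the 12:08:02 entries further down show, the container is only started again once that five-minute window has elapsed. The cadence comes from a restart delay that doubles after each failure up to a cap; the base delay below is an assumption chosen to match the "5m0s" printed in the log, not a value read from kubelet source:

package main

import (
	"fmt"
	"time"
)

func main() {
	delay, maxDelay := 10*time.Second, 5*time.Minute
	for attempt := 1; attempt <= 7; attempt++ {
		fmt.Printf("failed restart %d: next attempt allowed in %s\n", attempt, delay)
		if delay *= 2; delay > maxDelay {
			delay = maxDelay // the log's steady "back-off 5m0s" state
		}
	}
}
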
Oct 02 12:05:11 crc kubenswrapper[4766]: I1002 12:05:11.881816 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:05:11 crc kubenswrapper[4766]: E1002 12:05:11.883021 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:05:15 crc kubenswrapper[4766]: I1002 12:05:15.775297 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cfflg"] Oct 02 12:05:15 crc kubenswrapper[4766]: E1002 12:05:15.776066 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b802bf3-ceaa-40df-8054-78ce904f36e8" containerName="extract-utilities" Oct 02 12:05:15 crc kubenswrapper[4766]: I1002 12:05:15.776080 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b802bf3-ceaa-40df-8054-78ce904f36e8" containerName="extract-utilities" Oct 02 12:05:15 crc kubenswrapper[4766]: E1002 12:05:15.776101 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3132350-14b2-4817-9ce6-5f81c68a36e9" containerName="registry-server" Oct 02 12:05:15 crc kubenswrapper[4766]: I1002 12:05:15.776107 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3132350-14b2-4817-9ce6-5f81c68a36e9" containerName="registry-server" Oct 02 12:05:15 crc kubenswrapper[4766]: E1002 12:05:15.776124 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3132350-14b2-4817-9ce6-5f81c68a36e9" containerName="extract-utilities" Oct 02 12:05:15 crc kubenswrapper[4766]: I1002 12:05:15.776131 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3132350-14b2-4817-9ce6-5f81c68a36e9" containerName="extract-utilities" Oct 02 12:05:15 crc kubenswrapper[4766]: E1002 12:05:15.776145 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b802bf3-ceaa-40df-8054-78ce904f36e8" containerName="registry-server" Oct 02 12:05:15 crc kubenswrapper[4766]: I1002 12:05:15.776151 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b802bf3-ceaa-40df-8054-78ce904f36e8" containerName="registry-server" Oct 02 12:05:15 crc kubenswrapper[4766]: E1002 12:05:15.776161 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3132350-14b2-4817-9ce6-5f81c68a36e9" containerName="extract-content" Oct 02 12:05:15 crc kubenswrapper[4766]: I1002 12:05:15.776167 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3132350-14b2-4817-9ce6-5f81c68a36e9" containerName="extract-content" Oct 02 12:05:15 crc kubenswrapper[4766]: E1002 12:05:15.776177 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b802bf3-ceaa-40df-8054-78ce904f36e8" containerName="extract-content" Oct 02 12:05:15 crc kubenswrapper[4766]: I1002 12:05:15.776182 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b802bf3-ceaa-40df-8054-78ce904f36e8" containerName="extract-content" Oct 02 12:05:15 crc kubenswrapper[4766]: I1002 12:05:15.776314 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3132350-14b2-4817-9ce6-5f81c68a36e9" containerName="registry-server" Oct 02 
12:05:15 crc kubenswrapper[4766]: I1002 12:05:15.776322 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b802bf3-ceaa-40df-8054-78ce904f36e8" containerName="registry-server" Oct 02 12:05:15 crc kubenswrapper[4766]: I1002 12:05:15.777378 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:15 crc kubenswrapper[4766]: I1002 12:05:15.787845 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cfflg"] Oct 02 12:05:15 crc kubenswrapper[4766]: I1002 12:05:15.961893 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjlbz\" (UniqueName: \"kubernetes.io/projected/41213cb3-c99b-400d-b355-eb3b80506a64-kube-api-access-hjlbz\") pod \"redhat-operators-cfflg\" (UID: \"41213cb3-c99b-400d-b355-eb3b80506a64\") " pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:15 crc kubenswrapper[4766]: I1002 12:05:15.962042 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41213cb3-c99b-400d-b355-eb3b80506a64-utilities\") pod \"redhat-operators-cfflg\" (UID: \"41213cb3-c99b-400d-b355-eb3b80506a64\") " pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:15 crc kubenswrapper[4766]: I1002 12:05:15.962787 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41213cb3-c99b-400d-b355-eb3b80506a64-catalog-content\") pod \"redhat-operators-cfflg\" (UID: \"41213cb3-c99b-400d-b355-eb3b80506a64\") " pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:16 crc kubenswrapper[4766]: I1002 12:05:16.064715 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjlbz\" (UniqueName: \"kubernetes.io/projected/41213cb3-c99b-400d-b355-eb3b80506a64-kube-api-access-hjlbz\") pod \"redhat-operators-cfflg\" (UID: \"41213cb3-c99b-400d-b355-eb3b80506a64\") " pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:16 crc kubenswrapper[4766]: I1002 12:05:16.064820 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41213cb3-c99b-400d-b355-eb3b80506a64-utilities\") pod \"redhat-operators-cfflg\" (UID: \"41213cb3-c99b-400d-b355-eb3b80506a64\") " pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:16 crc kubenswrapper[4766]: I1002 12:05:16.064855 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41213cb3-c99b-400d-b355-eb3b80506a64-catalog-content\") pod \"redhat-operators-cfflg\" (UID: \"41213cb3-c99b-400d-b355-eb3b80506a64\") " pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:16 crc kubenswrapper[4766]: I1002 12:05:16.065553 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41213cb3-c99b-400d-b355-eb3b80506a64-catalog-content\") pod \"redhat-operators-cfflg\" (UID: \"41213cb3-c99b-400d-b355-eb3b80506a64\") " pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:16 crc kubenswrapper[4766]: I1002 12:05:16.065698 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/41213cb3-c99b-400d-b355-eb3b80506a64-utilities\") pod \"redhat-operators-cfflg\" (UID: \"41213cb3-c99b-400d-b355-eb3b80506a64\") " pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:16 crc kubenswrapper[4766]: I1002 12:05:16.083674 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjlbz\" (UniqueName: \"kubernetes.io/projected/41213cb3-c99b-400d-b355-eb3b80506a64-kube-api-access-hjlbz\") pod \"redhat-operators-cfflg\" (UID: \"41213cb3-c99b-400d-b355-eb3b80506a64\") " pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:16 crc kubenswrapper[4766]: I1002 12:05:16.099307 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:16 crc kubenswrapper[4766]: I1002 12:05:16.588423 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cfflg"] Oct 02 12:05:17 crc kubenswrapper[4766]: I1002 12:05:17.129871 4766 generic.go:334] "Generic (PLEG): container finished" podID="41213cb3-c99b-400d-b355-eb3b80506a64" containerID="f42cbff396454764316c78e1885ffdf8606f81ba8f74774edd031ef8946b2171" exitCode=0 Oct 02 12:05:17 crc kubenswrapper[4766]: I1002 12:05:17.129940 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfflg" event={"ID":"41213cb3-c99b-400d-b355-eb3b80506a64","Type":"ContainerDied","Data":"f42cbff396454764316c78e1885ffdf8606f81ba8f74774edd031ef8946b2171"} Oct 02 12:05:17 crc kubenswrapper[4766]: I1002 12:05:17.131820 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfflg" event={"ID":"41213cb3-c99b-400d-b355-eb3b80506a64","Type":"ContainerStarted","Data":"a900c75422c7581a7b94d61d42d0dd2056fcdb0c347dfee614cd7cb4744f1f57"} Oct 02 12:05:19 crc kubenswrapper[4766]: I1002 12:05:19.159158 4766 generic.go:334] "Generic (PLEG): container finished" podID="41213cb3-c99b-400d-b355-eb3b80506a64" containerID="68c9637533ee30d41efa85709c1609bc029548d790943c6f6aa7e04eb8a35f25" exitCode=0 Oct 02 12:05:19 crc kubenswrapper[4766]: I1002 12:05:19.159227 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfflg" event={"ID":"41213cb3-c99b-400d-b355-eb3b80506a64","Type":"ContainerDied","Data":"68c9637533ee30d41efa85709c1609bc029548d790943c6f6aa7e04eb8a35f25"} Oct 02 12:05:21 crc kubenswrapper[4766]: I1002 12:05:21.184400 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfflg" event={"ID":"41213cb3-c99b-400d-b355-eb3b80506a64","Type":"ContainerStarted","Data":"6b732e00d01aedf461b991d2f05cf4170eb5fd640e4b16c74afe32a2e3722bf7"} Oct 02 12:05:21 crc kubenswrapper[4766]: I1002 12:05:21.213790 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cfflg" podStartSLOduration=3.643947553 podStartE2EDuration="6.213766031s" podCreationTimestamp="2025-10-02 12:05:15 +0000 UTC" firstStartedPulling="2025-10-02 12:05:17.131416768 +0000 UTC m=+4432.074287752" lastFinishedPulling="2025-10-02 12:05:19.701235246 +0000 UTC m=+4434.644106230" observedRunningTime="2025-10-02 12:05:21.211432776 +0000 UTC m=+4436.154303730" watchObservedRunningTime="2025-10-02 12:05:21.213766031 +0000 UTC m=+4436.156636975" Oct 02 12:05:26 crc kubenswrapper[4766]: I1002 12:05:26.100674 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:26 crc kubenswrapper[4766]: I1002 12:05:26.101256 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:26 crc kubenswrapper[4766]: I1002 12:05:26.151300 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:26 crc kubenswrapper[4766]: I1002 12:05:26.288140 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:26 crc kubenswrapper[4766]: I1002 12:05:26.396516 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cfflg"] Oct 02 12:05:26 crc kubenswrapper[4766]: I1002 12:05:26.882155 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:05:26 crc kubenswrapper[4766]: E1002 12:05:26.883007 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:05:28 crc kubenswrapper[4766]: I1002 12:05:28.250932 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cfflg" podUID="41213cb3-c99b-400d-b355-eb3b80506a64" containerName="registry-server" containerID="cri-o://6b732e00d01aedf461b991d2f05cf4170eb5fd640e4b16c74afe32a2e3722bf7" gracePeriod=2 Oct 02 12:05:28 crc kubenswrapper[4766]: I1002 12:05:28.732796 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:28 crc kubenswrapper[4766]: I1002 12:05:28.929165 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41213cb3-c99b-400d-b355-eb3b80506a64-catalog-content\") pod \"41213cb3-c99b-400d-b355-eb3b80506a64\" (UID: \"41213cb3-c99b-400d-b355-eb3b80506a64\") " Oct 02 12:05:28 crc kubenswrapper[4766]: I1002 12:05:28.929702 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjlbz\" (UniqueName: \"kubernetes.io/projected/41213cb3-c99b-400d-b355-eb3b80506a64-kube-api-access-hjlbz\") pod \"41213cb3-c99b-400d-b355-eb3b80506a64\" (UID: \"41213cb3-c99b-400d-b355-eb3b80506a64\") " Oct 02 12:05:28 crc kubenswrapper[4766]: I1002 12:05:28.929785 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41213cb3-c99b-400d-b355-eb3b80506a64-utilities\") pod \"41213cb3-c99b-400d-b355-eb3b80506a64\" (UID: \"41213cb3-c99b-400d-b355-eb3b80506a64\") " Oct 02 12:05:28 crc kubenswrapper[4766]: I1002 12:05:28.930699 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41213cb3-c99b-400d-b355-eb3b80506a64-utilities" (OuterVolumeSpecName: "utilities") pod "41213cb3-c99b-400d-b355-eb3b80506a64" (UID: "41213cb3-c99b-400d-b355-eb3b80506a64"). InnerVolumeSpecName "utilities". 
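PluginName "kubernetes.io/empty-dir", VolumeGidValue ""

The "SyncLoop (probe)" entries above show the usual probe handoff for redhat-operators-cfflg: the startup probe flips from unhealthy to started, readiness stays empty (unknown) until then, and only afterwards is the pod reported ready; a failing liveness probe is likewise what triggered the machine-config-daemon restart at the top of this window. A loose illustration of that change-driven reporting, not kubelet's prober implementation:

package main

import "fmt"

// probeTracker mimics the "SyncLoop (probe)" entries above: a result is
// pushed only when a probe's outcome changes, and readiness stays unknown
// ("") until the startup probe has succeeded. Illustrative names only.
type probeTracker struct {
	lastStartup, lastReadiness string
}

func (p *probeTracker) observe(startupOK, readyOK bool) {
	startup := "unhealthy"
	if startupOK {
		startup = "started"
	}
	if startup != p.lastStartup {
		p.lastStartup = startup
		fmt.Printf("SyncLoop (probe) probe=\"startup\" status=%q\n", startup)
	}
	readiness := "" // unknown while the container is still starting
	if startupOK && readyOK {
		readiness = "ready"
	}
	if readiness != p.lastReadiness {
		p.lastReadiness = readiness
		fmt.Printf("SyncLoop (probe) probe=\"readiness\" status=%q\n", readiness)
	}
}

func main() {
	var t probeTracker
	t.observe(false, false) // first startup check fails: status="unhealthy"
	t.observe(true, true)   // probe passes: status="started", then "ready"
}
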
Oct 02 12:05:28 crc kubenswrapper[4766]: I1002 12:05:28.936021 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41213cb3-c99b-400d-b355-eb3b80506a64-kube-api-access-hjlbz" (OuterVolumeSpecName: "kube-api-access-hjlbz") pod "41213cb3-c99b-400d-b355-eb3b80506a64" (UID: "41213cb3-c99b-400d-b355-eb3b80506a64"). InnerVolumeSpecName "kube-api-access-hjlbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.032444 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjlbz\" (UniqueName: \"kubernetes.io/projected/41213cb3-c99b-400d-b355-eb3b80506a64-kube-api-access-hjlbz\") on node \"crc\" DevicePath \"\"" Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.032488 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41213cb3-c99b-400d-b355-eb3b80506a64-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.266930 4766 generic.go:334] "Generic (PLEG): container finished" podID="41213cb3-c99b-400d-b355-eb3b80506a64" containerID="6b732e00d01aedf461b991d2f05cf4170eb5fd640e4b16c74afe32a2e3722bf7" exitCode=0 Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.267037 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfflg" event={"ID":"41213cb3-c99b-400d-b355-eb3b80506a64","Type":"ContainerDied","Data":"6b732e00d01aedf461b991d2f05cf4170eb5fd640e4b16c74afe32a2e3722bf7"} Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.267098 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cfflg" Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.267129 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cfflg" event={"ID":"41213cb3-c99b-400d-b355-eb3b80506a64","Type":"ContainerDied","Data":"a900c75422c7581a7b94d61d42d0dd2056fcdb0c347dfee614cd7cb4744f1f57"} Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.267152 4766 scope.go:117] "RemoveContainer" containerID="6b732e00d01aedf461b991d2f05cf4170eb5fd640e4b16c74afe32a2e3722bf7" Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.296141 4766 scope.go:117] "RemoveContainer" containerID="68c9637533ee30d41efa85709c1609bc029548d790943c6f6aa7e04eb8a35f25" Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.297149 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41213cb3-c99b-400d-b355-eb3b80506a64-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41213cb3-c99b-400d-b355-eb3b80506a64" (UID: "41213cb3-c99b-400d-b355-eb3b80506a64"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.324964 4766 scope.go:117] "RemoveContainer" containerID="f42cbff396454764316c78e1885ffdf8606f81ba8f74774edd031ef8946b2171" Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.337334 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41213cb3-c99b-400d-b355-eb3b80506a64-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.356793 4766 scope.go:117] "RemoveContainer" containerID="6b732e00d01aedf461b991d2f05cf4170eb5fd640e4b16c74afe32a2e3722bf7" Oct 02 12:05:29 crc kubenswrapper[4766]: E1002 12:05:29.357487 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b732e00d01aedf461b991d2f05cf4170eb5fd640e4b16c74afe32a2e3722bf7\": container with ID starting with 6b732e00d01aedf461b991d2f05cf4170eb5fd640e4b16c74afe32a2e3722bf7 not found: ID does not exist" containerID="6b732e00d01aedf461b991d2f05cf4170eb5fd640e4b16c74afe32a2e3722bf7" Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.357602 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b732e00d01aedf461b991d2f05cf4170eb5fd640e4b16c74afe32a2e3722bf7"} err="failed to get container status \"6b732e00d01aedf461b991d2f05cf4170eb5fd640e4b16c74afe32a2e3722bf7\": rpc error: code = NotFound desc = could not find container \"6b732e00d01aedf461b991d2f05cf4170eb5fd640e4b16c74afe32a2e3722bf7\": container with ID starting with 6b732e00d01aedf461b991d2f05cf4170eb5fd640e4b16c74afe32a2e3722bf7 not found: ID does not exist" Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.357658 4766 scope.go:117] "RemoveContainer" containerID="68c9637533ee30d41efa85709c1609bc029548d790943c6f6aa7e04eb8a35f25" Oct 02 12:05:29 crc kubenswrapper[4766]: E1002 12:05:29.358445 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68c9637533ee30d41efa85709c1609bc029548d790943c6f6aa7e04eb8a35f25\": container with ID starting with 68c9637533ee30d41efa85709c1609bc029548d790943c6f6aa7e04eb8a35f25 not found: ID does not exist" containerID="68c9637533ee30d41efa85709c1609bc029548d790943c6f6aa7e04eb8a35f25" Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.358530 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68c9637533ee30d41efa85709c1609bc029548d790943c6f6aa7e04eb8a35f25"} err="failed to get container status \"68c9637533ee30d41efa85709c1609bc029548d790943c6f6aa7e04eb8a35f25\": rpc error: code = NotFound desc = could not find container \"68c9637533ee30d41efa85709c1609bc029548d790943c6f6aa7e04eb8a35f25\": container with ID starting with 68c9637533ee30d41efa85709c1609bc029548d790943c6f6aa7e04eb8a35f25 not found: ID does not exist" Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.358601 4766 scope.go:117] "RemoveContainer" containerID="f42cbff396454764316c78e1885ffdf8606f81ba8f74774edd031ef8946b2171" Oct 02 12:05:29 crc kubenswrapper[4766]: E1002 12:05:29.359596 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f42cbff396454764316c78e1885ffdf8606f81ba8f74774edd031ef8946b2171\": container with ID starting with f42cbff396454764316c78e1885ffdf8606f81ba8f74774edd031ef8946b2171 not found: ID does not exist" 
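containerID="f42cbff396454764316c78e1885ffdf8606f81ba8f74774edd031ef8946b2171"

Each "RemoveContainer" above is immediately answered by a NotFound error from the CRI runtime and a "DeleteContainer returned error" record at info level: another cleanup path already removed the container, so the kubelet treats the failure as harmless and moves on. Deletion is idempotent in that sense; a dependency-free sketch, with a sentinel error standing in for the CRI's gRPC NotFound status and all names chosen for illustration:

package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("container not found: ID does not exist")

var containers = map[string]bool{} // the runtime's view; the container is already gone

// removeContainer fails with errNotFound when the container no longer exists,
// just as the runtime does in the log entries above.
func removeContainer(id string) error {
	if !containers[id] {
		return fmt.Errorf("failed to get container status %q: %w", id, errNotFound)
	}
	delete(containers, id)
	return nil
}

func main() {
	err := removeContainer("example-container")
	if errors.Is(err, errNotFound) {
		// Desired state (container gone) already holds; log and continue.
		fmt.Println("DeleteContainer returned error:", err)
	}
}
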
Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.359685 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42cbff396454764316c78e1885ffdf8606f81ba8f74774edd031ef8946b2171"} err="failed to get container status \"f42cbff396454764316c78e1885ffdf8606f81ba8f74774edd031ef8946b2171\": rpc error: code = NotFound desc = could not find container \"f42cbff396454764316c78e1885ffdf8606f81ba8f74774edd031ef8946b2171\": container with ID starting with f42cbff396454764316c78e1885ffdf8606f81ba8f74774edd031ef8946b2171 not found: ID does not exist" Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.624206 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cfflg"] Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.631658 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cfflg"] Oct 02 12:05:29 crc kubenswrapper[4766]: I1002 12:05:29.892442 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41213cb3-c99b-400d-b355-eb3b80506a64" path="/var/lib/kubelet/pods/41213cb3-c99b-400d-b355-eb3b80506a64/volumes" Oct 02 12:05:41 crc kubenswrapper[4766]: I1002 12:05:41.882495 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:05:41 crc kubenswrapper[4766]: E1002 12:05:41.884198 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:05:55 crc kubenswrapper[4766]: I1002 12:05:55.886302 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:05:55 crc kubenswrapper[4766]: E1002 12:05:55.888605 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:06:08 crc kubenswrapper[4766]: I1002 12:06:08.881664 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:06:08 crc kubenswrapper[4766]: E1002 12:06:08.882472 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:06:19 crc kubenswrapper[4766]: I1002 12:06:19.881896 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:06:19 crc kubenswrapper[4766]: E1002 12:06:19.883091 4766 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:06:33 crc kubenswrapper[4766]: I1002 12:06:33.880983 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:06:33 crc kubenswrapper[4766]: E1002 12:06:33.883773 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:06:44 crc kubenswrapper[4766]: I1002 12:06:44.881330 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:06:44 crc kubenswrapper[4766]: E1002 12:06:44.882385 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:06:56 crc kubenswrapper[4766]: I1002 12:06:56.881759 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:06:56 crc kubenswrapper[4766]: E1002 12:06:56.883144 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:07:09 crc kubenswrapper[4766]: I1002 12:07:09.882396 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:07:09 crc kubenswrapper[4766]: E1002 12:07:09.883428 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:07:22 crc kubenswrapper[4766]: I1002 12:07:22.882404 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:07:22 crc kubenswrapper[4766]: E1002 12:07:22.883573 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:07:36 crc kubenswrapper[4766]: I1002 12:07:36.882069 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:07:36 crc kubenswrapper[4766]: E1002 12:07:36.883205 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:07:50 crc kubenswrapper[4766]: I1002 12:07:50.881904 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:07:50 crc kubenswrapper[4766]: E1002 12:07:50.884666 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:08:02 crc kubenswrapper[4766]: I1002 12:08:02.881961 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:08:03 crc kubenswrapper[4766]: I1002 12:08:03.627859 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"149f35bc8a77d5d6ff3ce7752d3d3ff07f43eb4b72bf1dbea39d29006fc83855"} Oct 02 12:10:09 crc kubenswrapper[4766]: I1002 12:10:09.851710 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-srn9g"] Oct 02 12:10:09 crc kubenswrapper[4766]: I1002 12:10:09.866427 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-srn9g"] Oct 02 12:10:09 crc kubenswrapper[4766]: I1002 12:10:09.892201 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd47b711-0420-4131-b61b-28b954f4fc9d" path="/var/lib/kubelet/pods/fd47b711-0420-4131-b61b-28b954f4fc9d/volumes" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.041171 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-nvzt6"] Oct 02 12:10:10 crc kubenswrapper[4766]: E1002 12:10:10.041596 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41213cb3-c99b-400d-b355-eb3b80506a64" containerName="registry-server" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.041620 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="41213cb3-c99b-400d-b355-eb3b80506a64" containerName="registry-server" Oct 02 12:10:10 crc kubenswrapper[4766]: E1002 12:10:10.041649 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41213cb3-c99b-400d-b355-eb3b80506a64" containerName="extract-utilities" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.041660 4766 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="41213cb3-c99b-400d-b355-eb3b80506a64" containerName="extract-utilities" Oct 02 12:10:10 crc kubenswrapper[4766]: E1002 12:10:10.041676 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41213cb3-c99b-400d-b355-eb3b80506a64" containerName="extract-content" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.041685 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="41213cb3-c99b-400d-b355-eb3b80506a64" containerName="extract-content" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.041889 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="41213cb3-c99b-400d-b355-eb3b80506a64" containerName="registry-server" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.042529 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nvzt6" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.045161 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.046013 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.046276 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.048036 4766 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-rlbp5" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.059467 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfwfw\" (UniqueName: \"kubernetes.io/projected/aa1c262a-8898-409c-b02b-da37f6605897-kube-api-access-nfwfw\") pod \"crc-storage-crc-nvzt6\" (UID: \"aa1c262a-8898-409c-b02b-da37f6605897\") " pod="crc-storage/crc-storage-crc-nvzt6" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.059613 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aa1c262a-8898-409c-b02b-da37f6605897-node-mnt\") pod \"crc-storage-crc-nvzt6\" (UID: \"aa1c262a-8898-409c-b02b-da37f6605897\") " pod="crc-storage/crc-storage-crc-nvzt6" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.059754 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aa1c262a-8898-409c-b02b-da37f6605897-crc-storage\") pod \"crc-storage-crc-nvzt6\" (UID: \"aa1c262a-8898-409c-b02b-da37f6605897\") " pod="crc-storage/crc-storage-crc-nvzt6" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.065044 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-nvzt6"] Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.161238 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aa1c262a-8898-409c-b02b-da37f6605897-crc-storage\") pod \"crc-storage-crc-nvzt6\" (UID: \"aa1c262a-8898-409c-b02b-da37f6605897\") " pod="crc-storage/crc-storage-crc-nvzt6" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.161787 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfwfw\" (UniqueName: \"kubernetes.io/projected/aa1c262a-8898-409c-b02b-da37f6605897-kube-api-access-nfwfw\") pod \"crc-storage-crc-nvzt6\" (UID: 
\"aa1c262a-8898-409c-b02b-da37f6605897\") " pod="crc-storage/crc-storage-crc-nvzt6" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.161845 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aa1c262a-8898-409c-b02b-da37f6605897-node-mnt\") pod \"crc-storage-crc-nvzt6\" (UID: \"aa1c262a-8898-409c-b02b-da37f6605897\") " pod="crc-storage/crc-storage-crc-nvzt6" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.161991 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aa1c262a-8898-409c-b02b-da37f6605897-crc-storage\") pod \"crc-storage-crc-nvzt6\" (UID: \"aa1c262a-8898-409c-b02b-da37f6605897\") " pod="crc-storage/crc-storage-crc-nvzt6" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.162141 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aa1c262a-8898-409c-b02b-da37f6605897-node-mnt\") pod \"crc-storage-crc-nvzt6\" (UID: \"aa1c262a-8898-409c-b02b-da37f6605897\") " pod="crc-storage/crc-storage-crc-nvzt6" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.197560 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfwfw\" (UniqueName: \"kubernetes.io/projected/aa1c262a-8898-409c-b02b-da37f6605897-kube-api-access-nfwfw\") pod \"crc-storage-crc-nvzt6\" (UID: \"aa1c262a-8898-409c-b02b-da37f6605897\") " pod="crc-storage/crc-storage-crc-nvzt6" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.363732 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nvzt6" Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.639173 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-nvzt6"] Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.646181 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:10:10 crc kubenswrapper[4766]: I1002 12:10:10.677813 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-nvzt6" event={"ID":"aa1c262a-8898-409c-b02b-da37f6605897","Type":"ContainerStarted","Data":"568e7760c1630b55a322563d5435a1dc38b3bce5475a27fb90900ad4711d1300"} Oct 02 12:10:11 crc kubenswrapper[4766]: I1002 12:10:11.688199 4766 generic.go:334] "Generic (PLEG): container finished" podID="aa1c262a-8898-409c-b02b-da37f6605897" containerID="27a4033829e97c9864d1d7d2ef5a66362db8f2c09e7a0b22c0abdc34c63ef5aa" exitCode=0 Oct 02 12:10:11 crc kubenswrapper[4766]: I1002 12:10:11.688310 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-nvzt6" event={"ID":"aa1c262a-8898-409c-b02b-da37f6605897","Type":"ContainerDied","Data":"27a4033829e97c9864d1d7d2ef5a66362db8f2c09e7a0b22c0abdc34c63ef5aa"} Oct 02 12:10:12 crc kubenswrapper[4766]: I1002 12:10:12.997619 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-nvzt6" Oct 02 12:10:13 crc kubenswrapper[4766]: I1002 12:10:13.124621 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfwfw\" (UniqueName: \"kubernetes.io/projected/aa1c262a-8898-409c-b02b-da37f6605897-kube-api-access-nfwfw\") pod \"aa1c262a-8898-409c-b02b-da37f6605897\" (UID: \"aa1c262a-8898-409c-b02b-da37f6605897\") " Oct 02 12:10:13 crc kubenswrapper[4766]: I1002 12:10:13.127834 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aa1c262a-8898-409c-b02b-da37f6605897-node-mnt\") pod \"aa1c262a-8898-409c-b02b-da37f6605897\" (UID: \"aa1c262a-8898-409c-b02b-da37f6605897\") " Oct 02 12:10:13 crc kubenswrapper[4766]: I1002 12:10:13.128009 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aa1c262a-8898-409c-b02b-da37f6605897-crc-storage\") pod \"aa1c262a-8898-409c-b02b-da37f6605897\" (UID: \"aa1c262a-8898-409c-b02b-da37f6605897\") " Oct 02 12:10:13 crc kubenswrapper[4766]: I1002 12:10:13.128004 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aa1c262a-8898-409c-b02b-da37f6605897-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "aa1c262a-8898-409c-b02b-da37f6605897" (UID: "aa1c262a-8898-409c-b02b-da37f6605897"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:10:13 crc kubenswrapper[4766]: I1002 12:10:13.129912 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa1c262a-8898-409c-b02b-da37f6605897-kube-api-access-nfwfw" (OuterVolumeSpecName: "kube-api-access-nfwfw") pod "aa1c262a-8898-409c-b02b-da37f6605897" (UID: "aa1c262a-8898-409c-b02b-da37f6605897"). InnerVolumeSpecName "kube-api-access-nfwfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:10:13 crc kubenswrapper[4766]: I1002 12:10:13.147749 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa1c262a-8898-409c-b02b-da37f6605897-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "aa1c262a-8898-409c-b02b-da37f6605897" (UID: "aa1c262a-8898-409c-b02b-da37f6605897"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:10:13 crc kubenswrapper[4766]: I1002 12:10:13.230386 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfwfw\" (UniqueName: \"kubernetes.io/projected/aa1c262a-8898-409c-b02b-da37f6605897-kube-api-access-nfwfw\") on node \"crc\" DevicePath \"\"" Oct 02 12:10:13 crc kubenswrapper[4766]: I1002 12:10:13.230427 4766 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aa1c262a-8898-409c-b02b-da37f6605897-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 02 12:10:13 crc kubenswrapper[4766]: I1002 12:10:13.230437 4766 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aa1c262a-8898-409c-b02b-da37f6605897-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 02 12:10:13 crc kubenswrapper[4766]: I1002 12:10:13.712375 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-nvzt6" event={"ID":"aa1c262a-8898-409c-b02b-da37f6605897","Type":"ContainerDied","Data":"568e7760c1630b55a322563d5435a1dc38b3bce5475a27fb90900ad4711d1300"} Oct 02 12:10:13 crc kubenswrapper[4766]: I1002 12:10:13.712461 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="568e7760c1630b55a322563d5435a1dc38b3bce5475a27fb90900ad4711d1300" Oct 02 12:10:13 crc kubenswrapper[4766]: I1002 12:10:13.712496 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-nvzt6" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.159611 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-nvzt6"] Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.164262 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-nvzt6"] Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.289357 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-94m8g"] Oct 02 12:10:15 crc kubenswrapper[4766]: E1002 12:10:15.289927 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa1c262a-8898-409c-b02b-da37f6605897" containerName="storage" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.289955 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa1c262a-8898-409c-b02b-da37f6605897" containerName="storage" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.290146 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa1c262a-8898-409c-b02b-da37f6605897" containerName="storage" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.290816 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-94m8g" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.293673 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.294338 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.296994 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.299141 4766 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-rlbp5" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.305067 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-94m8g"] Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.460245 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg5fm\" (UniqueName: \"kubernetes.io/projected/ec323e06-768e-4fca-a57c-bd73a1569660-kube-api-access-pg5fm\") pod \"crc-storage-crc-94m8g\" (UID: \"ec323e06-768e-4fca-a57c-bd73a1569660\") " pod="crc-storage/crc-storage-crc-94m8g" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.460653 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ec323e06-768e-4fca-a57c-bd73a1569660-crc-storage\") pod \"crc-storage-crc-94m8g\" (UID: \"ec323e06-768e-4fca-a57c-bd73a1569660\") " pod="crc-storage/crc-storage-crc-94m8g" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.460711 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ec323e06-768e-4fca-a57c-bd73a1569660-node-mnt\") pod \"crc-storage-crc-94m8g\" (UID: \"ec323e06-768e-4fca-a57c-bd73a1569660\") " pod="crc-storage/crc-storage-crc-94m8g" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.562203 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg5fm\" (UniqueName: \"kubernetes.io/projected/ec323e06-768e-4fca-a57c-bd73a1569660-kube-api-access-pg5fm\") pod \"crc-storage-crc-94m8g\" (UID: \"ec323e06-768e-4fca-a57c-bd73a1569660\") " pod="crc-storage/crc-storage-crc-94m8g" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.562675 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ec323e06-768e-4fca-a57c-bd73a1569660-crc-storage\") pod \"crc-storage-crc-94m8g\" (UID: \"ec323e06-768e-4fca-a57c-bd73a1569660\") " pod="crc-storage/crc-storage-crc-94m8g" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.563020 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ec323e06-768e-4fca-a57c-bd73a1569660-node-mnt\") pod \"crc-storage-crc-94m8g\" (UID: \"ec323e06-768e-4fca-a57c-bd73a1569660\") " pod="crc-storage/crc-storage-crc-94m8g" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.563625 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ec323e06-768e-4fca-a57c-bd73a1569660-node-mnt\") pod \"crc-storage-crc-94m8g\" (UID: \"ec323e06-768e-4fca-a57c-bd73a1569660\") " 
pod="crc-storage/crc-storage-crc-94m8g" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.563785 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ec323e06-768e-4fca-a57c-bd73a1569660-crc-storage\") pod \"crc-storage-crc-94m8g\" (UID: \"ec323e06-768e-4fca-a57c-bd73a1569660\") " pod="crc-storage/crc-storage-crc-94m8g" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.588652 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg5fm\" (UniqueName: \"kubernetes.io/projected/ec323e06-768e-4fca-a57c-bd73a1569660-kube-api-access-pg5fm\") pod \"crc-storage-crc-94m8g\" (UID: \"ec323e06-768e-4fca-a57c-bd73a1569660\") " pod="crc-storage/crc-storage-crc-94m8g" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.614542 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-94m8g" Oct 02 12:10:15 crc kubenswrapper[4766]: I1002 12:10:15.894296 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa1c262a-8898-409c-b02b-da37f6605897" path="/var/lib/kubelet/pods/aa1c262a-8898-409c-b02b-da37f6605897/volumes" Oct 02 12:10:16 crc kubenswrapper[4766]: I1002 12:10:16.031964 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-94m8g"] Oct 02 12:10:16 crc kubenswrapper[4766]: I1002 12:10:16.738075 4766 generic.go:334] "Generic (PLEG): container finished" podID="ec323e06-768e-4fca-a57c-bd73a1569660" containerID="be7727050c155ad2b49f6e5996097d7a9c312a789835ef7d54a225e217a2fcba" exitCode=0 Oct 02 12:10:16 crc kubenswrapper[4766]: I1002 12:10:16.738140 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-94m8g" event={"ID":"ec323e06-768e-4fca-a57c-bd73a1569660","Type":"ContainerDied","Data":"be7727050c155ad2b49f6e5996097d7a9c312a789835ef7d54a225e217a2fcba"} Oct 02 12:10:16 crc kubenswrapper[4766]: I1002 12:10:16.738493 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-94m8g" event={"ID":"ec323e06-768e-4fca-a57c-bd73a1569660","Type":"ContainerStarted","Data":"14897cf168dbebc10964fb6c56fb21b8316d6aac7d923028153052f20ed92bb9"} Oct 02 12:10:18 crc kubenswrapper[4766]: I1002 12:10:18.004842 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-94m8g" Oct 02 12:10:18 crc kubenswrapper[4766]: I1002 12:10:18.202795 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ec323e06-768e-4fca-a57c-bd73a1569660-crc-storage\") pod \"ec323e06-768e-4fca-a57c-bd73a1569660\" (UID: \"ec323e06-768e-4fca-a57c-bd73a1569660\") " Oct 02 12:10:18 crc kubenswrapper[4766]: I1002 12:10:18.202863 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ec323e06-768e-4fca-a57c-bd73a1569660-node-mnt\") pod \"ec323e06-768e-4fca-a57c-bd73a1569660\" (UID: \"ec323e06-768e-4fca-a57c-bd73a1569660\") " Oct 02 12:10:18 crc kubenswrapper[4766]: I1002 12:10:18.202967 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg5fm\" (UniqueName: \"kubernetes.io/projected/ec323e06-768e-4fca-a57c-bd73a1569660-kube-api-access-pg5fm\") pod \"ec323e06-768e-4fca-a57c-bd73a1569660\" (UID: \"ec323e06-768e-4fca-a57c-bd73a1569660\") " Oct 02 12:10:18 crc kubenswrapper[4766]: I1002 12:10:18.203531 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ec323e06-768e-4fca-a57c-bd73a1569660-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "ec323e06-768e-4fca-a57c-bd73a1569660" (UID: "ec323e06-768e-4fca-a57c-bd73a1569660"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:10:18 crc kubenswrapper[4766]: I1002 12:10:18.304762 4766 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/ec323e06-768e-4fca-a57c-bd73a1569660-node-mnt\") on node \"crc\" DevicePath \"\"" Oct 02 12:10:18 crc kubenswrapper[4766]: I1002 12:10:18.593769 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec323e06-768e-4fca-a57c-bd73a1569660-kube-api-access-pg5fm" (OuterVolumeSpecName: "kube-api-access-pg5fm") pod "ec323e06-768e-4fca-a57c-bd73a1569660" (UID: "ec323e06-768e-4fca-a57c-bd73a1569660"). InnerVolumeSpecName "kube-api-access-pg5fm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:10:18 crc kubenswrapper[4766]: I1002 12:10:18.595005 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec323e06-768e-4fca-a57c-bd73a1569660-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "ec323e06-768e-4fca-a57c-bd73a1569660" (UID: "ec323e06-768e-4fca-a57c-bd73a1569660"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:10:18 crc kubenswrapper[4766]: I1002 12:10:18.610198 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg5fm\" (UniqueName: \"kubernetes.io/projected/ec323e06-768e-4fca-a57c-bd73a1569660-kube-api-access-pg5fm\") on node \"crc\" DevicePath \"\"" Oct 02 12:10:18 crc kubenswrapper[4766]: I1002 12:10:18.610236 4766 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/ec323e06-768e-4fca-a57c-bd73a1569660-crc-storage\") on node \"crc\" DevicePath \"\"" Oct 02 12:10:18 crc kubenswrapper[4766]: I1002 12:10:18.760783 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-94m8g" event={"ID":"ec323e06-768e-4fca-a57c-bd73a1569660","Type":"ContainerDied","Data":"14897cf168dbebc10964fb6c56fb21b8316d6aac7d923028153052f20ed92bb9"} Oct 02 12:10:18 crc kubenswrapper[4766]: I1002 12:10:18.760849 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14897cf168dbebc10964fb6c56fb21b8316d6aac7d923028153052f20ed92bb9" Oct 02 12:10:18 crc kubenswrapper[4766]: I1002 12:10:18.760915 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-94m8g" Oct 02 12:10:24 crc kubenswrapper[4766]: I1002 12:10:24.432784 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:10:24 crc kubenswrapper[4766]: I1002 12:10:24.433867 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:10:34 crc kubenswrapper[4766]: I1002 12:10:34.623248 4766 scope.go:117] "RemoveContainer" containerID="edeabddb523219c5c33771eb075692170d60afac7520813de77b0e4eb035b79d" Oct 02 12:10:54 crc kubenswrapper[4766]: I1002 12:10:54.433190 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:10:54 crc kubenswrapper[4766]: I1002 12:10:54.437352 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:11:24 crc kubenswrapper[4766]: I1002 12:11:24.432715 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:11:24 crc kubenswrapper[4766]: I1002 12:11:24.433441 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" 
podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:11:24 crc kubenswrapper[4766]: I1002 12:11:24.433552 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 12:11:24 crc kubenswrapper[4766]: I1002 12:11:24.434317 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"149f35bc8a77d5d6ff3ce7752d3d3ff07f43eb4b72bf1dbea39d29006fc83855"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:11:24 crc kubenswrapper[4766]: I1002 12:11:24.434371 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://149f35bc8a77d5d6ff3ce7752d3d3ff07f43eb4b72bf1dbea39d29006fc83855" gracePeriod=600 Oct 02 12:11:25 crc kubenswrapper[4766]: I1002 12:11:25.331962 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="149f35bc8a77d5d6ff3ce7752d3d3ff07f43eb4b72bf1dbea39d29006fc83855" exitCode=0 Oct 02 12:11:25 crc kubenswrapper[4766]: I1002 12:11:25.332041 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"149f35bc8a77d5d6ff3ce7752d3d3ff07f43eb4b72bf1dbea39d29006fc83855"} Oct 02 12:11:25 crc kubenswrapper[4766]: I1002 12:11:25.332649 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465"} Oct 02 12:11:25 crc kubenswrapper[4766]: I1002 12:11:25.332678 4766 scope.go:117] "RemoveContainer" containerID="85cd12bae0005883b273646676b810bdd61ec6505d72b36c38e075df93ce2a8f" Oct 02 12:11:54 crc kubenswrapper[4766]: I1002 12:11:54.883394 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dd7zs"] Oct 02 12:11:54 crc kubenswrapper[4766]: E1002 12:11:54.884758 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec323e06-768e-4fca-a57c-bd73a1569660" containerName="storage" Oct 02 12:11:54 crc kubenswrapper[4766]: I1002 12:11:54.884777 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec323e06-768e-4fca-a57c-bd73a1569660" containerName="storage" Oct 02 12:11:54 crc kubenswrapper[4766]: I1002 12:11:54.884962 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec323e06-768e-4fca-a57c-bd73a1569660" containerName="storage" Oct 02 12:11:54 crc kubenswrapper[4766]: I1002 12:11:54.886311 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:11:54 crc kubenswrapper[4766]: I1002 12:11:54.900620 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dd7zs"] Oct 02 12:11:55 crc kubenswrapper[4766]: I1002 12:11:55.016274 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdf1d736-4fa8-4382-a758-f32a73580d2f-utilities\") pod \"community-operators-dd7zs\" (UID: \"cdf1d736-4fa8-4382-a758-f32a73580d2f\") " pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:11:55 crc kubenswrapper[4766]: I1002 12:11:55.016518 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdf1d736-4fa8-4382-a758-f32a73580d2f-catalog-content\") pod \"community-operators-dd7zs\" (UID: \"cdf1d736-4fa8-4382-a758-f32a73580d2f\") " pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:11:55 crc kubenswrapper[4766]: I1002 12:11:55.016831 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrbkn\" (UniqueName: \"kubernetes.io/projected/cdf1d736-4fa8-4382-a758-f32a73580d2f-kube-api-access-wrbkn\") pod \"community-operators-dd7zs\" (UID: \"cdf1d736-4fa8-4382-a758-f32a73580d2f\") " pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:11:55 crc kubenswrapper[4766]: I1002 12:11:55.118711 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrbkn\" (UniqueName: \"kubernetes.io/projected/cdf1d736-4fa8-4382-a758-f32a73580d2f-kube-api-access-wrbkn\") pod \"community-operators-dd7zs\" (UID: \"cdf1d736-4fa8-4382-a758-f32a73580d2f\") " pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:11:55 crc kubenswrapper[4766]: I1002 12:11:55.118841 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdf1d736-4fa8-4382-a758-f32a73580d2f-utilities\") pod \"community-operators-dd7zs\" (UID: \"cdf1d736-4fa8-4382-a758-f32a73580d2f\") " pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:11:55 crc kubenswrapper[4766]: I1002 12:11:55.118877 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdf1d736-4fa8-4382-a758-f32a73580d2f-catalog-content\") pod \"community-operators-dd7zs\" (UID: \"cdf1d736-4fa8-4382-a758-f32a73580d2f\") " pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:11:55 crc kubenswrapper[4766]: I1002 12:11:55.119332 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdf1d736-4fa8-4382-a758-f32a73580d2f-utilities\") pod \"community-operators-dd7zs\" (UID: \"cdf1d736-4fa8-4382-a758-f32a73580d2f\") " pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:11:55 crc kubenswrapper[4766]: I1002 12:11:55.119889 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdf1d736-4fa8-4382-a758-f32a73580d2f-catalog-content\") pod \"community-operators-dd7zs\" (UID: \"cdf1d736-4fa8-4382-a758-f32a73580d2f\") " pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:11:55 crc kubenswrapper[4766]: I1002 12:11:55.147048 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wrbkn\" (UniqueName: \"kubernetes.io/projected/cdf1d736-4fa8-4382-a758-f32a73580d2f-kube-api-access-wrbkn\") pod \"community-operators-dd7zs\" (UID: \"cdf1d736-4fa8-4382-a758-f32a73580d2f\") " pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:11:55 crc kubenswrapper[4766]: I1002 12:11:55.207130 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:11:55 crc kubenswrapper[4766]: I1002 12:11:55.695249 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dd7zs"] Oct 02 12:11:56 crc kubenswrapper[4766]: I1002 12:11:56.600234 4766 generic.go:334] "Generic (PLEG): container finished" podID="cdf1d736-4fa8-4382-a758-f32a73580d2f" containerID="6bca3db9da6db4fa2b1138937920334c826d06dc3c978dd55c7203f2887eeca4" exitCode=0 Oct 02 12:11:56 crc kubenswrapper[4766]: I1002 12:11:56.600351 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7zs" event={"ID":"cdf1d736-4fa8-4382-a758-f32a73580d2f","Type":"ContainerDied","Data":"6bca3db9da6db4fa2b1138937920334c826d06dc3c978dd55c7203f2887eeca4"} Oct 02 12:11:56 crc kubenswrapper[4766]: I1002 12:11:56.600582 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7zs" event={"ID":"cdf1d736-4fa8-4382-a758-f32a73580d2f","Type":"ContainerStarted","Data":"b17c3498353bfa94158be633652b8a6ebbdcf8841b9b686fc703a102592ff041"} Oct 02 12:11:58 crc kubenswrapper[4766]: I1002 12:11:58.623326 4766 generic.go:334] "Generic (PLEG): container finished" podID="cdf1d736-4fa8-4382-a758-f32a73580d2f" containerID="9417bfa72165a8b67683ecb4db36949dad4b852cfdfe5300406749df3ed380bc" exitCode=0 Oct 02 12:11:58 crc kubenswrapper[4766]: I1002 12:11:58.623405 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7zs" event={"ID":"cdf1d736-4fa8-4382-a758-f32a73580d2f","Type":"ContainerDied","Data":"9417bfa72165a8b67683ecb4db36949dad4b852cfdfe5300406749df3ed380bc"} Oct 02 12:11:59 crc kubenswrapper[4766]: I1002 12:11:59.633789 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7zs" event={"ID":"cdf1d736-4fa8-4382-a758-f32a73580d2f","Type":"ContainerStarted","Data":"d631fd787f5ca6d6c514f7d89b93ccc5df588ccf0c18bc8670c7e241f922ea85"} Oct 02 12:11:59 crc kubenswrapper[4766]: I1002 12:11:59.659758 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dd7zs" podStartSLOduration=3.184098301 podStartE2EDuration="5.659719484s" podCreationTimestamp="2025-10-02 12:11:54 +0000 UTC" firstStartedPulling="2025-10-02 12:11:56.602084043 +0000 UTC m=+4831.544954987" lastFinishedPulling="2025-10-02 12:11:59.077705226 +0000 UTC m=+4834.020576170" observedRunningTime="2025-10-02 12:11:59.656679436 +0000 UTC m=+4834.599550390" watchObservedRunningTime="2025-10-02 12:11:59.659719484 +0000 UTC m=+4834.602590448" Oct 02 12:12:05 crc kubenswrapper[4766]: I1002 12:12:05.207942 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:12:05 crc kubenswrapper[4766]: I1002 12:12:05.208680 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:12:05 crc kubenswrapper[4766]: I1002 12:12:05.254569 4766 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:12:05 crc kubenswrapper[4766]: I1002 12:12:05.719820 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:12:05 crc kubenswrapper[4766]: I1002 12:12:05.769712 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dd7zs"] Oct 02 12:12:07 crc kubenswrapper[4766]: I1002 12:12:07.689132 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dd7zs" podUID="cdf1d736-4fa8-4382-a758-f32a73580d2f" containerName="registry-server" containerID="cri-o://d631fd787f5ca6d6c514f7d89b93ccc5df588ccf0c18bc8670c7e241f922ea85" gracePeriod=2 Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.097785 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.246429 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdf1d736-4fa8-4382-a758-f32a73580d2f-utilities\") pod \"cdf1d736-4fa8-4382-a758-f32a73580d2f\" (UID: \"cdf1d736-4fa8-4382-a758-f32a73580d2f\") " Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.246649 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrbkn\" (UniqueName: \"kubernetes.io/projected/cdf1d736-4fa8-4382-a758-f32a73580d2f-kube-api-access-wrbkn\") pod \"cdf1d736-4fa8-4382-a758-f32a73580d2f\" (UID: \"cdf1d736-4fa8-4382-a758-f32a73580d2f\") " Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.246692 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdf1d736-4fa8-4382-a758-f32a73580d2f-catalog-content\") pod \"cdf1d736-4fa8-4382-a758-f32a73580d2f\" (UID: \"cdf1d736-4fa8-4382-a758-f32a73580d2f\") " Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.247794 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdf1d736-4fa8-4382-a758-f32a73580d2f-utilities" (OuterVolumeSpecName: "utilities") pod "cdf1d736-4fa8-4382-a758-f32a73580d2f" (UID: "cdf1d736-4fa8-4382-a758-f32a73580d2f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.256141 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf1d736-4fa8-4382-a758-f32a73580d2f-kube-api-access-wrbkn" (OuterVolumeSpecName: "kube-api-access-wrbkn") pod "cdf1d736-4fa8-4382-a758-f32a73580d2f" (UID: "cdf1d736-4fa8-4382-a758-f32a73580d2f"). InnerVolumeSpecName "kube-api-access-wrbkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.306024 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdf1d736-4fa8-4382-a758-f32a73580d2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cdf1d736-4fa8-4382-a758-f32a73580d2f" (UID: "cdf1d736-4fa8-4382-a758-f32a73580d2f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.348955 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrbkn\" (UniqueName: \"kubernetes.io/projected/cdf1d736-4fa8-4382-a758-f32a73580d2f-kube-api-access-wrbkn\") on node \"crc\" DevicePath \"\"" Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.349376 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cdf1d736-4fa8-4382-a758-f32a73580d2f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.349442 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cdf1d736-4fa8-4382-a758-f32a73580d2f-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.701314 4766 generic.go:334] "Generic (PLEG): container finished" podID="cdf1d736-4fa8-4382-a758-f32a73580d2f" containerID="d631fd787f5ca6d6c514f7d89b93ccc5df588ccf0c18bc8670c7e241f922ea85" exitCode=0 Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.701416 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dd7zs" Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.701447 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7zs" event={"ID":"cdf1d736-4fa8-4382-a758-f32a73580d2f","Type":"ContainerDied","Data":"d631fd787f5ca6d6c514f7d89b93ccc5df588ccf0c18bc8670c7e241f922ea85"} Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.702590 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dd7zs" event={"ID":"cdf1d736-4fa8-4382-a758-f32a73580d2f","Type":"ContainerDied","Data":"b17c3498353bfa94158be633652b8a6ebbdcf8841b9b686fc703a102592ff041"} Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.702622 4766 scope.go:117] "RemoveContainer" containerID="d631fd787f5ca6d6c514f7d89b93ccc5df588ccf0c18bc8670c7e241f922ea85" Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.728241 4766 scope.go:117] "RemoveContainer" containerID="9417bfa72165a8b67683ecb4db36949dad4b852cfdfe5300406749df3ed380bc" Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.745182 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dd7zs"] Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.751118 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dd7zs"] Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.779652 4766 scope.go:117] "RemoveContainer" containerID="6bca3db9da6db4fa2b1138937920334c826d06dc3c978dd55c7203f2887eeca4" Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.798567 4766 scope.go:117] "RemoveContainer" containerID="d631fd787f5ca6d6c514f7d89b93ccc5df588ccf0c18bc8670c7e241f922ea85" Oct 02 12:12:08 crc kubenswrapper[4766]: E1002 12:12:08.799256 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d631fd787f5ca6d6c514f7d89b93ccc5df588ccf0c18bc8670c7e241f922ea85\": container with ID starting with d631fd787f5ca6d6c514f7d89b93ccc5df588ccf0c18bc8670c7e241f922ea85 not found: ID does not exist" containerID="d631fd787f5ca6d6c514f7d89b93ccc5df588ccf0c18bc8670c7e241f922ea85" Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.799308 
4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d631fd787f5ca6d6c514f7d89b93ccc5df588ccf0c18bc8670c7e241f922ea85"} err="failed to get container status \"d631fd787f5ca6d6c514f7d89b93ccc5df588ccf0c18bc8670c7e241f922ea85\": rpc error: code = NotFound desc = could not find container \"d631fd787f5ca6d6c514f7d89b93ccc5df588ccf0c18bc8670c7e241f922ea85\": container with ID starting with d631fd787f5ca6d6c514f7d89b93ccc5df588ccf0c18bc8670c7e241f922ea85 not found: ID does not exist" Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.799347 4766 scope.go:117] "RemoveContainer" containerID="9417bfa72165a8b67683ecb4db36949dad4b852cfdfe5300406749df3ed380bc" Oct 02 12:12:08 crc kubenswrapper[4766]: E1002 12:12:08.799913 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9417bfa72165a8b67683ecb4db36949dad4b852cfdfe5300406749df3ed380bc\": container with ID starting with 9417bfa72165a8b67683ecb4db36949dad4b852cfdfe5300406749df3ed380bc not found: ID does not exist" containerID="9417bfa72165a8b67683ecb4db36949dad4b852cfdfe5300406749df3ed380bc" Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.799972 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9417bfa72165a8b67683ecb4db36949dad4b852cfdfe5300406749df3ed380bc"} err="failed to get container status \"9417bfa72165a8b67683ecb4db36949dad4b852cfdfe5300406749df3ed380bc\": rpc error: code = NotFound desc = could not find container \"9417bfa72165a8b67683ecb4db36949dad4b852cfdfe5300406749df3ed380bc\": container with ID starting with 9417bfa72165a8b67683ecb4db36949dad4b852cfdfe5300406749df3ed380bc not found: ID does not exist" Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.800019 4766 scope.go:117] "RemoveContainer" containerID="6bca3db9da6db4fa2b1138937920334c826d06dc3c978dd55c7203f2887eeca4" Oct 02 12:12:08 crc kubenswrapper[4766]: E1002 12:12:08.800421 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bca3db9da6db4fa2b1138937920334c826d06dc3c978dd55c7203f2887eeca4\": container with ID starting with 6bca3db9da6db4fa2b1138937920334c826d06dc3c978dd55c7203f2887eeca4 not found: ID does not exist" containerID="6bca3db9da6db4fa2b1138937920334c826d06dc3c978dd55c7203f2887eeca4" Oct 02 12:12:08 crc kubenswrapper[4766]: I1002 12:12:08.800450 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bca3db9da6db4fa2b1138937920334c826d06dc3c978dd55c7203f2887eeca4"} err="failed to get container status \"6bca3db9da6db4fa2b1138937920334c826d06dc3c978dd55c7203f2887eeca4\": rpc error: code = NotFound desc = could not find container \"6bca3db9da6db4fa2b1138937920334c826d06dc3c978dd55c7203f2887eeca4\": container with ID starting with 6bca3db9da6db4fa2b1138937920334c826d06dc3c978dd55c7203f2887eeca4 not found: ID does not exist" Oct 02 12:12:09 crc kubenswrapper[4766]: I1002 12:12:09.893299 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdf1d736-4fa8-4382-a758-f32a73580d2f" path="/var/lib/kubelet/pods/cdf1d736-4fa8-4382-a758-f32a73580d2f/volumes" Oct 02 12:13:22 crc kubenswrapper[4766]: I1002 12:13:22.798601 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-v6tgj"] Oct 02 12:13:22 crc kubenswrapper[4766]: E1002 12:13:22.799702 4766 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cdf1d736-4fa8-4382-a758-f32a73580d2f" containerName="extract-utilities" Oct 02 12:13:22 crc kubenswrapper[4766]: I1002 12:13:22.799716 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf1d736-4fa8-4382-a758-f32a73580d2f" containerName="extract-utilities" Oct 02 12:13:22 crc kubenswrapper[4766]: E1002 12:13:22.799749 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf1d736-4fa8-4382-a758-f32a73580d2f" containerName="registry-server" Oct 02 12:13:22 crc kubenswrapper[4766]: I1002 12:13:22.799758 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf1d736-4fa8-4382-a758-f32a73580d2f" containerName="registry-server" Oct 02 12:13:22 crc kubenswrapper[4766]: E1002 12:13:22.799770 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdf1d736-4fa8-4382-a758-f32a73580d2f" containerName="extract-content" Oct 02 12:13:22 crc kubenswrapper[4766]: I1002 12:13:22.799776 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf1d736-4fa8-4382-a758-f32a73580d2f" containerName="extract-content" Oct 02 12:13:22 crc kubenswrapper[4766]: I1002 12:13:22.799928 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdf1d736-4fa8-4382-a758-f32a73580d2f" containerName="registry-server" Oct 02 12:13:22 crc kubenswrapper[4766]: I1002 12:13:22.806294 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" Oct 02 12:13:22 crc kubenswrapper[4766]: I1002 12:13:22.809635 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 02 12:13:22 crc kubenswrapper[4766]: I1002 12:13:22.809872 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 02 12:13:22 crc kubenswrapper[4766]: I1002 12:13:22.810238 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-26rs8" Oct 02 12:13:22 crc kubenswrapper[4766]: I1002 12:13:22.815310 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 02 12:13:22 crc kubenswrapper[4766]: I1002 12:13:22.815607 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 02 12:13:22 crc kubenswrapper[4766]: I1002 12:13:22.836226 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-v6tgj"] Oct 02 12:13:22 crc kubenswrapper[4766]: I1002 12:13:22.990287 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnz5n\" (UniqueName: \"kubernetes.io/projected/c6424bd4-37c8-4466-b33b-00f7671e9421-kube-api-access-gnz5n\") pod \"dnsmasq-dns-5d7b5456f5-v6tgj\" (UID: \"c6424bd4-37c8-4466-b33b-00f7671e9421\") " pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" Oct 02 12:13:22 crc kubenswrapper[4766]: I1002 12:13:22.990369 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6424bd4-37c8-4466-b33b-00f7671e9421-config\") pod \"dnsmasq-dns-5d7b5456f5-v6tgj\" (UID: \"c6424bd4-37c8-4466-b33b-00f7671e9421\") " pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" Oct 02 12:13:22 crc kubenswrapper[4766]: I1002 12:13:22.990751 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6424bd4-37c8-4466-b33b-00f7671e9421-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-v6tgj\" (UID: 
\"c6424bd4-37c8-4466-b33b-00f7671e9421\") " pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.092976 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnz5n\" (UniqueName: \"kubernetes.io/projected/c6424bd4-37c8-4466-b33b-00f7671e9421-kube-api-access-gnz5n\") pod \"dnsmasq-dns-5d7b5456f5-v6tgj\" (UID: \"c6424bd4-37c8-4466-b33b-00f7671e9421\") " pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.093064 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6424bd4-37c8-4466-b33b-00f7671e9421-config\") pod \"dnsmasq-dns-5d7b5456f5-v6tgj\" (UID: \"c6424bd4-37c8-4466-b33b-00f7671e9421\") " pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.093141 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6424bd4-37c8-4466-b33b-00f7671e9421-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-v6tgj\" (UID: \"c6424bd4-37c8-4466-b33b-00f7671e9421\") " pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.094458 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6424bd4-37c8-4466-b33b-00f7671e9421-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-v6tgj\" (UID: \"c6424bd4-37c8-4466-b33b-00f7671e9421\") " pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.095554 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6424bd4-37c8-4466-b33b-00f7671e9421-config\") pod \"dnsmasq-dns-5d7b5456f5-v6tgj\" (UID: \"c6424bd4-37c8-4466-b33b-00f7671e9421\") " pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.111951 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-cz2f5"] Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.114362 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.126780 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnz5n\" (UniqueName: \"kubernetes.io/projected/c6424bd4-37c8-4466-b33b-00f7671e9421-kube-api-access-gnz5n\") pod \"dnsmasq-dns-5d7b5456f5-v6tgj\" (UID: \"c6424bd4-37c8-4466-b33b-00f7671e9421\") " pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.129640 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-cz2f5"] Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.140847 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.301336 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4191acc-30f7-4f6a-9812-ac63638e2663-config\") pod \"dnsmasq-dns-98ddfc8f-cz2f5\" (UID: \"a4191acc-30f7-4f6a-9812-ac63638e2663\") " pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.301984 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4191acc-30f7-4f6a-9812-ac63638e2663-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-cz2f5\" (UID: \"a4191acc-30f7-4f6a-9812-ac63638e2663\") " pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.302025 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wnts\" (UniqueName: \"kubernetes.io/projected/a4191acc-30f7-4f6a-9812-ac63638e2663-kube-api-access-4wnts\") pod \"dnsmasq-dns-98ddfc8f-cz2f5\" (UID: \"a4191acc-30f7-4f6a-9812-ac63638e2663\") " pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.413568 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4191acc-30f7-4f6a-9812-ac63638e2663-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-cz2f5\" (UID: \"a4191acc-30f7-4f6a-9812-ac63638e2663\") " pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.413654 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wnts\" (UniqueName: \"kubernetes.io/projected/a4191acc-30f7-4f6a-9812-ac63638e2663-kube-api-access-4wnts\") pod \"dnsmasq-dns-98ddfc8f-cz2f5\" (UID: \"a4191acc-30f7-4f6a-9812-ac63638e2663\") " pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.413772 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4191acc-30f7-4f6a-9812-ac63638e2663-config\") pod \"dnsmasq-dns-98ddfc8f-cz2f5\" (UID: \"a4191acc-30f7-4f6a-9812-ac63638e2663\") " pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.423642 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4191acc-30f7-4f6a-9812-ac63638e2663-config\") pod \"dnsmasq-dns-98ddfc8f-cz2f5\" (UID: \"a4191acc-30f7-4f6a-9812-ac63638e2663\") " pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.425321 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4191acc-30f7-4f6a-9812-ac63638e2663-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-cz2f5\" (UID: \"a4191acc-30f7-4f6a-9812-ac63638e2663\") " pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.503654 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wnts\" (UniqueName: \"kubernetes.io/projected/a4191acc-30f7-4f6a-9812-ac63638e2663-kube-api-access-4wnts\") pod \"dnsmasq-dns-98ddfc8f-cz2f5\" (UID: \"a4191acc-30f7-4f6a-9812-ac63638e2663\") " pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" Oct 02 12:13:23 crc 
kubenswrapper[4766]: I1002 12:13:23.521103 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.763783 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-v6tgj"] Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.962246 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.964522 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.969212 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.969427 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-t4dn7" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.969599 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.969786 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.969876 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 02 12:13:23 crc kubenswrapper[4766]: I1002 12:13:23.978941 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.070336 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-cz2f5"] Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.130294 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.130884 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.130915 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zpqs\" (UniqueName: \"kubernetes.io/projected/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-kube-api-access-6zpqs\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.130947 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.131190 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.131280 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.131313 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.131339 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.131416 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.232798 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.232912 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.232956 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.232980 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.233010 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.233047 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.233086 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.233129 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.233159 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zpqs\" (UniqueName: \"kubernetes.io/projected/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-kube-api-access-6zpqs\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.234868 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.234905 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.235463 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.235917 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-server-conf\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.239409 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc 
kubenswrapper[4766]: I1002 12:13:24.239719 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.240217 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.240267 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7a3667aa3bc7fedcb15e0761f0f8e5d7b7b40152f5d342ccea1cf24b99f9ff61/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.246110 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.255343 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zpqs\" (UniqueName: \"kubernetes.io/projected/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-kube-api-access-6zpqs\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.277139 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\") pod \"rabbitmq-server-0\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.298071 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.304295 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.307837 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.308107 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.308251 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.310874 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-nt6jm" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.311187 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.319749 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.339535 4766 generic.go:334] "Generic (PLEG): container finished" podID="c6424bd4-37c8-4466-b33b-00f7671e9421" containerID="acdd72bd48a8e9bf39fb0815ef46e4d19f3a5927f07ff46108e295d4dbaa9159" exitCode=0 Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.339697 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" event={"ID":"c6424bd4-37c8-4466-b33b-00f7671e9421","Type":"ContainerDied","Data":"acdd72bd48a8e9bf39fb0815ef46e4d19f3a5927f07ff46108e295d4dbaa9159"} Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.339738 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" event={"ID":"c6424bd4-37c8-4466-b33b-00f7671e9421","Type":"ContainerStarted","Data":"89a3254f807cff3e973c8396d772612a700b6b87179c1795eec7ddf181778cd0"} Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.344154 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" event={"ID":"a4191acc-30f7-4f6a-9812-ac63638e2663","Type":"ContainerStarted","Data":"bae803c19d5e7dff4d8ae90a6530669401e97afbb34ebc584b169113c930fcd9"} Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.344270 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" event={"ID":"a4191acc-30f7-4f6a-9812-ac63638e2663","Type":"ContainerStarted","Data":"1325b0b9db593c1409d83eb23058e4fbfafdaeb9b6f1a42fd0c603dea0deb30a"} Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.349436 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.432022 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.432119 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.443152 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.443395 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.443465 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.443487 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.443606 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.443685 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.443744 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.443783 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.443878 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt77j\" (UniqueName: \"kubernetes.io/projected/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-kube-api-access-pt77j\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.545528 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.546003 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.546051 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.546107 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt77j\" (UniqueName: \"kubernetes.io/projected/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-kube-api-access-pt77j\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.546196 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.546274 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.546315 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.546346 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.546411 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.547140 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.549397 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.550024 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.550073 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.550134 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9aa24177a2f910eebba2713fc08ade1562fcbca42d7781d2bc9462bdc31af46e/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.552673 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.559168 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.559388 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.559416 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.569705 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt77j\" (UniqueName: \"kubernetes.io/projected/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-kube-api-access-pt77j\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.590293 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\") pod \"rabbitmq-cell1-server-0\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.639183 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.712842 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 02 12:13:24 crc kubenswrapper[4766]: W1002 12:13:24.728054 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f3ea03a_d14b_4bf3_b67d_7c0f72123842.slice/crio-32360ef05f68216c99ab1279483e3a292d6abfc73bd171c1fe53a8788b371eb1 WatchSource:0}: Error finding container 32360ef05f68216c99ab1279483e3a292d6abfc73bd171c1fe53a8788b371eb1: Status 404 returned error can't find the container with id 32360ef05f68216c99ab1279483e3a292d6abfc73bd171c1fe53a8788b371eb1
Oct 02 12:13:24 crc kubenswrapper[4766]: I1002 12:13:24.896637 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 02 12:13:24 crc kubenswrapper[4766]: W1002 12:13:24.902198 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb85ab82d_faf9_4cc4_b020_4c1db1c2b0f5.slice/crio-cfe2c50a44b97c2216866317b6fbb00a3e5bc3df4a3187988fe67b479ae8b0b4 WatchSource:0}: Error finding container cfe2c50a44b97c2216866317b6fbb00a3e5bc3df4a3187988fe67b479ae8b0b4: Status 404 returned error can't find the container with id cfe2c50a44b97c2216866317b6fbb00a3e5bc3df4a3187988fe67b479ae8b0b4
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.345388 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.347064 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.355774 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5","Type":"ContainerStarted","Data":"cfe2c50a44b97c2216866317b6fbb00a3e5bc3df4a3187988fe67b479ae8b0b4"}
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.358104 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" event={"ID":"c6424bd4-37c8-4466-b33b-00f7671e9421","Type":"ContainerStarted","Data":"f764df5c2a65ba66732533e005f93097f0702267afdb63e3c7c6ba690e4cf61a"}
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.358320 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj"
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.358440 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.360180 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.360215 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-kvjz5"
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.362884 4766 generic.go:334] "Generic (PLEG): container finished" podID="a4191acc-30f7-4f6a-9812-ac63638e2663" containerID="bae803c19d5e7dff4d8ae90a6530669401e97afbb34ebc584b169113c930fcd9" exitCode=0
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.363160 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" event={"ID":"a4191acc-30f7-4f6a-9812-ac63638e2663","Type":"ContainerDied","Data":"bae803c19d5e7dff4d8ae90a6530669401e97afbb34ebc584b169113c930fcd9"}
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.368867 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3f3ea03a-d14b-4bf3-b67d-7c0f72123842","Type":"ContainerStarted","Data":"32360ef05f68216c99ab1279483e3a292d6abfc73bd171c1fe53a8788b371eb1"}
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.422289 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" podStartSLOduration=3.422262825 podStartE2EDuration="3.422262825s" podCreationTimestamp="2025-10-02 12:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:13:25.416219361 +0000 UTC m=+4920.359090305" watchObservedRunningTime="2025-10-02 12:13:25.422262825 +0000 UTC m=+4920.365133769"
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.464782 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d388c6de-eb31-4738-a537-8679908b7240-kolla-config\") pod \"memcached-0\" (UID: \"d388c6de-eb31-4738-a537-8679908b7240\") " pod="openstack/memcached-0"
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.464904 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ssww\" (UniqueName: \"kubernetes.io/projected/d388c6de-eb31-4738-a537-8679908b7240-kube-api-access-7ssww\") pod \"memcached-0\" (UID: \"d388c6de-eb31-4738-a537-8679908b7240\") " pod="openstack/memcached-0"
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.465001 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d388c6de-eb31-4738-a537-8679908b7240-config-data\") pod \"memcached-0\" (UID: \"d388c6de-eb31-4738-a537-8679908b7240\") " pod="openstack/memcached-0"
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.566846 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d388c6de-eb31-4738-a537-8679908b7240-kolla-config\") pod \"memcached-0\" (UID: \"d388c6de-eb31-4738-a537-8679908b7240\") " pod="openstack/memcached-0"
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.566947 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ssww\" (UniqueName: \"kubernetes.io/projected/d388c6de-eb31-4738-a537-8679908b7240-kube-api-access-7ssww\") pod \"memcached-0\" (UID: \"d388c6de-eb31-4738-a537-8679908b7240\") " pod="openstack/memcached-0"
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.567019 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d388c6de-eb31-4738-a537-8679908b7240-config-data\") pod \"memcached-0\" (UID: \"d388c6de-eb31-4738-a537-8679908b7240\") " pod="openstack/memcached-0"
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.568072 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d388c6de-eb31-4738-a537-8679908b7240-kolla-config\") pod \"memcached-0\" (UID: \"d388c6de-eb31-4738-a537-8679908b7240\") " pod="openstack/memcached-0"
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.568293 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d388c6de-eb31-4738-a537-8679908b7240-config-data\") pod \"memcached-0\" (UID: \"d388c6de-eb31-4738-a537-8679908b7240\") " pod="openstack/memcached-0"
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.587949 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ssww\" (UniqueName: \"kubernetes.io/projected/d388c6de-eb31-4738-a537-8679908b7240-kube-api-access-7ssww\") pod \"memcached-0\" (UID: \"d388c6de-eb31-4738-a537-8679908b7240\") " pod="openstack/memcached-0"
Oct 02 12:13:25 crc kubenswrapper[4766]: I1002 12:13:25.667969 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.278388 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Oct 02 12:13:26 crc kubenswrapper[4766]: W1002 12:13:26.286136 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd388c6de_eb31_4738_a537_8679908b7240.slice/crio-3d443c2a07d117fd68a6413cd9797a42e1c4a71c99969f73baf06382b85a9215 WatchSource:0}: Error finding container 3d443c2a07d117fd68a6413cd9797a42e1c4a71c99969f73baf06382b85a9215: Status 404 returned error can't find the container with id 3d443c2a07d117fd68a6413cd9797a42e1c4a71c99969f73baf06382b85a9215
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.379558 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d388c6de-eb31-4738-a537-8679908b7240","Type":"ContainerStarted","Data":"3d443c2a07d117fd68a6413cd9797a42e1c4a71c99969f73baf06382b85a9215"}
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.381860 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" event={"ID":"a4191acc-30f7-4f6a-9812-ac63638e2663","Type":"ContainerStarted","Data":"daa619c58cd2608c2b441a32bdd4851a1d5a1bb63565b92dab32c7d51b83beae"}
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.382166 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.383933 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3f3ea03a-d14b-4bf3-b67d-7c0f72123842","Type":"ContainerStarted","Data":"d06823d8e57ee6b56ce4c0c08e4da7c3a2e621bce04f90755ab6c0a6fea194b7"}
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.385311 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5","Type":"ContainerStarted","Data":"115bbf36ab90fc0c6970fea4016d4b7c80937d835976580efb88403813b7f352"}
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.409148 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" podStartSLOduration=3.409096211 podStartE2EDuration="3.409096211s" podCreationTimestamp="2025-10-02 12:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:13:26.404935169 +0000 UTC m=+4921.347806113" watchObservedRunningTime="2025-10-02 12:13:26.409096211 +0000 UTC m=+4921.351967175"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.724605 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.726069 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.729292 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.731365 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.731591 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-gkw77"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.731766 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.733729 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.737669 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.741758 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.749374 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.750711 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.754285 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.754871 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.754909 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-kw8j9"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.754998 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.779090 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.796326 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8rkc\" (UniqueName: \"kubernetes.io/projected/c7a337c5-3d90-4978-b2e2-1bd756a4a967-kube-api-access-s8rkc\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.796428 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a337c5-3d90-4978-b2e2-1bd756a4a967-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.796460 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48638127-8158-456c-ae7e-77d9ba95fd0b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.796499 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a337c5-3d90-4978-b2e2-1bd756a4a967-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.796546 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c7a337c5-3d90-4978-b2e2-1bd756a4a967-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.796614 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c7a337c5-3d90-4978-b2e2-1bd756a4a967-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.796648 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c7a337c5-3d90-4978-b2e2-1bd756a4a967-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.796687 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3fbe4173-d9b0-4f9d-b0a4-d541f2bc0900\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3fbe4173-d9b0-4f9d-b0a4-d541f2bc0900\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.796754 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48638127-8158-456c-ae7e-77d9ba95fd0b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.796826 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/48638127-8158-456c-ae7e-77d9ba95fd0b-secrets\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.796863 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/48638127-8158-456c-ae7e-77d9ba95fd0b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.796899 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/48638127-8158-456c-ae7e-77d9ba95fd0b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.796930 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c7a337c5-3d90-4978-b2e2-1bd756a4a967-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.796960 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhjtc\" (UniqueName: \"kubernetes.io/projected/48638127-8158-456c-ae7e-77d9ba95fd0b-kube-api-access-vhjtc\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.797001 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a337c5-3d90-4978-b2e2-1bd756a4a967-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.797031 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/48638127-8158-456c-ae7e-77d9ba95fd0b-kolla-config\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.797054 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/48638127-8158-456c-ae7e-77d9ba95fd0b-config-data-default\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.797089 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d25e7ff0-24a1-48d1-a16b-f27fcf09c4c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d25e7ff0-24a1-48d1-a16b-f27fcf09c4c5\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.898898 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a337c5-3d90-4978-b2e2-1bd756a4a967-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.899556 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48638127-8158-456c-ae7e-77d9ba95fd0b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.899608 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a337c5-3d90-4978-b2e2-1bd756a4a967-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.899634 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c7a337c5-3d90-4978-b2e2-1bd756a4a967-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.899689 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c7a337c5-3d90-4978-b2e2-1bd756a4a967-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.899724 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c7a337c5-3d90-4978-b2e2-1bd756a4a967-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.899756 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3fbe4173-d9b0-4f9d-b0a4-d541f2bc0900\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3fbe4173-d9b0-4f9d-b0a4-d541f2bc0900\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.899796 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48638127-8158-456c-ae7e-77d9ba95fd0b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.899841 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/48638127-8158-456c-ae7e-77d9ba95fd0b-secrets\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.899870 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/48638127-8158-456c-ae7e-77d9ba95fd0b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.899895 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/48638127-8158-456c-ae7e-77d9ba95fd0b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.899918 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c7a337c5-3d90-4978-b2e2-1bd756a4a967-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.899949 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhjtc\" (UniqueName: \"kubernetes.io/projected/48638127-8158-456c-ae7e-77d9ba95fd0b-kube-api-access-vhjtc\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.899976 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a337c5-3d90-4978-b2e2-1bd756a4a967-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.900005 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/48638127-8158-456c-ae7e-77d9ba95fd0b-kolla-config\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.900023 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/48638127-8158-456c-ae7e-77d9ba95fd0b-config-data-default\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.900050 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d25e7ff0-24a1-48d1-a16b-f27fcf09c4c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d25e7ff0-24a1-48d1-a16b-f27fcf09c4c5\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.900080 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8rkc\" (UniqueName: \"kubernetes.io/projected/c7a337c5-3d90-4978-b2e2-1bd756a4a967-kube-api-access-s8rkc\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.900797 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c7a337c5-3d90-4978-b2e2-1bd756a4a967-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.901166 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c7a337c5-3d90-4978-b2e2-1bd756a4a967-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.901288 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/48638127-8158-456c-ae7e-77d9ba95fd0b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.901611 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/48638127-8158-456c-ae7e-77d9ba95fd0b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.902207 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c7a337c5-3d90-4978-b2e2-1bd756a4a967-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.902373 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/48638127-8158-456c-ae7e-77d9ba95fd0b-kolla-config\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.902735 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/48638127-8158-456c-ae7e-77d9ba95fd0b-config-data-default\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.903015 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c7a337c5-3d90-4978-b2e2-1bd756a4a967-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.904265 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.904305 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d25e7ff0-24a1-48d1-a16b-f27fcf09c4c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d25e7ff0-24a1-48d1-a16b-f27fcf09c4c5\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5142c7d8dc172cc0c6972aa4f874b1a95908bc0c4f89fa1d5a1c67d6e8c9a192/globalmount\"" pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.906372 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.906438 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3fbe4173-d9b0-4f9d-b0a4-d541f2bc0900\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3fbe4173-d9b0-4f9d-b0a4-d541f2bc0900\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/040f56f8cb34b5c6eeb50890d5a4b591e3ab225f67aca5e603a0719c3af1f3dd/globalmount\"" pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.908082 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/48638127-8158-456c-ae7e-77d9ba95fd0b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.908820 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c7a337c5-3d90-4978-b2e2-1bd756a4a967-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.909213 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7a337c5-3d90-4978-b2e2-1bd756a4a967-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.911019 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/48638127-8158-456c-ae7e-77d9ba95fd0b-secrets\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.917054 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48638127-8158-456c-ae7e-77d9ba95fd0b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.920154 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/c7a337c5-3d90-4978-b2e2-1bd756a4a967-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.922055 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8rkc\" (UniqueName: \"kubernetes.io/projected/c7a337c5-3d90-4978-b2e2-1bd756a4a967-kube-api-access-s8rkc\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0"
Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.924150 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhjtc\" (UniqueName: \"kubernetes.io/projected/48638127-8158-456c-ae7e-77d9ba95fd0b-kube-api-access-vhjtc\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0"
Oct 02 12:13:26 crc
kubenswrapper[4766]: I1002 12:13:26.945267 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3fbe4173-d9b0-4f9d-b0a4-d541f2bc0900\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3fbe4173-d9b0-4f9d-b0a4-d541f2bc0900\") pod \"openstack-galera-0\" (UID: \"48638127-8158-456c-ae7e-77d9ba95fd0b\") " pod="openstack/openstack-galera-0" Oct 02 12:13:26 crc kubenswrapper[4766]: I1002 12:13:26.949442 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d25e7ff0-24a1-48d1-a16b-f27fcf09c4c5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d25e7ff0-24a1-48d1-a16b-f27fcf09c4c5\") pod \"openstack-cell1-galera-0\" (UID: \"c7a337c5-3d90-4978-b2e2-1bd756a4a967\") " pod="openstack/openstack-cell1-galera-0" Oct 02 12:13:27 crc kubenswrapper[4766]: I1002 12:13:27.048580 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 02 12:13:27 crc kubenswrapper[4766]: I1002 12:13:27.069025 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 02 12:13:27 crc kubenswrapper[4766]: I1002 12:13:27.400465 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"d388c6de-eb31-4738-a537-8679908b7240","Type":"ContainerStarted","Data":"61fc7ec3f6de69aabe0302d378b74bda5b651e285b6cbaed463c41646d66b880"} Oct 02 12:13:27 crc kubenswrapper[4766]: I1002 12:13:27.401493 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 02 12:13:27 crc kubenswrapper[4766]: I1002 12:13:27.430367 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.4303131000000002 podStartE2EDuration="2.4303131s" podCreationTimestamp="2025-10-02 12:13:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:13:27.421612711 +0000 UTC m=+4922.364483655" watchObservedRunningTime="2025-10-02 12:13:27.4303131 +0000 UTC m=+4922.373184044" Oct 02 12:13:27 crc kubenswrapper[4766]: W1002 12:13:27.550555 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48638127_8158_456c_ae7e_77d9ba95fd0b.slice/crio-d74011841625118ff055fdcfbcacbab03da752f97a598aad7db1468f2cf8b303 WatchSource:0}: Error finding container d74011841625118ff055fdcfbcacbab03da752f97a598aad7db1468f2cf8b303: Status 404 returned error can't find the container with id d74011841625118ff055fdcfbcacbab03da752f97a598aad7db1468f2cf8b303 Oct 02 12:13:27 crc kubenswrapper[4766]: I1002 12:13:27.560992 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 02 12:13:27 crc kubenswrapper[4766]: I1002 12:13:27.579946 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 02 12:13:28 crc kubenswrapper[4766]: I1002 12:13:28.409708 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"48638127-8158-456c-ae7e-77d9ba95fd0b","Type":"ContainerStarted","Data":"c032e3d140036fd39e2b88c0b0bf0953230d0b73c1e98cbe1aefb69147c5dd18"} Oct 02 12:13:28 crc kubenswrapper[4766]: I1002 12:13:28.409767 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"48638127-8158-456c-ae7e-77d9ba95fd0b","Type":"ContainerStarted","Data":"d74011841625118ff055fdcfbcacbab03da752f97a598aad7db1468f2cf8b303"} Oct 02 12:13:28 crc kubenswrapper[4766]: I1002 12:13:28.415710 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c7a337c5-3d90-4978-b2e2-1bd756a4a967","Type":"ContainerStarted","Data":"1ebb41d2a5c0228849d1555059dd5e73f9dbf048d1bd571cd5bf3e70156a20bf"} Oct 02 12:13:28 crc kubenswrapper[4766]: I1002 12:13:28.415783 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c7a337c5-3d90-4978-b2e2-1bd756a4a967","Type":"ContainerStarted","Data":"6e0d37decf701d88a8a91a46dfda72c82e6664879170a80a869437b019e2ccae"} Oct 02 12:13:31 crc kubenswrapper[4766]: I1002 12:13:31.446450 4766 generic.go:334] "Generic (PLEG): container finished" podID="48638127-8158-456c-ae7e-77d9ba95fd0b" containerID="c032e3d140036fd39e2b88c0b0bf0953230d0b73c1e98cbe1aefb69147c5dd18" exitCode=0 Oct 02 12:13:31 crc kubenswrapper[4766]: I1002 12:13:31.446619 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"48638127-8158-456c-ae7e-77d9ba95fd0b","Type":"ContainerDied","Data":"c032e3d140036fd39e2b88c0b0bf0953230d0b73c1e98cbe1aefb69147c5dd18"} Oct 02 12:13:31 crc kubenswrapper[4766]: I1002 12:13:31.450958 4766 generic.go:334] "Generic (PLEG): container finished" podID="c7a337c5-3d90-4978-b2e2-1bd756a4a967" containerID="1ebb41d2a5c0228849d1555059dd5e73f9dbf048d1bd571cd5bf3e70156a20bf" exitCode=0 Oct 02 12:13:31 crc kubenswrapper[4766]: I1002 12:13:31.451021 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c7a337c5-3d90-4978-b2e2-1bd756a4a967","Type":"ContainerDied","Data":"1ebb41d2a5c0228849d1555059dd5e73f9dbf048d1bd571cd5bf3e70156a20bf"} Oct 02 12:13:32 crc kubenswrapper[4766]: I1002 12:13:32.463205 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"48638127-8158-456c-ae7e-77d9ba95fd0b","Type":"ContainerStarted","Data":"4153750e670cc954ff133da2fac49cc308676efc3ea56b6e0e0ebb54356292c2"} Oct 02 12:13:32 crc kubenswrapper[4766]: I1002 12:13:32.467782 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c7a337c5-3d90-4978-b2e2-1bd756a4a967","Type":"ContainerStarted","Data":"9899b06e8f3645edb363d331ae96de90f792329e48f0869391027efe6a910a29"} Oct 02 12:13:32 crc kubenswrapper[4766]: I1002 12:13:32.492555 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.492529644 podStartE2EDuration="7.492529644s" podCreationTimestamp="2025-10-02 12:13:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:13:32.486649226 +0000 UTC m=+4927.429520170" watchObservedRunningTime="2025-10-02 12:13:32.492529644 +0000 UTC m=+4927.435400578" Oct 02 12:13:32 crc kubenswrapper[4766]: I1002 12:13:32.511661 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.5116410160000004 podStartE2EDuration="7.511641016s" podCreationTimestamp="2025-10-02 12:13:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:13:32.509881249 +0000 UTC 
m=+4927.452752203" watchObservedRunningTime="2025-10-02 12:13:32.511641016 +0000 UTC m=+4927.454511960" Oct 02 12:13:33 crc kubenswrapper[4766]: I1002 12:13:33.143952 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" Oct 02 12:13:33 crc kubenswrapper[4766]: I1002 12:13:33.527806 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" Oct 02 12:13:33 crc kubenswrapper[4766]: I1002 12:13:33.597594 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-v6tgj"] Oct 02 12:13:33 crc kubenswrapper[4766]: I1002 12:13:33.597921 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" podUID="c6424bd4-37c8-4466-b33b-00f7671e9421" containerName="dnsmasq-dns" containerID="cri-o://f764df5c2a65ba66732533e005f93097f0702267afdb63e3c7c6ba690e4cf61a" gracePeriod=10 Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.079774 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.230628 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6424bd4-37c8-4466-b33b-00f7671e9421-config\") pod \"c6424bd4-37c8-4466-b33b-00f7671e9421\" (UID: \"c6424bd4-37c8-4466-b33b-00f7671e9421\") " Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.230764 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6424bd4-37c8-4466-b33b-00f7671e9421-dns-svc\") pod \"c6424bd4-37c8-4466-b33b-00f7671e9421\" (UID: \"c6424bd4-37c8-4466-b33b-00f7671e9421\") " Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.231017 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnz5n\" (UniqueName: \"kubernetes.io/projected/c6424bd4-37c8-4466-b33b-00f7671e9421-kube-api-access-gnz5n\") pod \"c6424bd4-37c8-4466-b33b-00f7671e9421\" (UID: \"c6424bd4-37c8-4466-b33b-00f7671e9421\") " Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.237268 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6424bd4-37c8-4466-b33b-00f7671e9421-kube-api-access-gnz5n" (OuterVolumeSpecName: "kube-api-access-gnz5n") pod "c6424bd4-37c8-4466-b33b-00f7671e9421" (UID: "c6424bd4-37c8-4466-b33b-00f7671e9421"). InnerVolumeSpecName "kube-api-access-gnz5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.272853 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6424bd4-37c8-4466-b33b-00f7671e9421-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c6424bd4-37c8-4466-b33b-00f7671e9421" (UID: "c6424bd4-37c8-4466-b33b-00f7671e9421"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.276906 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6424bd4-37c8-4466-b33b-00f7671e9421-config" (OuterVolumeSpecName: "config") pod "c6424bd4-37c8-4466-b33b-00f7671e9421" (UID: "c6424bd4-37c8-4466-b33b-00f7671e9421"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.333306 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6424bd4-37c8-4466-b33b-00f7671e9421-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.333829 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnz5n\" (UniqueName: \"kubernetes.io/projected/c6424bd4-37c8-4466-b33b-00f7671e9421-kube-api-access-gnz5n\") on node \"crc\" DevicePath \"\"" Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.333841 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6424bd4-37c8-4466-b33b-00f7671e9421-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.487702 4766 generic.go:334] "Generic (PLEG): container finished" podID="c6424bd4-37c8-4466-b33b-00f7671e9421" containerID="f764df5c2a65ba66732533e005f93097f0702267afdb63e3c7c6ba690e4cf61a" exitCode=0 Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.487757 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" event={"ID":"c6424bd4-37c8-4466-b33b-00f7671e9421","Type":"ContainerDied","Data":"f764df5c2a65ba66732533e005f93097f0702267afdb63e3c7c6ba690e4cf61a"} Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.487795 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" event={"ID":"c6424bd4-37c8-4466-b33b-00f7671e9421","Type":"ContainerDied","Data":"89a3254f807cff3e973c8396d772612a700b6b87179c1795eec7ddf181778cd0"} Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.487814 4766 scope.go:117] "RemoveContainer" containerID="f764df5c2a65ba66732533e005f93097f0702267afdb63e3c7c6ba690e4cf61a" Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.488062 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-v6tgj" Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.530164 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-v6tgj"] Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.530740 4766 scope.go:117] "RemoveContainer" containerID="acdd72bd48a8e9bf39fb0815ef46e4d19f3a5927f07ff46108e295d4dbaa9159" Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.536467 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-v6tgj"] Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.557203 4766 scope.go:117] "RemoveContainer" containerID="f764df5c2a65ba66732533e005f93097f0702267afdb63e3c7c6ba690e4cf61a" Oct 02 12:13:34 crc kubenswrapper[4766]: E1002 12:13:34.557843 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f764df5c2a65ba66732533e005f93097f0702267afdb63e3c7c6ba690e4cf61a\": container with ID starting with f764df5c2a65ba66732533e005f93097f0702267afdb63e3c7c6ba690e4cf61a not found: ID does not exist" containerID="f764df5c2a65ba66732533e005f93097f0702267afdb63e3c7c6ba690e4cf61a" Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.557900 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f764df5c2a65ba66732533e005f93097f0702267afdb63e3c7c6ba690e4cf61a"} err="failed to get container status \"f764df5c2a65ba66732533e005f93097f0702267afdb63e3c7c6ba690e4cf61a\": rpc error: code = NotFound desc = could not find container \"f764df5c2a65ba66732533e005f93097f0702267afdb63e3c7c6ba690e4cf61a\": container with ID starting with f764df5c2a65ba66732533e005f93097f0702267afdb63e3c7c6ba690e4cf61a not found: ID does not exist" Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.557928 4766 scope.go:117] "RemoveContainer" containerID="acdd72bd48a8e9bf39fb0815ef46e4d19f3a5927f07ff46108e295d4dbaa9159" Oct 02 12:13:34 crc kubenswrapper[4766]: E1002 12:13:34.558350 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acdd72bd48a8e9bf39fb0815ef46e4d19f3a5927f07ff46108e295d4dbaa9159\": container with ID starting with acdd72bd48a8e9bf39fb0815ef46e4d19f3a5927f07ff46108e295d4dbaa9159 not found: ID does not exist" containerID="acdd72bd48a8e9bf39fb0815ef46e4d19f3a5927f07ff46108e295d4dbaa9159" Oct 02 12:13:34 crc kubenswrapper[4766]: I1002 12:13:34.558377 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acdd72bd48a8e9bf39fb0815ef46e4d19f3a5927f07ff46108e295d4dbaa9159"} err="failed to get container status \"acdd72bd48a8e9bf39fb0815ef46e4d19f3a5927f07ff46108e295d4dbaa9159\": rpc error: code = NotFound desc = could not find container \"acdd72bd48a8e9bf39fb0815ef46e4d19f3a5927f07ff46108e295d4dbaa9159\": container with ID starting with acdd72bd48a8e9bf39fb0815ef46e4d19f3a5927f07ff46108e295d4dbaa9159 not found: ID does not exist" Oct 02 12:13:35 crc kubenswrapper[4766]: I1002 12:13:35.669803 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 02 12:13:35 crc kubenswrapper[4766]: I1002 12:13:35.892699 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6424bd4-37c8-4466-b33b-00f7671e9421" path="/var/lib/kubelet/pods/c6424bd4-37c8-4466-b33b-00f7671e9421/volumes" Oct 02 12:13:37 crc kubenswrapper[4766]: I1002 12:13:37.049442 4766 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 02 12:13:37 crc kubenswrapper[4766]: I1002 12:13:37.050362 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 02 12:13:37 crc kubenswrapper[4766]: I1002 12:13:37.069528 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 02 12:13:37 crc kubenswrapper[4766]: I1002 12:13:37.069886 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 02 12:13:39 crc kubenswrapper[4766]: I1002 12:13:39.103837 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 02 12:13:39 crc kubenswrapper[4766]: I1002 12:13:39.135918 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 02 12:13:39 crc kubenswrapper[4766]: I1002 12:13:39.150547 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 02 12:13:39 crc kubenswrapper[4766]: I1002 12:13:39.205226 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 02 12:13:54 crc kubenswrapper[4766]: I1002 12:13:54.432166 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:13:54 crc kubenswrapper[4766]: I1002 12:13:54.433198 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:13:58 crc kubenswrapper[4766]: I1002 12:13:58.713961 4766 generic.go:334] "Generic (PLEG): container finished" podID="b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" containerID="115bbf36ab90fc0c6970fea4016d4b7c80937d835976580efb88403813b7f352" exitCode=0 Oct 02 12:13:58 crc kubenswrapper[4766]: I1002 12:13:58.714060 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5","Type":"ContainerDied","Data":"115bbf36ab90fc0c6970fea4016d4b7c80937d835976580efb88403813b7f352"} Oct 02 12:13:58 crc kubenswrapper[4766]: I1002 12:13:58.716630 4766 generic.go:334] "Generic (PLEG): container finished" podID="3f3ea03a-d14b-4bf3-b67d-7c0f72123842" containerID="d06823d8e57ee6b56ce4c0c08e4da7c3a2e621bce04f90755ab6c0a6fea194b7" exitCode=0 Oct 02 12:13:58 crc kubenswrapper[4766]: I1002 12:13:58.716672 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3f3ea03a-d14b-4bf3-b67d-7c0f72123842","Type":"ContainerDied","Data":"d06823d8e57ee6b56ce4c0c08e4da7c3a2e621bce04f90755ab6c0a6fea194b7"} Oct 02 12:13:59 crc kubenswrapper[4766]: I1002 12:13:59.729347 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3f3ea03a-d14b-4bf3-b67d-7c0f72123842","Type":"ContainerStarted","Data":"ab13f2737dc492f317cb9e2e05a0cf32dc39294ac5f6afcc61cd2adb6c6dbe49"} Oct 02 12:13:59 crc kubenswrapper[4766]: I1002 12:13:59.730986 4766 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 02 12:13:59 crc kubenswrapper[4766]: I1002 12:13:59.733371 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5","Type":"ContainerStarted","Data":"b835000b6011e7f7b078f402eb099dc1ee686a9cc021548fe6f7d01b2988468b"} Oct 02 12:13:59 crc kubenswrapper[4766]: I1002 12:13:59.733898 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:13:59 crc kubenswrapper[4766]: I1002 12:13:59.770719 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.770688607 podStartE2EDuration="37.770688607s" podCreationTimestamp="2025-10-02 12:13:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:13:59.766826513 +0000 UTC m=+4954.709697487" watchObservedRunningTime="2025-10-02 12:13:59.770688607 +0000 UTC m=+4954.713559561" Oct 02 12:13:59 crc kubenswrapper[4766]: I1002 12:13:59.795905 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.795879743 podStartE2EDuration="36.795879743s" podCreationTimestamp="2025-10-02 12:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:13:59.793331012 +0000 UTC m=+4954.736201996" watchObservedRunningTime="2025-10-02 12:13:59.795879743 +0000 UTC m=+4954.738750697" Oct 02 12:14:06 crc kubenswrapper[4766]: I1002 12:14:06.821775 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gl2wg"] Oct 02 12:14:06 crc kubenswrapper[4766]: E1002 12:14:06.822821 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6424bd4-37c8-4466-b33b-00f7671e9421" containerName="dnsmasq-dns" Oct 02 12:14:06 crc kubenswrapper[4766]: I1002 12:14:06.822838 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6424bd4-37c8-4466-b33b-00f7671e9421" containerName="dnsmasq-dns" Oct 02 12:14:06 crc kubenswrapper[4766]: E1002 12:14:06.822881 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6424bd4-37c8-4466-b33b-00f7671e9421" containerName="init" Oct 02 12:14:06 crc kubenswrapper[4766]: I1002 12:14:06.822887 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6424bd4-37c8-4466-b33b-00f7671e9421" containerName="init" Oct 02 12:14:06 crc kubenswrapper[4766]: I1002 12:14:06.823037 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6424bd4-37c8-4466-b33b-00f7671e9421" containerName="dnsmasq-dns" Oct 02 12:14:06 crc kubenswrapper[4766]: I1002 12:14:06.824079 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:06 crc kubenswrapper[4766]: I1002 12:14:06.842282 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gl2wg"] Oct 02 12:14:06 crc kubenswrapper[4766]: I1002 12:14:06.930147 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cda9075-c127-4525-b597-2d4d63824a24-utilities\") pod \"redhat-marketplace-gl2wg\" (UID: \"6cda9075-c127-4525-b597-2d4d63824a24\") " pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:06 crc kubenswrapper[4766]: I1002 12:14:06.930223 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cda9075-c127-4525-b597-2d4d63824a24-catalog-content\") pod \"redhat-marketplace-gl2wg\" (UID: \"6cda9075-c127-4525-b597-2d4d63824a24\") " pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:06 crc kubenswrapper[4766]: I1002 12:14:06.930256 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8knz\" (UniqueName: \"kubernetes.io/projected/6cda9075-c127-4525-b597-2d4d63824a24-kube-api-access-d8knz\") pod \"redhat-marketplace-gl2wg\" (UID: \"6cda9075-c127-4525-b597-2d4d63824a24\") " pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:07 crc kubenswrapper[4766]: I1002 12:14:07.032274 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cda9075-c127-4525-b597-2d4d63824a24-utilities\") pod \"redhat-marketplace-gl2wg\" (UID: \"6cda9075-c127-4525-b597-2d4d63824a24\") " pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:07 crc kubenswrapper[4766]: I1002 12:14:07.032350 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cda9075-c127-4525-b597-2d4d63824a24-catalog-content\") pod \"redhat-marketplace-gl2wg\" (UID: \"6cda9075-c127-4525-b597-2d4d63824a24\") " pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:07 crc kubenswrapper[4766]: I1002 12:14:07.032393 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8knz\" (UniqueName: \"kubernetes.io/projected/6cda9075-c127-4525-b597-2d4d63824a24-kube-api-access-d8knz\") pod \"redhat-marketplace-gl2wg\" (UID: \"6cda9075-c127-4525-b597-2d4d63824a24\") " pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:07 crc kubenswrapper[4766]: I1002 12:14:07.032939 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cda9075-c127-4525-b597-2d4d63824a24-utilities\") pod \"redhat-marketplace-gl2wg\" (UID: \"6cda9075-c127-4525-b597-2d4d63824a24\") " pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:07 crc kubenswrapper[4766]: I1002 12:14:07.032964 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cda9075-c127-4525-b597-2d4d63824a24-catalog-content\") pod \"redhat-marketplace-gl2wg\" (UID: \"6cda9075-c127-4525-b597-2d4d63824a24\") " pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:07 crc kubenswrapper[4766]: I1002 12:14:07.062441 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-d8knz\" (UniqueName: \"kubernetes.io/projected/6cda9075-c127-4525-b597-2d4d63824a24-kube-api-access-d8knz\") pod \"redhat-marketplace-gl2wg\" (UID: \"6cda9075-c127-4525-b597-2d4d63824a24\") " pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:07 crc kubenswrapper[4766]: I1002 12:14:07.149412 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:07 crc kubenswrapper[4766]: I1002 12:14:07.612373 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gl2wg"] Oct 02 12:14:07 crc kubenswrapper[4766]: I1002 12:14:07.800516 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gl2wg" event={"ID":"6cda9075-c127-4525-b597-2d4d63824a24","Type":"ContainerStarted","Data":"e718950b5b77c90be067a7e4d1a27e4c1361bdf9d48dbd52b705bb56ec4951d7"} Oct 02 12:14:07 crc kubenswrapper[4766]: I1002 12:14:07.801076 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gl2wg" event={"ID":"6cda9075-c127-4525-b597-2d4d63824a24","Type":"ContainerStarted","Data":"77a6cf59ccbbf2839b5ad9ab47710a97c9dc3343104a673cd2a84777566de735"} Oct 02 12:14:08 crc kubenswrapper[4766]: I1002 12:14:08.812141 4766 generic.go:334] "Generic (PLEG): container finished" podID="6cda9075-c127-4525-b597-2d4d63824a24" containerID="e718950b5b77c90be067a7e4d1a27e4c1361bdf9d48dbd52b705bb56ec4951d7" exitCode=0 Oct 02 12:14:08 crc kubenswrapper[4766]: I1002 12:14:08.812213 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gl2wg" event={"ID":"6cda9075-c127-4525-b597-2d4d63824a24","Type":"ContainerDied","Data":"e718950b5b77c90be067a7e4d1a27e4c1361bdf9d48dbd52b705bb56ec4951d7"} Oct 02 12:14:10 crc kubenswrapper[4766]: I1002 12:14:10.832968 4766 generic.go:334] "Generic (PLEG): container finished" podID="6cda9075-c127-4525-b597-2d4d63824a24" containerID="049e08125856a2e4b92228102607b0a190c2c540b4b2fcb9db858dd1d41ea0d5" exitCode=0 Oct 02 12:14:10 crc kubenswrapper[4766]: I1002 12:14:10.833105 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gl2wg" event={"ID":"6cda9075-c127-4525-b597-2d4d63824a24","Type":"ContainerDied","Data":"049e08125856a2e4b92228102607b0a190c2c540b4b2fcb9db858dd1d41ea0d5"} Oct 02 12:14:11 crc kubenswrapper[4766]: I1002 12:14:11.844231 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gl2wg" event={"ID":"6cda9075-c127-4525-b597-2d4d63824a24","Type":"ContainerStarted","Data":"d7c4d8b70ed3a642bf0ed3b1d46f772f0aecaac073ce507106445b69d6939d9d"} Oct 02 12:14:11 crc kubenswrapper[4766]: I1002 12:14:11.868818 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gl2wg" podStartSLOduration=3.3500557300000002 podStartE2EDuration="5.868793131s" podCreationTimestamp="2025-10-02 12:14:06 +0000 UTC" firstStartedPulling="2025-10-02 12:14:08.816492593 +0000 UTC m=+4963.759363537" lastFinishedPulling="2025-10-02 12:14:11.335229984 +0000 UTC m=+4966.278100938" observedRunningTime="2025-10-02 12:14:11.861949603 +0000 UTC m=+4966.804820557" watchObservedRunningTime="2025-10-02 12:14:11.868793131 +0000 UTC m=+4966.811664075" Oct 02 12:14:14 crc kubenswrapper[4766]: I1002 12:14:14.352788 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-server-0" Oct 02 12:14:14 crc kubenswrapper[4766]: I1002 12:14:14.642281 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:17 crc kubenswrapper[4766]: I1002 12:14:17.150392 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:17 crc kubenswrapper[4766]: I1002 12:14:17.150735 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:17 crc kubenswrapper[4766]: I1002 12:14:17.194877 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:17 crc kubenswrapper[4766]: I1002 12:14:17.928409 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:17 crc kubenswrapper[4766]: I1002 12:14:17.981646 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gl2wg"] Oct 02 12:14:18 crc kubenswrapper[4766]: I1002 12:14:18.983386 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-vjg7m"] Oct 02 12:14:18 crc kubenswrapper[4766]: I1002 12:14:18.990651 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" Oct 02 12:14:18 crc kubenswrapper[4766]: I1002 12:14:18.995550 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-vjg7m"] Oct 02 12:14:19 crc kubenswrapper[4766]: I1002 12:14:19.036623 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-vjg7m\" (UID: \"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" Oct 02 12:14:19 crc kubenswrapper[4766]: I1002 12:14:19.036718 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-config\") pod \"dnsmasq-dns-5b7946d7b9-vjg7m\" (UID: \"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" Oct 02 12:14:19 crc kubenswrapper[4766]: I1002 12:14:19.036819 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh6pk\" (UniqueName: \"kubernetes.io/projected/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-kube-api-access-zh6pk\") pod \"dnsmasq-dns-5b7946d7b9-vjg7m\" (UID: \"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" Oct 02 12:14:19 crc kubenswrapper[4766]: I1002 12:14:19.138409 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-vjg7m\" (UID: \"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" Oct 02 12:14:19 crc kubenswrapper[4766]: I1002 12:14:19.138483 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-config\") pod \"dnsmasq-dns-5b7946d7b9-vjg7m\" (UID: 
\"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" Oct 02 12:14:19 crc kubenswrapper[4766]: I1002 12:14:19.139594 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh6pk\" (UniqueName: \"kubernetes.io/projected/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-kube-api-access-zh6pk\") pod \"dnsmasq-dns-5b7946d7b9-vjg7m\" (UID: \"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" Oct 02 12:14:19 crc kubenswrapper[4766]: I1002 12:14:19.139633 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-vjg7m\" (UID: \"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" Oct 02 12:14:19 crc kubenswrapper[4766]: I1002 12:14:19.139648 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-config\") pod \"dnsmasq-dns-5b7946d7b9-vjg7m\" (UID: \"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" Oct 02 12:14:19 crc kubenswrapper[4766]: I1002 12:14:19.202042 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh6pk\" (UniqueName: \"kubernetes.io/projected/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-kube-api-access-zh6pk\") pod \"dnsmasq-dns-5b7946d7b9-vjg7m\" (UID: \"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9\") " pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" Oct 02 12:14:19 crc kubenswrapper[4766]: I1002 12:14:19.317358 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" Oct 02 12:14:19 crc kubenswrapper[4766]: I1002 12:14:19.649720 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 12:14:19 crc kubenswrapper[4766]: I1002 12:14:19.785433 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-vjg7m"] Oct 02 12:14:19 crc kubenswrapper[4766]: I1002 12:14:19.919367 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" event={"ID":"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9","Type":"ContainerStarted","Data":"ddfde4ed3a78cb0bd6341dbcab24cdba28a982ca7ee8efc751b8d406d7625713"} Oct 02 12:14:19 crc kubenswrapper[4766]: I1002 12:14:19.919622 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gl2wg" podUID="6cda9075-c127-4525-b597-2d4d63824a24" containerName="registry-server" containerID="cri-o://d7c4d8b70ed3a642bf0ed3b1d46f772f0aecaac073ce507106445b69d6939d9d" gracePeriod=2 Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.429863 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.441178 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.475334 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cda9075-c127-4525-b597-2d4d63824a24-catalog-content\") pod \"6cda9075-c127-4525-b597-2d4d63824a24\" (UID: \"6cda9075-c127-4525-b597-2d4d63824a24\") " Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.475473 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cda9075-c127-4525-b597-2d4d63824a24-utilities\") pod \"6cda9075-c127-4525-b597-2d4d63824a24\" (UID: \"6cda9075-c127-4525-b597-2d4d63824a24\") " Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.475736 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8knz\" (UniqueName: \"kubernetes.io/projected/6cda9075-c127-4525-b597-2d4d63824a24-kube-api-access-d8knz\") pod \"6cda9075-c127-4525-b597-2d4d63824a24\" (UID: \"6cda9075-c127-4525-b597-2d4d63824a24\") " Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.476862 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cda9075-c127-4525-b597-2d4d63824a24-utilities" (OuterVolumeSpecName: "utilities") pod "6cda9075-c127-4525-b597-2d4d63824a24" (UID: "6cda9075-c127-4525-b597-2d4d63824a24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.483904 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cda9075-c127-4525-b597-2d4d63824a24-kube-api-access-d8knz" (OuterVolumeSpecName: "kube-api-access-d8knz") pod "6cda9075-c127-4525-b597-2d4d63824a24" (UID: "6cda9075-c127-4525-b597-2d4d63824a24"). InnerVolumeSpecName "kube-api-access-d8knz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.500768 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cda9075-c127-4525-b597-2d4d63824a24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6cda9075-c127-4525-b597-2d4d63824a24" (UID: "6cda9075-c127-4525-b597-2d4d63824a24"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.577908 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8knz\" (UniqueName: \"kubernetes.io/projected/6cda9075-c127-4525-b597-2d4d63824a24-kube-api-access-d8knz\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.577953 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cda9075-c127-4525-b597-2d4d63824a24-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.577963 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cda9075-c127-4525-b597-2d4d63824a24-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.930671 4766 generic.go:334] "Generic (PLEG): container finished" podID="6cda9075-c127-4525-b597-2d4d63824a24" containerID="d7c4d8b70ed3a642bf0ed3b1d46f772f0aecaac073ce507106445b69d6939d9d" exitCode=0 Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.930732 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gl2wg" Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.930750 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gl2wg" event={"ID":"6cda9075-c127-4525-b597-2d4d63824a24","Type":"ContainerDied","Data":"d7c4d8b70ed3a642bf0ed3b1d46f772f0aecaac073ce507106445b69d6939d9d"} Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.930842 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gl2wg" event={"ID":"6cda9075-c127-4525-b597-2d4d63824a24","Type":"ContainerDied","Data":"77a6cf59ccbbf2839b5ad9ab47710a97c9dc3343104a673cd2a84777566de735"} Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.930871 4766 scope.go:117] "RemoveContainer" containerID="d7c4d8b70ed3a642bf0ed3b1d46f772f0aecaac073ce507106445b69d6939d9d" Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.933459 4766 generic.go:334] "Generic (PLEG): container finished" podID="b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9" containerID="f9e57669eda5c559501482777482036deab2a93cddb4849de48d7ba9f9d94570" exitCode=0 Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.933535 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" event={"ID":"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9","Type":"ContainerDied","Data":"f9e57669eda5c559501482777482036deab2a93cddb4849de48d7ba9f9d94570"} Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.955896 4766 scope.go:117] "RemoveContainer" containerID="049e08125856a2e4b92228102607b0a190c2c540b4b2fcb9db858dd1d41ea0d5" Oct 02 12:14:20 crc kubenswrapper[4766]: I1002 12:14:20.998248 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gl2wg"] Oct 02 12:14:21 crc kubenswrapper[4766]: I1002 12:14:21.010700 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gl2wg"] Oct 02 12:14:21 crc kubenswrapper[4766]: I1002 12:14:21.033992 4766 scope.go:117] "RemoveContainer" containerID="e718950b5b77c90be067a7e4d1a27e4c1361bdf9d48dbd52b705bb56ec4951d7" Oct 02 12:14:21 crc kubenswrapper[4766]: I1002 12:14:21.055330 4766 scope.go:117] "RemoveContainer" 
containerID="d7c4d8b70ed3a642bf0ed3b1d46f772f0aecaac073ce507106445b69d6939d9d" Oct 02 12:14:21 crc kubenswrapper[4766]: E1002 12:14:21.056270 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c4d8b70ed3a642bf0ed3b1d46f772f0aecaac073ce507106445b69d6939d9d\": container with ID starting with d7c4d8b70ed3a642bf0ed3b1d46f772f0aecaac073ce507106445b69d6939d9d not found: ID does not exist" containerID="d7c4d8b70ed3a642bf0ed3b1d46f772f0aecaac073ce507106445b69d6939d9d" Oct 02 12:14:21 crc kubenswrapper[4766]: I1002 12:14:21.056337 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c4d8b70ed3a642bf0ed3b1d46f772f0aecaac073ce507106445b69d6939d9d"} err="failed to get container status \"d7c4d8b70ed3a642bf0ed3b1d46f772f0aecaac073ce507106445b69d6939d9d\": rpc error: code = NotFound desc = could not find container \"d7c4d8b70ed3a642bf0ed3b1d46f772f0aecaac073ce507106445b69d6939d9d\": container with ID starting with d7c4d8b70ed3a642bf0ed3b1d46f772f0aecaac073ce507106445b69d6939d9d not found: ID does not exist" Oct 02 12:14:21 crc kubenswrapper[4766]: I1002 12:14:21.056384 4766 scope.go:117] "RemoveContainer" containerID="049e08125856a2e4b92228102607b0a190c2c540b4b2fcb9db858dd1d41ea0d5" Oct 02 12:14:21 crc kubenswrapper[4766]: E1002 12:14:21.057026 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"049e08125856a2e4b92228102607b0a190c2c540b4b2fcb9db858dd1d41ea0d5\": container with ID starting with 049e08125856a2e4b92228102607b0a190c2c540b4b2fcb9db858dd1d41ea0d5 not found: ID does not exist" containerID="049e08125856a2e4b92228102607b0a190c2c540b4b2fcb9db858dd1d41ea0d5" Oct 02 12:14:21 crc kubenswrapper[4766]: I1002 12:14:21.057068 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049e08125856a2e4b92228102607b0a190c2c540b4b2fcb9db858dd1d41ea0d5"} err="failed to get container status \"049e08125856a2e4b92228102607b0a190c2c540b4b2fcb9db858dd1d41ea0d5\": rpc error: code = NotFound desc = could not find container \"049e08125856a2e4b92228102607b0a190c2c540b4b2fcb9db858dd1d41ea0d5\": container with ID starting with 049e08125856a2e4b92228102607b0a190c2c540b4b2fcb9db858dd1d41ea0d5 not found: ID does not exist" Oct 02 12:14:21 crc kubenswrapper[4766]: I1002 12:14:21.057107 4766 scope.go:117] "RemoveContainer" containerID="e718950b5b77c90be067a7e4d1a27e4c1361bdf9d48dbd52b705bb56ec4951d7" Oct 02 12:14:21 crc kubenswrapper[4766]: E1002 12:14:21.057800 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e718950b5b77c90be067a7e4d1a27e4c1361bdf9d48dbd52b705bb56ec4951d7\": container with ID starting with e718950b5b77c90be067a7e4d1a27e4c1361bdf9d48dbd52b705bb56ec4951d7 not found: ID does not exist" containerID="e718950b5b77c90be067a7e4d1a27e4c1361bdf9d48dbd52b705bb56ec4951d7" Oct 02 12:14:21 crc kubenswrapper[4766]: I1002 12:14:21.057840 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e718950b5b77c90be067a7e4d1a27e4c1361bdf9d48dbd52b705bb56ec4951d7"} err="failed to get container status \"e718950b5b77c90be067a7e4d1a27e4c1361bdf9d48dbd52b705bb56ec4951d7\": rpc error: code = NotFound desc = could not find container \"e718950b5b77c90be067a7e4d1a27e4c1361bdf9d48dbd52b705bb56ec4951d7\": container with ID starting with 
e718950b5b77c90be067a7e4d1a27e4c1361bdf9d48dbd52b705bb56ec4951d7 not found: ID does not exist" Oct 02 12:14:21 crc kubenswrapper[4766]: I1002 12:14:21.579158 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="3f3ea03a-d14b-4bf3-b67d-7c0f72123842" containerName="rabbitmq" containerID="cri-o://ab13f2737dc492f317cb9e2e05a0cf32dc39294ac5f6afcc61cd2adb6c6dbe49" gracePeriod=604799 Oct 02 12:14:21 crc kubenswrapper[4766]: I1002 12:14:21.894433 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cda9075-c127-4525-b597-2d4d63824a24" path="/var/lib/kubelet/pods/6cda9075-c127-4525-b597-2d4d63824a24/volumes" Oct 02 12:14:21 crc kubenswrapper[4766]: I1002 12:14:21.944487 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" event={"ID":"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9","Type":"ContainerStarted","Data":"9b020ea37c7baf24cbf4c64d013b1d959ae4561cc3241bebff26b7ad136de938"} Oct 02 12:14:21 crc kubenswrapper[4766]: I1002 12:14:21.944685 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" Oct 02 12:14:21 crc kubenswrapper[4766]: I1002 12:14:21.969196 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" podStartSLOduration=3.969172036 podStartE2EDuration="3.969172036s" podCreationTimestamp="2025-10-02 12:14:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:14:21.960888701 +0000 UTC m=+4976.903759665" watchObservedRunningTime="2025-10-02 12:14:21.969172036 +0000 UTC m=+4976.912042980" Oct 02 12:14:22 crc kubenswrapper[4766]: I1002 12:14:22.305681 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" containerName="rabbitmq" containerID="cri-o://b835000b6011e7f7b078f402eb099dc1ee686a9cc021548fe6f7d01b2988468b" gracePeriod=604799 Oct 02 12:14:24 crc kubenswrapper[4766]: I1002 12:14:24.350971 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="3f3ea03a-d14b-4bf3-b67d-7c0f72123842" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.236:5672: connect: connection refused" Oct 02 12:14:24 crc kubenswrapper[4766]: I1002 12:14:24.432247 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:14:24 crc kubenswrapper[4766]: I1002 12:14:24.432347 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:14:24 crc kubenswrapper[4766]: I1002 12:14:24.432413 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 12:14:24 crc kubenswrapper[4766]: I1002 12:14:24.433352 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:14:24 crc kubenswrapper[4766]: I1002 12:14:24.433424 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" gracePeriod=600 Oct 02 12:14:24 crc kubenswrapper[4766]: E1002 12:14:24.568393 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:14:24 crc kubenswrapper[4766]: I1002 12:14:24.640437 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.237:5672: connect: connection refused" Oct 02 12:14:24 crc kubenswrapper[4766]: I1002 12:14:24.973233 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" exitCode=0 Oct 02 12:14:24 crc kubenswrapper[4766]: I1002 12:14:24.973304 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465"} Oct 02 12:14:24 crc kubenswrapper[4766]: I1002 12:14:24.974666 4766 scope.go:117] "RemoveContainer" containerID="149f35bc8a77d5d6ff3ce7752d3d3ff07f43eb4b72bf1dbea39d29006fc83855" Oct 02 12:14:24 crc kubenswrapper[4766]: I1002 12:14:24.975638 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:14:24 crc kubenswrapper[4766]: E1002 12:14:24.975981 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.006789 4766 generic.go:334] "Generic (PLEG): container finished" podID="3f3ea03a-d14b-4bf3-b67d-7c0f72123842" containerID="ab13f2737dc492f317cb9e2e05a0cf32dc39294ac5f6afcc61cd2adb6c6dbe49" exitCode=0 Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.006910 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3f3ea03a-d14b-4bf3-b67d-7c0f72123842","Type":"ContainerDied","Data":"ab13f2737dc492f317cb9e2e05a0cf32dc39294ac5f6afcc61cd2adb6c6dbe49"} Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.145033 4766 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.307310 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-erlang-cookie-secret\") pod \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.307399 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-confd\") pod \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.307449 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-erlang-cookie\") pod \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.307490 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-plugins-conf\") pod \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.307689 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\") pod \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.307734 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-plugins\") pod \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.307797 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-server-conf\") pod \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.307838 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-pod-info\") pod \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.307886 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zpqs\" (UniqueName: \"kubernetes.io/projected/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-kube-api-access-6zpqs\") pod \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\" (UID: \"3f3ea03a-d14b-4bf3-b67d-7c0f72123842\") " Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.308159 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-plugins" 
(OuterVolumeSpecName: "rabbitmq-plugins") pod "3f3ea03a-d14b-4bf3-b67d-7c0f72123842" (UID: "3f3ea03a-d14b-4bf3-b67d-7c0f72123842"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.308200 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3f3ea03a-d14b-4bf3-b67d-7c0f72123842" (UID: "3f3ea03a-d14b-4bf3-b67d-7c0f72123842"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.308221 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3f3ea03a-d14b-4bf3-b67d-7c0f72123842" (UID: "3f3ea03a-d14b-4bf3-b67d-7c0f72123842"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.308242 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.313270 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-pod-info" (OuterVolumeSpecName: "pod-info") pod "3f3ea03a-d14b-4bf3-b67d-7c0f72123842" (UID: "3f3ea03a-d14b-4bf3-b67d-7c0f72123842"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.313341 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-kube-api-access-6zpqs" (OuterVolumeSpecName: "kube-api-access-6zpqs") pod "3f3ea03a-d14b-4bf3-b67d-7c0f72123842" (UID: "3f3ea03a-d14b-4bf3-b67d-7c0f72123842"). InnerVolumeSpecName "kube-api-access-6zpqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.314726 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3f3ea03a-d14b-4bf3-b67d-7c0f72123842" (UID: "3f3ea03a-d14b-4bf3-b67d-7c0f72123842"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.327180 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b" (OuterVolumeSpecName: "persistence") pod "3f3ea03a-d14b-4bf3-b67d-7c0f72123842" (UID: "3f3ea03a-d14b-4bf3-b67d-7c0f72123842"). InnerVolumeSpecName "pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.344033 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-server-conf" (OuterVolumeSpecName: "server-conf") pod "3f3ea03a-d14b-4bf3-b67d-7c0f72123842" (UID: "3f3ea03a-d14b-4bf3-b67d-7c0f72123842"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.401181 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3f3ea03a-d14b-4bf3-b67d-7c0f72123842" (UID: "3f3ea03a-d14b-4bf3-b67d-7c0f72123842"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.409346 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zpqs\" (UniqueName: \"kubernetes.io/projected/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-kube-api-access-6zpqs\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.409383 4766 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.409395 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.409408 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.409422 4766 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.409470 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\") on node \"crc\" " Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.409486 4766 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-server-conf\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.409498 4766 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f3ea03a-d14b-4bf3-b67d-7c0f72123842-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.428107 4766 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.428301 4766 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b") on node "crc" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.513061 4766 reconciler_common.go:293] "Volume detached for volume \"pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:28 crc kubenswrapper[4766]: I1002 12:14:28.886346 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.015815 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"3f3ea03a-d14b-4bf3-b67d-7c0f72123842","Type":"ContainerDied","Data":"32360ef05f68216c99ab1279483e3a292d6abfc73bd171c1fe53a8788b371eb1"} Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.015871 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.016122 4766 scope.go:117] "RemoveContainer" containerID="ab13f2737dc492f317cb9e2e05a0cf32dc39294ac5f6afcc61cd2adb6c6dbe49" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.018293 4766 generic.go:334] "Generic (PLEG): container finished" podID="b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" containerID="b835000b6011e7f7b078f402eb099dc1ee686a9cc021548fe6f7d01b2988468b" exitCode=0 Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.018332 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5","Type":"ContainerDied","Data":"b835000b6011e7f7b078f402eb099dc1ee686a9cc021548fe6f7d01b2988468b"} Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.018359 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5","Type":"ContainerDied","Data":"cfe2c50a44b97c2216866317b6fbb00a3e5bc3df4a3187988fe67b479ae8b0b4"} Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.018415 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.021755 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt77j\" (UniqueName: \"kubernetes.io/projected/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-kube-api-access-pt77j\") pod \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.021798 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-plugins\") pod \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.021825 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-pod-info\") pod \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.021862 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-server-conf\") pod \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.021951 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\") pod \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.021975 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-confd\") pod \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.022038 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-erlang-cookie\") pod \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.022085 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-plugins-conf\") pod \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.022148 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-erlang-cookie-secret\") pod \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\" (UID: \"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5\") " Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.022330 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod 
"b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" (UID: "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.022828 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.025935 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" (UID: "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.026319 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-kube-api-access-pt77j" (OuterVolumeSpecName: "kube-api-access-pt77j") pod "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" (UID: "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5"). InnerVolumeSpecName "kube-api-access-pt77j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.027072 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-pod-info" (OuterVolumeSpecName: "pod-info") pod "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" (UID: "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.029334 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" (UID: "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.039171 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" (UID: "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.040212 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2" (OuterVolumeSpecName: "persistence") pod "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" (UID: "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5"). InnerVolumeSpecName "pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.044719 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-server-conf" (OuterVolumeSpecName: "server-conf") pod "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" (UID: "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.106853 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" (UID: "b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.121149 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.124056 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt77j\" (UniqueName: \"kubernetes.io/projected/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-kube-api-access-pt77j\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.124194 4766 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-pod-info\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.124259 4766 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-server-conf\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.124355 4766 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\") on node \"crc\" " Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.124428 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.124487 4766 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.124582 4766 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.124648 4766 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.127376 4766 scope.go:117] "RemoveContainer" containerID="d06823d8e57ee6b56ce4c0c08e4da7c3a2e621bce04f90755ab6c0a6fea194b7" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.130909 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.151193 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 12:14:29 crc kubenswrapper[4766]: E1002 12:14:29.151558 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f3ea03a-d14b-4bf3-b67d-7c0f72123842" containerName="rabbitmq" Oct 02 
12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.151583 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f3ea03a-d14b-4bf3-b67d-7c0f72123842" containerName="rabbitmq" Oct 02 12:14:29 crc kubenswrapper[4766]: E1002 12:14:29.151598 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cda9075-c127-4525-b597-2d4d63824a24" containerName="extract-content" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.151606 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cda9075-c127-4525-b597-2d4d63824a24" containerName="extract-content" Oct 02 12:14:29 crc kubenswrapper[4766]: E1002 12:14:29.151623 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f3ea03a-d14b-4bf3-b67d-7c0f72123842" containerName="setup-container" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.151651 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f3ea03a-d14b-4bf3-b67d-7c0f72123842" containerName="setup-container" Oct 02 12:14:29 crc kubenswrapper[4766]: E1002 12:14:29.151673 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" containerName="rabbitmq" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.151681 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" containerName="rabbitmq" Oct 02 12:14:29 crc kubenswrapper[4766]: E1002 12:14:29.151694 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cda9075-c127-4525-b597-2d4d63824a24" containerName="extract-utilities" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.151701 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cda9075-c127-4525-b597-2d4d63824a24" containerName="extract-utilities" Oct 02 12:14:29 crc kubenswrapper[4766]: E1002 12:14:29.151719 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cda9075-c127-4525-b597-2d4d63824a24" containerName="registry-server" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.155889 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cda9075-c127-4525-b597-2d4d63824a24" containerName="registry-server" Oct 02 12:14:29 crc kubenswrapper[4766]: E1002 12:14:29.155911 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" containerName="setup-container" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.155917 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" containerName="setup-container" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.156159 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f3ea03a-d14b-4bf3-b67d-7c0f72123842" containerName="rabbitmq" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.156175 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cda9075-c127-4525-b597-2d4d63824a24" containerName="registry-server" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.156184 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" containerName="rabbitmq" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.157462 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.157856 4766 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.158747 4766 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2") on node "crc" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.159193 4766 scope.go:117] "RemoveContainer" containerID="b835000b6011e7f7b078f402eb099dc1ee686a9cc021548fe6f7d01b2988468b" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.162413 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.162576 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.162760 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.163347 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.163379 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-t4dn7" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.167321 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.192702 4766 scope.go:117] "RemoveContainer" containerID="115bbf36ab90fc0c6970fea4016d4b7c80937d835976580efb88403813b7f352" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.212393 4766 scope.go:117] "RemoveContainer" containerID="b835000b6011e7f7b078f402eb099dc1ee686a9cc021548fe6f7d01b2988468b" Oct 02 12:14:29 crc kubenswrapper[4766]: E1002 12:14:29.212891 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b835000b6011e7f7b078f402eb099dc1ee686a9cc021548fe6f7d01b2988468b\": container with ID starting with b835000b6011e7f7b078f402eb099dc1ee686a9cc021548fe6f7d01b2988468b not found: ID does not exist" containerID="b835000b6011e7f7b078f402eb099dc1ee686a9cc021548fe6f7d01b2988468b" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.212943 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b835000b6011e7f7b078f402eb099dc1ee686a9cc021548fe6f7d01b2988468b"} err="failed to get container status \"b835000b6011e7f7b078f402eb099dc1ee686a9cc021548fe6f7d01b2988468b\": rpc error: code = NotFound desc = could not find container \"b835000b6011e7f7b078f402eb099dc1ee686a9cc021548fe6f7d01b2988468b\": container with ID starting with b835000b6011e7f7b078f402eb099dc1ee686a9cc021548fe6f7d01b2988468b not found: ID does not exist" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.212966 4766 scope.go:117] "RemoveContainer" containerID="115bbf36ab90fc0c6970fea4016d4b7c80937d835976580efb88403813b7f352" Oct 02 12:14:29 crc kubenswrapper[4766]: E1002 12:14:29.213362 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"115bbf36ab90fc0c6970fea4016d4b7c80937d835976580efb88403813b7f352\": container with ID starting with 115bbf36ab90fc0c6970fea4016d4b7c80937d835976580efb88403813b7f352 not found: ID does not exist" containerID="115bbf36ab90fc0c6970fea4016d4b7c80937d835976580efb88403813b7f352" Oct 02 12:14:29 crc 
kubenswrapper[4766]: I1002 12:14:29.213384 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"115bbf36ab90fc0c6970fea4016d4b7c80937d835976580efb88403813b7f352"} err="failed to get container status \"115bbf36ab90fc0c6970fea4016d4b7c80937d835976580efb88403813b7f352\": rpc error: code = NotFound desc = could not find container \"115bbf36ab90fc0c6970fea4016d4b7c80937d835976580efb88403813b7f352\": container with ID starting with 115bbf36ab90fc0c6970fea4016d4b7c80937d835976580efb88403813b7f352 not found: ID does not exist" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.225944 4766 reconciler_common.go:293] "Volume detached for volume \"pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.318750 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.327644 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35a7d34a-27b2-496f-aa63-b04439becb52-server-conf\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.327700 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35a7d34a-27b2-496f-aa63-b04439becb52-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.327732 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35a7d34a-27b2-496f-aa63-b04439becb52-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.327806 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.327837 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35a7d34a-27b2-496f-aa63-b04439becb52-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.327996 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35a7d34a-27b2-496f-aa63-b04439becb52-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.328046 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35a7d34a-27b2-496f-aa63-b04439becb52-pod-info\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.328085 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35a7d34a-27b2-496f-aa63-b04439becb52-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.328272 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tslw9\" (UniqueName: \"kubernetes.io/projected/35a7d34a-27b2-496f-aa63-b04439becb52-kube-api-access-tslw9\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.358197 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.372631 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.408609 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-cz2f5"] Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.409027 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" podUID="a4191acc-30f7-4f6a-9812-ac63638e2663" containerName="dnsmasq-dns" containerID="cri-o://daa619c58cd2608c2b441a32bdd4851a1d5a1bb63565b92dab32c7d51b83beae" gracePeriod=10 Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.419174 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.421087 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.425012 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.425796 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.425896 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.426085 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.426272 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.427261 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-nt6jm" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.429940 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35a7d34a-27b2-496f-aa63-b04439becb52-server-conf\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.429980 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35a7d34a-27b2-496f-aa63-b04439becb52-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.430002 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35a7d34a-27b2-496f-aa63-b04439becb52-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.430045 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.430067 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35a7d34a-27b2-496f-aa63-b04439becb52-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.430089 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35a7d34a-27b2-496f-aa63-b04439becb52-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.430106 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/35a7d34a-27b2-496f-aa63-b04439becb52-pod-info\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.430126 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35a7d34a-27b2-496f-aa63-b04439becb52-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.430235 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tslw9\" (UniqueName: \"kubernetes.io/projected/35a7d34a-27b2-496f-aa63-b04439becb52-kube-api-access-tslw9\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.431594 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35a7d34a-27b2-496f-aa63-b04439becb52-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.431643 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35a7d34a-27b2-496f-aa63-b04439becb52-server-conf\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.432459 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35a7d34a-27b2-496f-aa63-b04439becb52-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.432535 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35a7d34a-27b2-496f-aa63-b04439becb52-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.437729 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35a7d34a-27b2-496f-aa63-b04439becb52-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.437729 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35a7d34a-27b2-496f-aa63-b04439becb52-pod-info\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.440270 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.440316 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7a3667aa3bc7fedcb15e0761f0f8e5d7b7b40152f5d342ccea1cf24b99f9ff61/globalmount\"" pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.440376 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35a7d34a-27b2-496f-aa63-b04439becb52-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.458645 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tslw9\" (UniqueName: \"kubernetes.io/projected/35a7d34a-27b2-496f-aa63-b04439becb52-kube-api-access-tslw9\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.487620 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0308db3-8ede-4978-9927-6a9cee4f1d9b\") pod \"rabbitmq-server-0\" (UID: \"35a7d34a-27b2-496f-aa63-b04439becb52\") " pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.532099 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f39320fe-abbc-4c64-8b86-1b32a7924017-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.532158 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f39320fe-abbc-4c64-8b86-1b32a7924017-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.532288 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f39320fe-abbc-4c64-8b86-1b32a7924017-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.532378 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lr7m\" (UniqueName: \"kubernetes.io/projected/f39320fe-abbc-4c64-8b86-1b32a7924017-kube-api-access-6lr7m\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.532437 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/f39320fe-abbc-4c64-8b86-1b32a7924017-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.532474 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f39320fe-abbc-4c64-8b86-1b32a7924017-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.532566 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f39320fe-abbc-4c64-8b86-1b32a7924017-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.532664 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.532703 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f39320fe-abbc-4c64-8b86-1b32a7924017-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.634871 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f39320fe-abbc-4c64-8b86-1b32a7924017-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.634951 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f39320fe-abbc-4c64-8b86-1b32a7924017-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.634990 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f39320fe-abbc-4c64-8b86-1b32a7924017-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.635040 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lr7m\" (UniqueName: \"kubernetes.io/projected/f39320fe-abbc-4c64-8b86-1b32a7924017-kube-api-access-6lr7m\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.635103 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/f39320fe-abbc-4c64-8b86-1b32a7924017-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.635147 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f39320fe-abbc-4c64-8b86-1b32a7924017-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.635194 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f39320fe-abbc-4c64-8b86-1b32a7924017-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.635244 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.635281 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f39320fe-abbc-4c64-8b86-1b32a7924017-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.636840 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f39320fe-abbc-4c64-8b86-1b32a7924017-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.637179 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f39320fe-abbc-4c64-8b86-1b32a7924017-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.637459 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f39320fe-abbc-4c64-8b86-1b32a7924017-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.638099 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f39320fe-abbc-4c64-8b86-1b32a7924017-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.639517 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f39320fe-abbc-4c64-8b86-1b32a7924017-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.646168 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f39320fe-abbc-4c64-8b86-1b32a7924017-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.646206 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f39320fe-abbc-4c64-8b86-1b32a7924017-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.646748 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.646835 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9aa24177a2f910eebba2713fc08ade1562fcbca42d7781d2bc9462bdc31af46e/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.663667 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lr7m\" (UniqueName: \"kubernetes.io/projected/f39320fe-abbc-4c64-8b86-1b32a7924017-kube-api-access-6lr7m\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.705564 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a0d18ab4-ac3d-4b31-8135-bed73cf0ffb2\") pod \"rabbitmq-cell1-server-0\" (UID: \"f39320fe-abbc-4c64-8b86-1b32a7924017\") " pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.764413 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.785859 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.801943 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.861293 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wnts\" (UniqueName: \"kubernetes.io/projected/a4191acc-30f7-4f6a-9812-ac63638e2663-kube-api-access-4wnts\") pod \"a4191acc-30f7-4f6a-9812-ac63638e2663\" (UID: \"a4191acc-30f7-4f6a-9812-ac63638e2663\") " Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.861359 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4191acc-30f7-4f6a-9812-ac63638e2663-config\") pod \"a4191acc-30f7-4f6a-9812-ac63638e2663\" (UID: \"a4191acc-30f7-4f6a-9812-ac63638e2663\") " Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.861447 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4191acc-30f7-4f6a-9812-ac63638e2663-dns-svc\") pod \"a4191acc-30f7-4f6a-9812-ac63638e2663\" (UID: \"a4191acc-30f7-4f6a-9812-ac63638e2663\") " Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.864896 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4191acc-30f7-4f6a-9812-ac63638e2663-kube-api-access-4wnts" (OuterVolumeSpecName: "kube-api-access-4wnts") pod "a4191acc-30f7-4f6a-9812-ac63638e2663" (UID: "a4191acc-30f7-4f6a-9812-ac63638e2663"). InnerVolumeSpecName "kube-api-access-4wnts". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.902550 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f3ea03a-d14b-4bf3-b67d-7c0f72123842" path="/var/lib/kubelet/pods/3f3ea03a-d14b-4bf3-b67d-7c0f72123842/volumes" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.903281 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5" path="/var/lib/kubelet/pods/b85ab82d-faf9-4cc4-b020-4c1db1c2b0f5/volumes" Oct 02 12:14:29 crc kubenswrapper[4766]: E1002 12:14:29.903893 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a4191acc-30f7-4f6a-9812-ac63638e2663-dns-svc podName:a4191acc-30f7-4f6a-9812-ac63638e2663 nodeName:}" failed. No retries permitted until 2025-10-02 12:14:30.403873291 +0000 UTC m=+4985.346744225 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/a4191acc-30f7-4f6a-9812-ac63638e2663-dns-svc") pod "a4191acc-30f7-4f6a-9812-ac63638e2663" (UID: "a4191acc-30f7-4f6a-9812-ac63638e2663") : error deleting /var/lib/kubelet/pods/a4191acc-30f7-4f6a-9812-ac63638e2663/volume-subpaths: remove /var/lib/kubelet/pods/a4191acc-30f7-4f6a-9812-ac63638e2663/volume-subpaths: no such file or directory Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.904832 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4191acc-30f7-4f6a-9812-ac63638e2663-config" (OuterVolumeSpecName: "config") pod "a4191acc-30f7-4f6a-9812-ac63638e2663" (UID: "a4191acc-30f7-4f6a-9812-ac63638e2663"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.963891 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wnts\" (UniqueName: \"kubernetes.io/projected/a4191acc-30f7-4f6a-9812-ac63638e2663-kube-api-access-4wnts\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:29 crc kubenswrapper[4766]: I1002 12:14:29.964252 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4191acc-30f7-4f6a-9812-ac63638e2663-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.026940 4766 generic.go:334] "Generic (PLEG): container finished" podID="a4191acc-30f7-4f6a-9812-ac63638e2663" containerID="daa619c58cd2608c2b441a32bdd4851a1d5a1bb63565b92dab32c7d51b83beae" exitCode=0 Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.026991 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.027000 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" event={"ID":"a4191acc-30f7-4f6a-9812-ac63638e2663","Type":"ContainerDied","Data":"daa619c58cd2608c2b441a32bdd4851a1d5a1bb63565b92dab32c7d51b83beae"} Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.027116 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-cz2f5" event={"ID":"a4191acc-30f7-4f6a-9812-ac63638e2663","Type":"ContainerDied","Data":"1325b0b9db593c1409d83eb23058e4fbfafdaeb9b6f1a42fd0c603dea0deb30a"} Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.027144 4766 scope.go:117] "RemoveContainer" containerID="daa619c58cd2608c2b441a32bdd4851a1d5a1bb63565b92dab32c7d51b83beae" Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.047430 4766 scope.go:117] "RemoveContainer" containerID="bae803c19d5e7dff4d8ae90a6530669401e97afbb34ebc584b169113c930fcd9" Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.080196 4766 scope.go:117] "RemoveContainer" containerID="daa619c58cd2608c2b441a32bdd4851a1d5a1bb63565b92dab32c7d51b83beae" Oct 02 12:14:30 crc kubenswrapper[4766]: E1002 12:14:30.081335 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daa619c58cd2608c2b441a32bdd4851a1d5a1bb63565b92dab32c7d51b83beae\": container with ID starting with daa619c58cd2608c2b441a32bdd4851a1d5a1bb63565b92dab32c7d51b83beae not found: ID does not exist" containerID="daa619c58cd2608c2b441a32bdd4851a1d5a1bb63565b92dab32c7d51b83beae" Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.081367 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daa619c58cd2608c2b441a32bdd4851a1d5a1bb63565b92dab32c7d51b83beae"} err="failed to get container status \"daa619c58cd2608c2b441a32bdd4851a1d5a1bb63565b92dab32c7d51b83beae\": rpc error: code = NotFound desc = could not find container \"daa619c58cd2608c2b441a32bdd4851a1d5a1bb63565b92dab32c7d51b83beae\": container with ID starting with daa619c58cd2608c2b441a32bdd4851a1d5a1bb63565b92dab32c7d51b83beae not found: ID does not exist" Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.081392 4766 scope.go:117] "RemoveContainer" containerID="bae803c19d5e7dff4d8ae90a6530669401e97afbb34ebc584b169113c930fcd9" Oct 02 12:14:30 crc kubenswrapper[4766]: E1002 12:14:30.081799 4766 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"bae803c19d5e7dff4d8ae90a6530669401e97afbb34ebc584b169113c930fcd9\": container with ID starting with bae803c19d5e7dff4d8ae90a6530669401e97afbb34ebc584b169113c930fcd9 not found: ID does not exist" containerID="bae803c19d5e7dff4d8ae90a6530669401e97afbb34ebc584b169113c930fcd9" Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.081822 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bae803c19d5e7dff4d8ae90a6530669401e97afbb34ebc584b169113c930fcd9"} err="failed to get container status \"bae803c19d5e7dff4d8ae90a6530669401e97afbb34ebc584b169113c930fcd9\": rpc error: code = NotFound desc = could not find container \"bae803c19d5e7dff4d8ae90a6530669401e97afbb34ebc584b169113c930fcd9\": container with ID starting with bae803c19d5e7dff4d8ae90a6530669401e97afbb34ebc584b169113c930fcd9 not found: ID does not exist" Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.274136 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 02 12:14:30 crc kubenswrapper[4766]: W1002 12:14:30.278876 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf39320fe_abbc_4c64_8b86_1b32a7924017.slice/crio-0bb7cd1c8acc0ce6bb0f2a5ab0efa8e50447102d14f0c0706647b91c0d576ad9 WatchSource:0}: Error finding container 0bb7cd1c8acc0ce6bb0f2a5ab0efa8e50447102d14f0c0706647b91c0d576ad9: Status 404 returned error can't find the container with id 0bb7cd1c8acc0ce6bb0f2a5ab0efa8e50447102d14f0c0706647b91c0d576ad9 Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.334898 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.474042 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4191acc-30f7-4f6a-9812-ac63638e2663-dns-svc\") pod \"a4191acc-30f7-4f6a-9812-ac63638e2663\" (UID: \"a4191acc-30f7-4f6a-9812-ac63638e2663\") " Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.474641 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4191acc-30f7-4f6a-9812-ac63638e2663-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a4191acc-30f7-4f6a-9812-ac63638e2663" (UID: "a4191acc-30f7-4f6a-9812-ac63638e2663"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.575689 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a4191acc-30f7-4f6a-9812-ac63638e2663-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.657336 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-cz2f5"] Oct 02 12:14:30 crc kubenswrapper[4766]: I1002 12:14:30.662429 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-cz2f5"] Oct 02 12:14:31 crc kubenswrapper[4766]: I1002 12:14:31.041394 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f39320fe-abbc-4c64-8b86-1b32a7924017","Type":"ContainerStarted","Data":"0bb7cd1c8acc0ce6bb0f2a5ab0efa8e50447102d14f0c0706647b91c0d576ad9"} Oct 02 12:14:31 crc kubenswrapper[4766]: I1002 12:14:31.045853 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35a7d34a-27b2-496f-aa63-b04439becb52","Type":"ContainerStarted","Data":"11ea1b31d26a6f9f98885843b0f0f138f107ba900d95932bcdba93ccbade0f01"} Oct 02 12:14:31 crc kubenswrapper[4766]: I1002 12:14:31.891051 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4191acc-30f7-4f6a-9812-ac63638e2663" path="/var/lib/kubelet/pods/a4191acc-30f7-4f6a-9812-ac63638e2663/volumes" Oct 02 12:14:32 crc kubenswrapper[4766]: I1002 12:14:32.055753 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35a7d34a-27b2-496f-aa63-b04439becb52","Type":"ContainerStarted","Data":"9b47d5b9fcf0414486c4b489e98341cab3d9d29dc8eb31012ce873bfd690b687"} Oct 02 12:14:32 crc kubenswrapper[4766]: I1002 12:14:32.058050 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f39320fe-abbc-4c64-8b86-1b32a7924017","Type":"ContainerStarted","Data":"4159d30e904b11fdff6a6df92e95affeef7bcaec40141dae8b7baba7dbe20834"} Oct 02 12:14:37 crc kubenswrapper[4766]: I1002 12:14:37.828601 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qczhj"] Oct 02 12:14:37 crc kubenswrapper[4766]: E1002 12:14:37.829861 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4191acc-30f7-4f6a-9812-ac63638e2663" containerName="init" Oct 02 12:14:37 crc kubenswrapper[4766]: I1002 12:14:37.829879 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4191acc-30f7-4f6a-9812-ac63638e2663" containerName="init" Oct 02 12:14:37 crc kubenswrapper[4766]: E1002 12:14:37.829890 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4191acc-30f7-4f6a-9812-ac63638e2663" containerName="dnsmasq-dns" Oct 02 12:14:37 crc kubenswrapper[4766]: I1002 12:14:37.829897 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4191acc-30f7-4f6a-9812-ac63638e2663" containerName="dnsmasq-dns" Oct 02 12:14:37 crc kubenswrapper[4766]: I1002 12:14:37.830104 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4191acc-30f7-4f6a-9812-ac63638e2663" containerName="dnsmasq-dns" Oct 02 12:14:37 crc kubenswrapper[4766]: I1002 12:14:37.838025 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:37 crc kubenswrapper[4766]: I1002 12:14:37.852520 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qczhj"] Oct 02 12:14:37 crc kubenswrapper[4766]: I1002 12:14:37.881654 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:14:37 crc kubenswrapper[4766]: E1002 12:14:37.882020 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:14:37 crc kubenswrapper[4766]: I1002 12:14:37.899458 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71aeb356-2055-4b25-805a-f7ce67c14ab5-utilities\") pod \"certified-operators-qczhj\" (UID: \"71aeb356-2055-4b25-805a-f7ce67c14ab5\") " pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:37 crc kubenswrapper[4766]: I1002 12:14:37.899733 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjl5f\" (UniqueName: \"kubernetes.io/projected/71aeb356-2055-4b25-805a-f7ce67c14ab5-kube-api-access-sjl5f\") pod \"certified-operators-qczhj\" (UID: \"71aeb356-2055-4b25-805a-f7ce67c14ab5\") " pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:37 crc kubenswrapper[4766]: I1002 12:14:37.899997 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71aeb356-2055-4b25-805a-f7ce67c14ab5-catalog-content\") pod \"certified-operators-qczhj\" (UID: \"71aeb356-2055-4b25-805a-f7ce67c14ab5\") " pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:38 crc kubenswrapper[4766]: I1002 12:14:38.002216 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjl5f\" (UniqueName: \"kubernetes.io/projected/71aeb356-2055-4b25-805a-f7ce67c14ab5-kube-api-access-sjl5f\") pod \"certified-operators-qczhj\" (UID: \"71aeb356-2055-4b25-805a-f7ce67c14ab5\") " pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:38 crc kubenswrapper[4766]: I1002 12:14:38.002364 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71aeb356-2055-4b25-805a-f7ce67c14ab5-catalog-content\") pod \"certified-operators-qczhj\" (UID: \"71aeb356-2055-4b25-805a-f7ce67c14ab5\") " pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:38 crc kubenswrapper[4766]: I1002 12:14:38.002446 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71aeb356-2055-4b25-805a-f7ce67c14ab5-utilities\") pod \"certified-operators-qczhj\" (UID: \"71aeb356-2055-4b25-805a-f7ce67c14ab5\") " pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:38 crc kubenswrapper[4766]: I1002 12:14:38.003270 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/71aeb356-2055-4b25-805a-f7ce67c14ab5-utilities\") pod \"certified-operators-qczhj\" (UID: \"71aeb356-2055-4b25-805a-f7ce67c14ab5\") " pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:38 crc kubenswrapper[4766]: I1002 12:14:38.003521 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71aeb356-2055-4b25-805a-f7ce67c14ab5-catalog-content\") pod \"certified-operators-qczhj\" (UID: \"71aeb356-2055-4b25-805a-f7ce67c14ab5\") " pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:38 crc kubenswrapper[4766]: I1002 12:14:38.029151 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjl5f\" (UniqueName: \"kubernetes.io/projected/71aeb356-2055-4b25-805a-f7ce67c14ab5-kube-api-access-sjl5f\") pod \"certified-operators-qczhj\" (UID: \"71aeb356-2055-4b25-805a-f7ce67c14ab5\") " pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:38 crc kubenswrapper[4766]: I1002 12:14:38.175123 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:38 crc kubenswrapper[4766]: I1002 12:14:38.485930 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qczhj"] Oct 02 12:14:39 crc kubenswrapper[4766]: I1002 12:14:39.135806 4766 generic.go:334] "Generic (PLEG): container finished" podID="71aeb356-2055-4b25-805a-f7ce67c14ab5" containerID="1502ee4c46df0df4a9709c3d8f541f1024e73786675f7b4fd45c27130fb8ea26" exitCode=0 Oct 02 12:14:39 crc kubenswrapper[4766]: I1002 12:14:39.135936 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qczhj" event={"ID":"71aeb356-2055-4b25-805a-f7ce67c14ab5","Type":"ContainerDied","Data":"1502ee4c46df0df4a9709c3d8f541f1024e73786675f7b4fd45c27130fb8ea26"} Oct 02 12:14:39 crc kubenswrapper[4766]: I1002 12:14:39.136373 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qczhj" event={"ID":"71aeb356-2055-4b25-805a-f7ce67c14ab5","Type":"ContainerStarted","Data":"67bb804f7752965144f07a14679cf96a88791ab209297ef2ab0e506d2b718638"} Oct 02 12:14:41 crc kubenswrapper[4766]: I1002 12:14:41.159321 4766 generic.go:334] "Generic (PLEG): container finished" podID="71aeb356-2055-4b25-805a-f7ce67c14ab5" containerID="065c193e0bcdf752ff46399e03a74576b9de3db20d0edd94442cbc0dbd4e7bd5" exitCode=0 Oct 02 12:14:41 crc kubenswrapper[4766]: I1002 12:14:41.159424 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qczhj" event={"ID":"71aeb356-2055-4b25-805a-f7ce67c14ab5","Type":"ContainerDied","Data":"065c193e0bcdf752ff46399e03a74576b9de3db20d0edd94442cbc0dbd4e7bd5"} Oct 02 12:14:42 crc kubenswrapper[4766]: I1002 12:14:42.175154 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qczhj" event={"ID":"71aeb356-2055-4b25-805a-f7ce67c14ab5","Type":"ContainerStarted","Data":"c20533051e9229de2f6982b730b89d08450945a892518de879685850c405cfc8"} Oct 02 12:14:42 crc kubenswrapper[4766]: I1002 12:14:42.197264 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qczhj" podStartSLOduration=2.7775153120000002 podStartE2EDuration="5.197227474s" podCreationTimestamp="2025-10-02 12:14:37 +0000 UTC" firstStartedPulling="2025-10-02 12:14:39.138776842 +0000 UTC 
m=+4994.081647796" lastFinishedPulling="2025-10-02 12:14:41.558489004 +0000 UTC m=+4996.501359958" observedRunningTime="2025-10-02 12:14:42.192809773 +0000 UTC m=+4997.135680717" watchObservedRunningTime="2025-10-02 12:14:42.197227474 +0000 UTC m=+4997.140098418" Oct 02 12:14:48 crc kubenswrapper[4766]: I1002 12:14:48.176259 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:48 crc kubenswrapper[4766]: I1002 12:14:48.177206 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:48 crc kubenswrapper[4766]: I1002 12:14:48.230658 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:48 crc kubenswrapper[4766]: I1002 12:14:48.289567 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:48 crc kubenswrapper[4766]: I1002 12:14:48.467745 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qczhj"] Oct 02 12:14:49 crc kubenswrapper[4766]: I1002 12:14:49.881821 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:14:49 crc kubenswrapper[4766]: E1002 12:14:49.882406 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:14:50 crc kubenswrapper[4766]: I1002 12:14:50.246476 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qczhj" podUID="71aeb356-2055-4b25-805a-f7ce67c14ab5" containerName="registry-server" containerID="cri-o://c20533051e9229de2f6982b730b89d08450945a892518de879685850c405cfc8" gracePeriod=2 Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.230870 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.274570 4766 generic.go:334] "Generic (PLEG): container finished" podID="71aeb356-2055-4b25-805a-f7ce67c14ab5" containerID="c20533051e9229de2f6982b730b89d08450945a892518de879685850c405cfc8" exitCode=0 Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.274649 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qczhj" event={"ID":"71aeb356-2055-4b25-805a-f7ce67c14ab5","Type":"ContainerDied","Data":"c20533051e9229de2f6982b730b89d08450945a892518de879685850c405cfc8"} Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.274697 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qczhj" event={"ID":"71aeb356-2055-4b25-805a-f7ce67c14ab5","Type":"ContainerDied","Data":"67bb804f7752965144f07a14679cf96a88791ab209297ef2ab0e506d2b718638"} Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.274728 4766 scope.go:117] "RemoveContainer" containerID="c20533051e9229de2f6982b730b89d08450945a892518de879685850c405cfc8" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.274948 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qczhj" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.302103 4766 scope.go:117] "RemoveContainer" containerID="065c193e0bcdf752ff46399e03a74576b9de3db20d0edd94442cbc0dbd4e7bd5" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.328025 4766 scope.go:117] "RemoveContainer" containerID="1502ee4c46df0df4a9709c3d8f541f1024e73786675f7b4fd45c27130fb8ea26" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.355857 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71aeb356-2055-4b25-805a-f7ce67c14ab5-utilities\") pod \"71aeb356-2055-4b25-805a-f7ce67c14ab5\" (UID: \"71aeb356-2055-4b25-805a-f7ce67c14ab5\") " Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.355915 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjl5f\" (UniqueName: \"kubernetes.io/projected/71aeb356-2055-4b25-805a-f7ce67c14ab5-kube-api-access-sjl5f\") pod \"71aeb356-2055-4b25-805a-f7ce67c14ab5\" (UID: \"71aeb356-2055-4b25-805a-f7ce67c14ab5\") " Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.356048 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71aeb356-2055-4b25-805a-f7ce67c14ab5-catalog-content\") pod \"71aeb356-2055-4b25-805a-f7ce67c14ab5\" (UID: \"71aeb356-2055-4b25-805a-f7ce67c14ab5\") " Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.357285 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71aeb356-2055-4b25-805a-f7ce67c14ab5-utilities" (OuterVolumeSpecName: "utilities") pod "71aeb356-2055-4b25-805a-f7ce67c14ab5" (UID: "71aeb356-2055-4b25-805a-f7ce67c14ab5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.361603 4766 scope.go:117] "RemoveContainer" containerID="c20533051e9229de2f6982b730b89d08450945a892518de879685850c405cfc8" Oct 02 12:14:51 crc kubenswrapper[4766]: E1002 12:14:51.365132 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c20533051e9229de2f6982b730b89d08450945a892518de879685850c405cfc8\": container with ID starting with c20533051e9229de2f6982b730b89d08450945a892518de879685850c405cfc8 not found: ID does not exist" containerID="c20533051e9229de2f6982b730b89d08450945a892518de879685850c405cfc8" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.365195 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c20533051e9229de2f6982b730b89d08450945a892518de879685850c405cfc8"} err="failed to get container status \"c20533051e9229de2f6982b730b89d08450945a892518de879685850c405cfc8\": rpc error: code = NotFound desc = could not find container \"c20533051e9229de2f6982b730b89d08450945a892518de879685850c405cfc8\": container with ID starting with c20533051e9229de2f6982b730b89d08450945a892518de879685850c405cfc8 not found: ID does not exist" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.365232 4766 scope.go:117] "RemoveContainer" containerID="065c193e0bcdf752ff46399e03a74576b9de3db20d0edd94442cbc0dbd4e7bd5" Oct 02 12:14:51 crc kubenswrapper[4766]: E1002 12:14:51.365925 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"065c193e0bcdf752ff46399e03a74576b9de3db20d0edd94442cbc0dbd4e7bd5\": container with ID starting with 065c193e0bcdf752ff46399e03a74576b9de3db20d0edd94442cbc0dbd4e7bd5 not found: ID does not exist" containerID="065c193e0bcdf752ff46399e03a74576b9de3db20d0edd94442cbc0dbd4e7bd5" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.365965 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"065c193e0bcdf752ff46399e03a74576b9de3db20d0edd94442cbc0dbd4e7bd5"} err="failed to get container status \"065c193e0bcdf752ff46399e03a74576b9de3db20d0edd94442cbc0dbd4e7bd5\": rpc error: code = NotFound desc = could not find container \"065c193e0bcdf752ff46399e03a74576b9de3db20d0edd94442cbc0dbd4e7bd5\": container with ID starting with 065c193e0bcdf752ff46399e03a74576b9de3db20d0edd94442cbc0dbd4e7bd5 not found: ID does not exist" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.365997 4766 scope.go:117] "RemoveContainer" containerID="1502ee4c46df0df4a9709c3d8f541f1024e73786675f7b4fd45c27130fb8ea26" Oct 02 12:14:51 crc kubenswrapper[4766]: E1002 12:14:51.366412 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1502ee4c46df0df4a9709c3d8f541f1024e73786675f7b4fd45c27130fb8ea26\": container with ID starting with 1502ee4c46df0df4a9709c3d8f541f1024e73786675f7b4fd45c27130fb8ea26 not found: ID does not exist" containerID="1502ee4c46df0df4a9709c3d8f541f1024e73786675f7b4fd45c27130fb8ea26" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.366472 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1502ee4c46df0df4a9709c3d8f541f1024e73786675f7b4fd45c27130fb8ea26"} err="failed to get container status \"1502ee4c46df0df4a9709c3d8f541f1024e73786675f7b4fd45c27130fb8ea26\": rpc error: code = NotFound desc = could not 
find container \"1502ee4c46df0df4a9709c3d8f541f1024e73786675f7b4fd45c27130fb8ea26\": container with ID starting with 1502ee4c46df0df4a9709c3d8f541f1024e73786675f7b4fd45c27130fb8ea26 not found: ID does not exist" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.366925 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71aeb356-2055-4b25-805a-f7ce67c14ab5-kube-api-access-sjl5f" (OuterVolumeSpecName: "kube-api-access-sjl5f") pod "71aeb356-2055-4b25-805a-f7ce67c14ab5" (UID: "71aeb356-2055-4b25-805a-f7ce67c14ab5"). InnerVolumeSpecName "kube-api-access-sjl5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.412546 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71aeb356-2055-4b25-805a-f7ce67c14ab5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71aeb356-2055-4b25-805a-f7ce67c14ab5" (UID: "71aeb356-2055-4b25-805a-f7ce67c14ab5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.457754 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71aeb356-2055-4b25-805a-f7ce67c14ab5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.458045 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71aeb356-2055-4b25-805a-f7ce67c14ab5-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.458118 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjl5f\" (UniqueName: \"kubernetes.io/projected/71aeb356-2055-4b25-805a-f7ce67c14ab5-kube-api-access-sjl5f\") on node \"crc\" DevicePath \"\"" Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.608003 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qczhj"] Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.614009 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qczhj"] Oct 02 12:14:51 crc kubenswrapper[4766]: I1002 12:14:51.890947 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71aeb356-2055-4b25-805a-f7ce67c14ab5" path="/var/lib/kubelet/pods/71aeb356-2055-4b25-805a-f7ce67c14ab5/volumes" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.161992 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb"] Oct 02 12:15:00 crc kubenswrapper[4766]: E1002 12:15:00.163287 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71aeb356-2055-4b25-805a-f7ce67c14ab5" containerName="extract-utilities" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.163314 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="71aeb356-2055-4b25-805a-f7ce67c14ab5" containerName="extract-utilities" Oct 02 12:15:00 crc kubenswrapper[4766]: E1002 12:15:00.163337 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71aeb356-2055-4b25-805a-f7ce67c14ab5" containerName="extract-content" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.163347 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="71aeb356-2055-4b25-805a-f7ce67c14ab5" containerName="extract-content" Oct 02 12:15:00 crc 
kubenswrapper[4766]: E1002 12:15:00.163356 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71aeb356-2055-4b25-805a-f7ce67c14ab5" containerName="registry-server" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.163365 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="71aeb356-2055-4b25-805a-f7ce67c14ab5" containerName="registry-server" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.163630 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="71aeb356-2055-4b25-805a-f7ce67c14ab5" containerName="registry-server" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.164519 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.170223 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.170837 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.173140 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb"] Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.229356 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7baeb498-8c0f-497f-a80e-58c405c1d32f-config-volume\") pod \"collect-profiles-29323455-69xxb\" (UID: \"7baeb498-8c0f-497f-a80e-58c405c1d32f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.229450 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7baeb498-8c0f-497f-a80e-58c405c1d32f-secret-volume\") pod \"collect-profiles-29323455-69xxb\" (UID: \"7baeb498-8c0f-497f-a80e-58c405c1d32f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.229492 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzw92\" (UniqueName: \"kubernetes.io/projected/7baeb498-8c0f-497f-a80e-58c405c1d32f-kube-api-access-bzw92\") pod \"collect-profiles-29323455-69xxb\" (UID: \"7baeb498-8c0f-497f-a80e-58c405c1d32f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.331400 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7baeb498-8c0f-497f-a80e-58c405c1d32f-config-volume\") pod \"collect-profiles-29323455-69xxb\" (UID: \"7baeb498-8c0f-497f-a80e-58c405c1d32f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.331581 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7baeb498-8c0f-497f-a80e-58c405c1d32f-secret-volume\") pod \"collect-profiles-29323455-69xxb\" (UID: \"7baeb498-8c0f-497f-a80e-58c405c1d32f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.331622 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzw92\" (UniqueName: \"kubernetes.io/projected/7baeb498-8c0f-497f-a80e-58c405c1d32f-kube-api-access-bzw92\") pod \"collect-profiles-29323455-69xxb\" (UID: \"7baeb498-8c0f-497f-a80e-58c405c1d32f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.333233 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7baeb498-8c0f-497f-a80e-58c405c1d32f-config-volume\") pod \"collect-profiles-29323455-69xxb\" (UID: \"7baeb498-8c0f-497f-a80e-58c405c1d32f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.341831 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7baeb498-8c0f-497f-a80e-58c405c1d32f-secret-volume\") pod \"collect-profiles-29323455-69xxb\" (UID: \"7baeb498-8c0f-497f-a80e-58c405c1d32f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.356681 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzw92\" (UniqueName: \"kubernetes.io/projected/7baeb498-8c0f-497f-a80e-58c405c1d32f-kube-api-access-bzw92\") pod \"collect-profiles-29323455-69xxb\" (UID: \"7baeb498-8c0f-497f-a80e-58c405c1d32f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.496284 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb" Oct 02 12:15:00 crc kubenswrapper[4766]: I1002 12:15:00.953778 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb"] Oct 02 12:15:01 crc kubenswrapper[4766]: I1002 12:15:01.368388 4766 generic.go:334] "Generic (PLEG): container finished" podID="7baeb498-8c0f-497f-a80e-58c405c1d32f" containerID="0823cb18d3c479a33ba38d82fa830a0c494a8e538aed2aac68f31210bd4ec663" exitCode=0 Oct 02 12:15:01 crc kubenswrapper[4766]: I1002 12:15:01.368478 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb" event={"ID":"7baeb498-8c0f-497f-a80e-58c405c1d32f","Type":"ContainerDied","Data":"0823cb18d3c479a33ba38d82fa830a0c494a8e538aed2aac68f31210bd4ec663"} Oct 02 12:15:01 crc kubenswrapper[4766]: I1002 12:15:01.368627 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb" event={"ID":"7baeb498-8c0f-497f-a80e-58c405c1d32f","Type":"ContainerStarted","Data":"abd747d94a7c66a00c0be20c2da7b272e6c5d37c3d90ce0f4c3628e86ed3c39e"} Oct 02 12:15:01 crc kubenswrapper[4766]: I1002 12:15:01.882477 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:15:01 crc kubenswrapper[4766]: E1002 12:15:01.882887 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:15:02 crc kubenswrapper[4766]: I1002 12:15:02.696929 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb" Oct 02 12:15:02 crc kubenswrapper[4766]: I1002 12:15:02.777624 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7baeb498-8c0f-497f-a80e-58c405c1d32f-config-volume\") pod \"7baeb498-8c0f-497f-a80e-58c405c1d32f\" (UID: \"7baeb498-8c0f-497f-a80e-58c405c1d32f\") " Oct 02 12:15:02 crc kubenswrapper[4766]: I1002 12:15:02.777875 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7baeb498-8c0f-497f-a80e-58c405c1d32f-secret-volume\") pod \"7baeb498-8c0f-497f-a80e-58c405c1d32f\" (UID: \"7baeb498-8c0f-497f-a80e-58c405c1d32f\") " Oct 02 12:15:02 crc kubenswrapper[4766]: I1002 12:15:02.777957 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzw92\" (UniqueName: \"kubernetes.io/projected/7baeb498-8c0f-497f-a80e-58c405c1d32f-kube-api-access-bzw92\") pod \"7baeb498-8c0f-497f-a80e-58c405c1d32f\" (UID: \"7baeb498-8c0f-497f-a80e-58c405c1d32f\") " Oct 02 12:15:02 crc kubenswrapper[4766]: I1002 12:15:02.778929 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7baeb498-8c0f-497f-a80e-58c405c1d32f-config-volume" (OuterVolumeSpecName: "config-volume") pod "7baeb498-8c0f-497f-a80e-58c405c1d32f" (UID: "7baeb498-8c0f-497f-a80e-58c405c1d32f"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:15:02 crc kubenswrapper[4766]: I1002 12:15:02.788689 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7baeb498-8c0f-497f-a80e-58c405c1d32f-kube-api-access-bzw92" (OuterVolumeSpecName: "kube-api-access-bzw92") pod "7baeb498-8c0f-497f-a80e-58c405c1d32f" (UID: "7baeb498-8c0f-497f-a80e-58c405c1d32f"). InnerVolumeSpecName "kube-api-access-bzw92". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:15:02 crc kubenswrapper[4766]: I1002 12:15:02.788712 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7baeb498-8c0f-497f-a80e-58c405c1d32f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7baeb498-8c0f-497f-a80e-58c405c1d32f" (UID: "7baeb498-8c0f-497f-a80e-58c405c1d32f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:15:02 crc kubenswrapper[4766]: I1002 12:15:02.880049 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7baeb498-8c0f-497f-a80e-58c405c1d32f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:02 crc kubenswrapper[4766]: I1002 12:15:02.880548 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzw92\" (UniqueName: \"kubernetes.io/projected/7baeb498-8c0f-497f-a80e-58c405c1d32f-kube-api-access-bzw92\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:02 crc kubenswrapper[4766]: I1002 12:15:02.880566 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7baeb498-8c0f-497f-a80e-58c405c1d32f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:03 crc kubenswrapper[4766]: I1002 12:15:03.390513 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb" event={"ID":"7baeb498-8c0f-497f-a80e-58c405c1d32f","Type":"ContainerDied","Data":"abd747d94a7c66a00c0be20c2da7b272e6c5d37c3d90ce0f4c3628e86ed3c39e"} Oct 02 12:15:03 crc kubenswrapper[4766]: I1002 12:15:03.390573 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abd747d94a7c66a00c0be20c2da7b272e6c5d37c3d90ce0f4c3628e86ed3c39e" Oct 02 12:15:03 crc kubenswrapper[4766]: I1002 12:15:03.390596 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb" Oct 02 12:15:03 crc kubenswrapper[4766]: I1002 12:15:03.790216 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76"] Oct 02 12:15:03 crc kubenswrapper[4766]: I1002 12:15:03.795324 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323410-k7s76"] Oct 02 12:15:03 crc kubenswrapper[4766]: I1002 12:15:03.891970 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49b32761-088a-4ccb-a645-a82ceee34ab8" path="/var/lib/kubelet/pods/49b32761-088a-4ccb-a645-a82ceee34ab8/volumes" Oct 02 12:15:04 crc kubenswrapper[4766]: I1002 12:15:04.401836 4766 generic.go:334] "Generic (PLEG): container finished" podID="f39320fe-abbc-4c64-8b86-1b32a7924017" containerID="4159d30e904b11fdff6a6df92e95affeef7bcaec40141dae8b7baba7dbe20834" exitCode=0 Oct 02 12:15:04 crc kubenswrapper[4766]: I1002 12:15:04.401932 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f39320fe-abbc-4c64-8b86-1b32a7924017","Type":"ContainerDied","Data":"4159d30e904b11fdff6a6df92e95affeef7bcaec40141dae8b7baba7dbe20834"} Oct 02 12:15:04 crc kubenswrapper[4766]: I1002 12:15:04.405945 4766 generic.go:334] "Generic (PLEG): container finished" podID="35a7d34a-27b2-496f-aa63-b04439becb52" containerID="9b47d5b9fcf0414486c4b489e98341cab3d9d29dc8eb31012ce873bfd690b687" exitCode=0 Oct 02 12:15:04 crc kubenswrapper[4766]: I1002 12:15:04.405992 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35a7d34a-27b2-496f-aa63-b04439becb52","Type":"ContainerDied","Data":"9b47d5b9fcf0414486c4b489e98341cab3d9d29dc8eb31012ce873bfd690b687"} Oct 02 12:15:05 crc kubenswrapper[4766]: I1002 12:15:05.420631 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f39320fe-abbc-4c64-8b86-1b32a7924017","Type":"ContainerStarted","Data":"cde631462f18f320e3dada6173453fbfa99785b267d6f03c1c134c42d7c134ff"} Oct 02 12:15:05 crc kubenswrapper[4766]: I1002 12:15:05.421605 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:15:05 crc kubenswrapper[4766]: I1002 12:15:05.431524 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"35a7d34a-27b2-496f-aa63-b04439becb52","Type":"ContainerStarted","Data":"42c71bddfda46f0eb9e253b660584554bad620cd9c6a27e8723ed45e7b3b75ec"} Oct 02 12:15:05 crc kubenswrapper[4766]: I1002 12:15:05.432196 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 02 12:15:05 crc kubenswrapper[4766]: I1002 12:15:05.454109 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.454083652 podStartE2EDuration="36.454083652s" podCreationTimestamp="2025-10-02 12:14:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:15:05.448871345 +0000 UTC m=+5020.391742289" watchObservedRunningTime="2025-10-02 12:15:05.454083652 +0000 UTC m=+5020.396954626" Oct 02 12:15:05 crc kubenswrapper[4766]: I1002 12:15:05.474766 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" 
podStartSLOduration=36.474747633 podStartE2EDuration="36.474747633s" podCreationTimestamp="2025-10-02 12:14:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:15:05.470130286 +0000 UTC m=+5020.413001240" watchObservedRunningTime="2025-10-02 12:15:05.474747633 +0000 UTC m=+5020.417618577" Oct 02 12:15:14 crc kubenswrapper[4766]: I1002 12:15:14.881669 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:15:14 crc kubenswrapper[4766]: E1002 12:15:14.882745 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:15:19 crc kubenswrapper[4766]: I1002 12:15:19.789321 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 02 12:15:19 crc kubenswrapper[4766]: I1002 12:15:19.806733 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 02 12:15:28 crc kubenswrapper[4766]: I1002 12:15:28.031764 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Oct 02 12:15:28 crc kubenswrapper[4766]: E1002 12:15:28.033003 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7baeb498-8c0f-497f-a80e-58c405c1d32f" containerName="collect-profiles" Oct 02 12:15:28 crc kubenswrapper[4766]: I1002 12:15:28.033019 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7baeb498-8c0f-497f-a80e-58c405c1d32f" containerName="collect-profiles" Oct 02 12:15:28 crc kubenswrapper[4766]: I1002 12:15:28.033159 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7baeb498-8c0f-497f-a80e-58c405c1d32f" containerName="collect-profiles" Oct 02 12:15:28 crc kubenswrapper[4766]: I1002 12:15:28.033727 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 02 12:15:28 crc kubenswrapper[4766]: I1002 12:15:28.036573 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-zvb69" Oct 02 12:15:28 crc kubenswrapper[4766]: I1002 12:15:28.043041 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 02 12:15:28 crc kubenswrapper[4766]: I1002 12:15:28.107519 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zch8s\" (UniqueName: \"kubernetes.io/projected/abb4777f-a52b-43c0-808f-680d0d65d61f-kube-api-access-zch8s\") pod \"mariadb-client-1-default\" (UID: \"abb4777f-a52b-43c0-808f-680d0d65d61f\") " pod="openstack/mariadb-client-1-default" Oct 02 12:15:28 crc kubenswrapper[4766]: I1002 12:15:28.208779 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zch8s\" (UniqueName: \"kubernetes.io/projected/abb4777f-a52b-43c0-808f-680d0d65d61f-kube-api-access-zch8s\") pod \"mariadb-client-1-default\" (UID: \"abb4777f-a52b-43c0-808f-680d0d65d61f\") " pod="openstack/mariadb-client-1-default" Oct 02 12:15:28 crc kubenswrapper[4766]: I1002 12:15:28.235956 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zch8s\" (UniqueName: \"kubernetes.io/projected/abb4777f-a52b-43c0-808f-680d0d65d61f-kube-api-access-zch8s\") pod \"mariadb-client-1-default\" (UID: \"abb4777f-a52b-43c0-808f-680d0d65d61f\") " pod="openstack/mariadb-client-1-default" Oct 02 12:15:28 crc kubenswrapper[4766]: I1002 12:15:28.350065 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 02 12:15:28 crc kubenswrapper[4766]: I1002 12:15:28.863090 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 02 12:15:29 crc kubenswrapper[4766]: I1002 12:15:29.625450 4766 generic.go:334] "Generic (PLEG): container finished" podID="abb4777f-a52b-43c0-808f-680d0d65d61f" containerID="eed742e9bb0bbc991e9b8aa57e14717cdf90bf437717ecf42cd6d2afcd0386b9" exitCode=0 Oct 02 12:15:29 crc kubenswrapper[4766]: I1002 12:15:29.625535 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"abb4777f-a52b-43c0-808f-680d0d65d61f","Type":"ContainerDied","Data":"eed742e9bb0bbc991e9b8aa57e14717cdf90bf437717ecf42cd6d2afcd0386b9"} Oct 02 12:15:29 crc kubenswrapper[4766]: I1002 12:15:29.625876 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"abb4777f-a52b-43c0-808f-680d0d65d61f","Type":"ContainerStarted","Data":"09183befa7e80aa6471ffb5f8e8b9bb67312045dc750eb46848575e73618a83d"} Oct 02 12:15:29 crc kubenswrapper[4766]: I1002 12:15:29.882139 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:15:29 crc kubenswrapper[4766]: E1002 12:15:29.882388 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:15:30 crc kubenswrapper[4766]: I1002 12:15:30.970283 4766 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 02 12:15:30 crc kubenswrapper[4766]: I1002 12:15:30.993096 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_abb4777f-a52b-43c0-808f-680d0d65d61f/mariadb-client-1-default/0.log" Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.028907 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.034570 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.055749 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zch8s\" (UniqueName: \"kubernetes.io/projected/abb4777f-a52b-43c0-808f-680d0d65d61f-kube-api-access-zch8s\") pod \"abb4777f-a52b-43c0-808f-680d0d65d61f\" (UID: \"abb4777f-a52b-43c0-808f-680d0d65d61f\") " Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.068852 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abb4777f-a52b-43c0-808f-680d0d65d61f-kube-api-access-zch8s" (OuterVolumeSpecName: "kube-api-access-zch8s") pod "abb4777f-a52b-43c0-808f-680d0d65d61f" (UID: "abb4777f-a52b-43c0-808f-680d0d65d61f"). InnerVolumeSpecName "kube-api-access-zch8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.158406 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zch8s\" (UniqueName: \"kubernetes.io/projected/abb4777f-a52b-43c0-808f-680d0d65d61f-kube-api-access-zch8s\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.435439 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Oct 02 12:15:31 crc kubenswrapper[4766]: E1002 12:15:31.436119 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abb4777f-a52b-43c0-808f-680d0d65d61f" containerName="mariadb-client-1-default" Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.436150 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb4777f-a52b-43c0-808f-680d0d65d61f" containerName="mariadb-client-1-default" Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.436373 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="abb4777f-a52b-43c0-808f-680d0d65d61f" containerName="mariadb-client-1-default" Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.437360 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.455739 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.573102 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq89z\" (UniqueName: \"kubernetes.io/projected/20896224-acf7-43be-aee5-dd2841eb4667-kube-api-access-sq89z\") pod \"mariadb-client-2-default\" (UID: \"20896224-acf7-43be-aee5-dd2841eb4667\") " pod="openstack/mariadb-client-2-default" Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.640842 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09183befa7e80aa6471ffb5f8e8b9bb67312045dc750eb46848575e73618a83d" Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.640908 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.675235 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq89z\" (UniqueName: \"kubernetes.io/projected/20896224-acf7-43be-aee5-dd2841eb4667-kube-api-access-sq89z\") pod \"mariadb-client-2-default\" (UID: \"20896224-acf7-43be-aee5-dd2841eb4667\") " pod="openstack/mariadb-client-2-default" Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.695676 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq89z\" (UniqueName: \"kubernetes.io/projected/20896224-acf7-43be-aee5-dd2841eb4667-kube-api-access-sq89z\") pod \"mariadb-client-2-default\" (UID: \"20896224-acf7-43be-aee5-dd2841eb4667\") " pod="openstack/mariadb-client-2-default" Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.770220 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 02 12:15:31 crc kubenswrapper[4766]: I1002 12:15:31.890490 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abb4777f-a52b-43c0-808f-680d0d65d61f" path="/var/lib/kubelet/pods/abb4777f-a52b-43c0-808f-680d0d65d61f/volumes" Oct 02 12:15:32 crc kubenswrapper[4766]: I1002 12:15:32.296998 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 02 12:15:32 crc kubenswrapper[4766]: W1002 12:15:32.303510 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20896224_acf7_43be_aee5_dd2841eb4667.slice/crio-45a2aefc62e0cccee1c46479f06cda6f797b3562992f3d280711ce3089494edc WatchSource:0}: Error finding container 45a2aefc62e0cccee1c46479f06cda6f797b3562992f3d280711ce3089494edc: Status 404 returned error can't find the container with id 45a2aefc62e0cccee1c46479f06cda6f797b3562992f3d280711ce3089494edc Oct 02 12:15:32 crc kubenswrapper[4766]: I1002 12:15:32.653168 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"20896224-acf7-43be-aee5-dd2841eb4667","Type":"ContainerStarted","Data":"c1dd8fd73a1d6e341f575513e3287e186f929214855a53c7c7cc4f9488222795"} Oct 02 12:15:32 crc kubenswrapper[4766]: I1002 12:15:32.653755 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"20896224-acf7-43be-aee5-dd2841eb4667","Type":"ContainerStarted","Data":"45a2aefc62e0cccee1c46479f06cda6f797b3562992f3d280711ce3089494edc"} Oct 02 12:15:33 crc kubenswrapper[4766]: I1002 12:15:33.662471 4766 generic.go:334] "Generic (PLEG): container finished" podID="20896224-acf7-43be-aee5-dd2841eb4667" containerID="c1dd8fd73a1d6e341f575513e3287e186f929214855a53c7c7cc4f9488222795" exitCode=0 Oct 02 12:15:33 crc kubenswrapper[4766]: I1002 12:15:33.662558 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"20896224-acf7-43be-aee5-dd2841eb4667","Type":"ContainerDied","Data":"c1dd8fd73a1d6e341f575513e3287e186f929214855a53c7c7cc4f9488222795"} Oct 02 12:15:34 crc kubenswrapper[4766]: I1002 12:15:34.878153 4766 scope.go:117] "RemoveContainer" containerID="329cc54c30f8c86ef0d1ab0d3d7620c91f969a6d89f4ce489cd30d755779b9bd" Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.073218 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.111875 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.117808 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.131779 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq89z\" (UniqueName: \"kubernetes.io/projected/20896224-acf7-43be-aee5-dd2841eb4667-kube-api-access-sq89z\") pod \"20896224-acf7-43be-aee5-dd2841eb4667\" (UID: \"20896224-acf7-43be-aee5-dd2841eb4667\") " Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.138448 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20896224-acf7-43be-aee5-dd2841eb4667-kube-api-access-sq89z" (OuterVolumeSpecName: "kube-api-access-sq89z") pod "20896224-acf7-43be-aee5-dd2841eb4667" (UID: "20896224-acf7-43be-aee5-dd2841eb4667"). InnerVolumeSpecName "kube-api-access-sq89z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.234678 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq89z\" (UniqueName: \"kubernetes.io/projected/20896224-acf7-43be-aee5-dd2841eb4667-kube-api-access-sq89z\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.545157 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Oct 02 12:15:35 crc kubenswrapper[4766]: E1002 12:15:35.545661 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20896224-acf7-43be-aee5-dd2841eb4667" containerName="mariadb-client-2-default" Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.545727 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="20896224-acf7-43be-aee5-dd2841eb4667" containerName="mariadb-client-2-default" Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.545915 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="20896224-acf7-43be-aee5-dd2841eb4667" containerName="mariadb-client-2-default" Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.546612 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.552552 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.641650 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r7kw\" (UniqueName: \"kubernetes.io/projected/49cc1990-f320-497c-8e75-4a7f5187f59b-kube-api-access-6r7kw\") pod \"mariadb-client-1\" (UID: \"49cc1990-f320-497c-8e75-4a7f5187f59b\") " pod="openstack/mariadb-client-1" Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.680168 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45a2aefc62e0cccee1c46479f06cda6f797b3562992f3d280711ce3089494edc" Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.680262 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.743787 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r7kw\" (UniqueName: \"kubernetes.io/projected/49cc1990-f320-497c-8e75-4a7f5187f59b-kube-api-access-6r7kw\") pod \"mariadb-client-1\" (UID: \"49cc1990-f320-497c-8e75-4a7f5187f59b\") " pod="openstack/mariadb-client-1" Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.761605 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r7kw\" (UniqueName: \"kubernetes.io/projected/49cc1990-f320-497c-8e75-4a7f5187f59b-kube-api-access-6r7kw\") pod \"mariadb-client-1\" (UID: \"49cc1990-f320-497c-8e75-4a7f5187f59b\") " pod="openstack/mariadb-client-1" Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.864870 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1" Oct 02 12:15:35 crc kubenswrapper[4766]: I1002 12:15:35.905376 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20896224-acf7-43be-aee5-dd2841eb4667" path="/var/lib/kubelet/pods/20896224-acf7-43be-aee5-dd2841eb4667/volumes" Oct 02 12:15:36 crc kubenswrapper[4766]: I1002 12:15:36.136205 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Oct 02 12:15:36 crc kubenswrapper[4766]: W1002 12:15:36.145698 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49cc1990_f320_497c_8e75_4a7f5187f59b.slice/crio-1b256e1470c50cf3b6ad2572888bfcb7c63ea3f50e2e44250589e2d07fcc585f WatchSource:0}: Error finding container 1b256e1470c50cf3b6ad2572888bfcb7c63ea3f50e2e44250589e2d07fcc585f: Status 404 returned error can't find the container with id 1b256e1470c50cf3b6ad2572888bfcb7c63ea3f50e2e44250589e2d07fcc585f Oct 02 12:15:36 crc kubenswrapper[4766]: I1002 12:15:36.699855 4766 generic.go:334] "Generic (PLEG): container finished" podID="49cc1990-f320-497c-8e75-4a7f5187f59b" containerID="28937aa03a29015a781675cd13c20a1db60c6f2617adf9efb5672ba9c794cbcb" exitCode=0 Oct 02 12:15:36 crc kubenswrapper[4766]: I1002 12:15:36.700137 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"49cc1990-f320-497c-8e75-4a7f5187f59b","Type":"ContainerDied","Data":"28937aa03a29015a781675cd13c20a1db60c6f2617adf9efb5672ba9c794cbcb"} Oct 02 12:15:36 crc kubenswrapper[4766]: I1002 12:15:36.700236 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"49cc1990-f320-497c-8e75-4a7f5187f59b","Type":"ContainerStarted","Data":"1b256e1470c50cf3b6ad2572888bfcb7c63ea3f50e2e44250589e2d07fcc585f"} Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.055082 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.078879 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_49cc1990-f320-497c-8e75-4a7f5187f59b/mariadb-client-1/0.log" Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.109468 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.119881 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.181738 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r7kw\" (UniqueName: \"kubernetes.io/projected/49cc1990-f320-497c-8e75-4a7f5187f59b-kube-api-access-6r7kw\") pod \"49cc1990-f320-497c-8e75-4a7f5187f59b\" (UID: \"49cc1990-f320-497c-8e75-4a7f5187f59b\") " Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.188465 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49cc1990-f320-497c-8e75-4a7f5187f59b-kube-api-access-6r7kw" (OuterVolumeSpecName: "kube-api-access-6r7kw") pod "49cc1990-f320-497c-8e75-4a7f5187f59b" (UID: "49cc1990-f320-497c-8e75-4a7f5187f59b"). InnerVolumeSpecName "kube-api-access-6r7kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.283900 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r7kw\" (UniqueName: \"kubernetes.io/projected/49cc1990-f320-497c-8e75-4a7f5187f59b-kube-api-access-6r7kw\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.556351 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Oct 02 12:15:38 crc kubenswrapper[4766]: E1002 12:15:38.557127 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49cc1990-f320-497c-8e75-4a7f5187f59b" containerName="mariadb-client-1" Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.557142 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="49cc1990-f320-497c-8e75-4a7f5187f59b" containerName="mariadb-client-1" Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.557340 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="49cc1990-f320-497c-8e75-4a7f5187f59b" containerName="mariadb-client-1" Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.558034 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.565661 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.690399 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8mtc\" (UniqueName: \"kubernetes.io/projected/79b5b3d9-e42a-492d-af27-a7abe95c2c17-kube-api-access-l8mtc\") pod \"mariadb-client-4-default\" (UID: \"79b5b3d9-e42a-492d-af27-a7abe95c2c17\") " pod="openstack/mariadb-client-4-default" Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.719247 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b256e1470c50cf3b6ad2572888bfcb7c63ea3f50e2e44250589e2d07fcc585f" Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.719326 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.792825 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8mtc\" (UniqueName: \"kubernetes.io/projected/79b5b3d9-e42a-492d-af27-a7abe95c2c17-kube-api-access-l8mtc\") pod \"mariadb-client-4-default\" (UID: \"79b5b3d9-e42a-492d-af27-a7abe95c2c17\") " pod="openstack/mariadb-client-4-default" Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.812291 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8mtc\" (UniqueName: \"kubernetes.io/projected/79b5b3d9-e42a-492d-af27-a7abe95c2c17-kube-api-access-l8mtc\") pod \"mariadb-client-4-default\" (UID: \"79b5b3d9-e42a-492d-af27-a7abe95c2c17\") " pod="openstack/mariadb-client-4-default" Oct 02 12:15:38 crc kubenswrapper[4766]: I1002 12:15:38.876852 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 02 12:15:39 crc kubenswrapper[4766]: I1002 12:15:39.385302 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 02 12:15:39 crc kubenswrapper[4766]: W1002 12:15:39.412169 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79b5b3d9_e42a_492d_af27_a7abe95c2c17.slice/crio-bf821f2b045671b65b2afa48500cbe6b4f6dd3dccd0b84e9d9194a008a29fb56 WatchSource:0}: Error finding container bf821f2b045671b65b2afa48500cbe6b4f6dd3dccd0b84e9d9194a008a29fb56: Status 404 returned error can't find the container with id bf821f2b045671b65b2afa48500cbe6b4f6dd3dccd0b84e9d9194a008a29fb56 Oct 02 12:15:39 crc kubenswrapper[4766]: I1002 12:15:39.728433 4766 generic.go:334] "Generic (PLEG): container finished" podID="79b5b3d9-e42a-492d-af27-a7abe95c2c17" containerID="7b5729f02b6a5a60fb4aead817aba78fa7ff1174d49db85abafb46f886f856eb" exitCode=0 Oct 02 12:15:39 crc kubenswrapper[4766]: I1002 12:15:39.728495 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"79b5b3d9-e42a-492d-af27-a7abe95c2c17","Type":"ContainerDied","Data":"7b5729f02b6a5a60fb4aead817aba78fa7ff1174d49db85abafb46f886f856eb"} Oct 02 12:15:39 crc kubenswrapper[4766]: I1002 12:15:39.728562 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"79b5b3d9-e42a-492d-af27-a7abe95c2c17","Type":"ContainerStarted","Data":"bf821f2b045671b65b2afa48500cbe6b4f6dd3dccd0b84e9d9194a008a29fb56"} Oct 02 12:15:39 crc kubenswrapper[4766]: I1002 12:15:39.891175 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49cc1990-f320-497c-8e75-4a7f5187f59b" path="/var/lib/kubelet/pods/49cc1990-f320-497c-8e75-4a7f5187f59b/volumes" Oct 02 12:15:41 crc kubenswrapper[4766]: I1002 12:15:41.107844 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 02 12:15:41 crc kubenswrapper[4766]: I1002 12:15:41.128448 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_79b5b3d9-e42a-492d-af27-a7abe95c2c17/mariadb-client-4-default/0.log" Oct 02 12:15:41 crc kubenswrapper[4766]: I1002 12:15:41.154639 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 02 12:15:41 crc kubenswrapper[4766]: I1002 12:15:41.161459 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Oct 02 12:15:41 crc kubenswrapper[4766]: I1002 12:15:41.242916 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8mtc\" (UniqueName: \"kubernetes.io/projected/79b5b3d9-e42a-492d-af27-a7abe95c2c17-kube-api-access-l8mtc\") pod \"79b5b3d9-e42a-492d-af27-a7abe95c2c17\" (UID: \"79b5b3d9-e42a-492d-af27-a7abe95c2c17\") " Oct 02 12:15:41 crc kubenswrapper[4766]: I1002 12:15:41.250291 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79b5b3d9-e42a-492d-af27-a7abe95c2c17-kube-api-access-l8mtc" (OuterVolumeSpecName: "kube-api-access-l8mtc") pod "79b5b3d9-e42a-492d-af27-a7abe95c2c17" (UID: "79b5b3d9-e42a-492d-af27-a7abe95c2c17"). InnerVolumeSpecName "kube-api-access-l8mtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:15:41 crc kubenswrapper[4766]: I1002 12:15:41.346470 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8mtc\" (UniqueName: \"kubernetes.io/projected/79b5b3d9-e42a-492d-af27-a7abe95c2c17-kube-api-access-l8mtc\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:41 crc kubenswrapper[4766]: I1002 12:15:41.748719 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf821f2b045671b65b2afa48500cbe6b4f6dd3dccd0b84e9d9194a008a29fb56" Oct 02 12:15:41 crc kubenswrapper[4766]: I1002 12:15:41.748766 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Oct 02 12:15:41 crc kubenswrapper[4766]: I1002 12:15:41.881845 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:15:41 crc kubenswrapper[4766]: E1002 12:15:41.882286 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:15:41 crc kubenswrapper[4766]: I1002 12:15:41.903236 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79b5b3d9-e42a-492d-af27-a7abe95c2c17" path="/var/lib/kubelet/pods/79b5b3d9-e42a-492d-af27-a7abe95c2c17/volumes" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.311622 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5kgp9"] Oct 02 12:15:45 crc kubenswrapper[4766]: E1002 12:15:45.312643 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b5b3d9-e42a-492d-af27-a7abe95c2c17" containerName="mariadb-client-4-default" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.312662 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b5b3d9-e42a-492d-af27-a7abe95c2c17" containerName="mariadb-client-4-default" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.312851 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b5b3d9-e42a-492d-af27-a7abe95c2c17" containerName="mariadb-client-4-default" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.314927 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.326633 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5kgp9"] Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.423804 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsb54\" (UniqueName: \"kubernetes.io/projected/b55d18ab-6060-4870-84af-498433ff950e-kube-api-access-zsb54\") pod \"redhat-operators-5kgp9\" (UID: \"b55d18ab-6060-4870-84af-498433ff950e\") " pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.423980 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55d18ab-6060-4870-84af-498433ff950e-utilities\") pod \"redhat-operators-5kgp9\" (UID: \"b55d18ab-6060-4870-84af-498433ff950e\") " pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.424080 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55d18ab-6060-4870-84af-498433ff950e-catalog-content\") pod \"redhat-operators-5kgp9\" (UID: \"b55d18ab-6060-4870-84af-498433ff950e\") " pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.526228 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55d18ab-6060-4870-84af-498433ff950e-catalog-content\") pod \"redhat-operators-5kgp9\" (UID: \"b55d18ab-6060-4870-84af-498433ff950e\") " pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.526345 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsb54\" (UniqueName: \"kubernetes.io/projected/b55d18ab-6060-4870-84af-498433ff950e-kube-api-access-zsb54\") pod \"redhat-operators-5kgp9\" (UID: \"b55d18ab-6060-4870-84af-498433ff950e\") " pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.526436 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55d18ab-6060-4870-84af-498433ff950e-utilities\") pod \"redhat-operators-5kgp9\" (UID: \"b55d18ab-6060-4870-84af-498433ff950e\") " pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.527410 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55d18ab-6060-4870-84af-498433ff950e-catalog-content\") pod \"redhat-operators-5kgp9\" (UID: \"b55d18ab-6060-4870-84af-498433ff950e\") " pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.527645 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55d18ab-6060-4870-84af-498433ff950e-utilities\") pod \"redhat-operators-5kgp9\" (UID: \"b55d18ab-6060-4870-84af-498433ff950e\") " pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.539892 4766 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/mariadb-client-5-default"] Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.541335 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.544607 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-zvb69" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.597573 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsb54\" (UniqueName: \"kubernetes.io/projected/b55d18ab-6060-4870-84af-498433ff950e-kube-api-access-zsb54\") pod \"redhat-operators-5kgp9\" (UID: \"b55d18ab-6060-4870-84af-498433ff950e\") " pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.598099 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.627866 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlvk8\" (UniqueName: \"kubernetes.io/projected/b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76-kube-api-access-jlvk8\") pod \"mariadb-client-5-default\" (UID: \"b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76\") " pod="openstack/mariadb-client-5-default" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.636697 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.730184 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlvk8\" (UniqueName: \"kubernetes.io/projected/b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76-kube-api-access-jlvk8\") pod \"mariadb-client-5-default\" (UID: \"b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76\") " pod="openstack/mariadb-client-5-default" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.752280 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlvk8\" (UniqueName: \"kubernetes.io/projected/b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76-kube-api-access-jlvk8\") pod \"mariadb-client-5-default\" (UID: \"b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76\") " pod="openstack/mariadb-client-5-default" Oct 02 12:15:45 crc kubenswrapper[4766]: I1002 12:15:45.907067 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 02 12:15:46 crc kubenswrapper[4766]: I1002 12:15:46.105683 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5kgp9"] Oct 02 12:15:46 crc kubenswrapper[4766]: I1002 12:15:46.441390 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 02 12:15:46 crc kubenswrapper[4766]: W1002 12:15:46.442919 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb54fb70e_9fe4_4d9c_8822_c21c3a1f1d76.slice/crio-4faf26383ea6435d9442c68ee4ee432d7a4faa80bfc21ad586bcffc7a9888317 WatchSource:0}: Error finding container 4faf26383ea6435d9442c68ee4ee432d7a4faa80bfc21ad586bcffc7a9888317: Status 404 returned error can't find the container with id 4faf26383ea6435d9442c68ee4ee432d7a4faa80bfc21ad586bcffc7a9888317 Oct 02 12:15:46 crc kubenswrapper[4766]: I1002 12:15:46.796547 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5kgp9" event={"ID":"b55d18ab-6060-4870-84af-498433ff950e","Type":"ContainerStarted","Data":"52d22f247ea8f5df3c6030b4a81ba4bbb8496db790ede4c0e8ce464440a6aedc"} Oct 02 12:15:46 crc kubenswrapper[4766]: I1002 12:15:46.798260 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76","Type":"ContainerStarted","Data":"4faf26383ea6435d9442c68ee4ee432d7a4faa80bfc21ad586bcffc7a9888317"} Oct 02 12:15:47 crc kubenswrapper[4766]: I1002 12:15:47.811789 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76","Type":"ContainerStarted","Data":"a5d2a45b808b06fbd3a0d9821d2c5bb16074dade85d55825f64a36ee300bc34a"} Oct 02 12:15:47 crc kubenswrapper[4766]: I1002 12:15:47.814582 4766 generic.go:334] "Generic (PLEG): container finished" podID="b55d18ab-6060-4870-84af-498433ff950e" containerID="755cf7b928cc324f5cdb67ee919f1e599e60b3dfb94300263200ab61daed114b" exitCode=0 Oct 02 12:15:47 crc kubenswrapper[4766]: I1002 12:15:47.814638 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5kgp9" event={"ID":"b55d18ab-6060-4870-84af-498433ff950e","Type":"ContainerDied","Data":"755cf7b928cc324f5cdb67ee919f1e599e60b3dfb94300263200ab61daed114b"} Oct 02 12:15:48 crc kubenswrapper[4766]: I1002 12:15:48.829133 4766 generic.go:334] "Generic (PLEG): container finished" podID="b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76" containerID="a5d2a45b808b06fbd3a0d9821d2c5bb16074dade85d55825f64a36ee300bc34a" exitCode=0 Oct 02 12:15:48 crc kubenswrapper[4766]: I1002 12:15:48.829231 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76","Type":"ContainerDied","Data":"a5d2a45b808b06fbd3a0d9821d2c5bb16074dade85d55825f64a36ee300bc34a"} Oct 02 12:15:48 crc kubenswrapper[4766]: I1002 12:15:48.832914 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.215110 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.239392 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76/mariadb-client-5-default/0.log" Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.265291 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.272390 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.305330 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlvk8\" (UniqueName: \"kubernetes.io/projected/b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76-kube-api-access-jlvk8\") pod \"b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76\" (UID: \"b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76\") " Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.312826 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76-kube-api-access-jlvk8" (OuterVolumeSpecName: "kube-api-access-jlvk8") pod "b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76" (UID: "b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76"). InnerVolumeSpecName "kube-api-access-jlvk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.396047 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Oct 02 12:15:50 crc kubenswrapper[4766]: E1002 12:15:50.396460 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76" containerName="mariadb-client-5-default" Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.396481 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76" containerName="mariadb-client-5-default" Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.396709 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76" containerName="mariadb-client-5-default" Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.397632 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.407176 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlvk8\" (UniqueName: \"kubernetes.io/projected/b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76-kube-api-access-jlvk8\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.407451 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.509333 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n4n6\" (UniqueName: \"kubernetes.io/projected/ee239314-0767-49b7-a95c-dd5d87010d85-kube-api-access-9n4n6\") pod \"mariadb-client-6-default\" (UID: \"ee239314-0767-49b7-a95c-dd5d87010d85\") " pod="openstack/mariadb-client-6-default" Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.611749 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n4n6\" (UniqueName: \"kubernetes.io/projected/ee239314-0767-49b7-a95c-dd5d87010d85-kube-api-access-9n4n6\") pod \"mariadb-client-6-default\" (UID: \"ee239314-0767-49b7-a95c-dd5d87010d85\") " pod="openstack/mariadb-client-6-default" Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.633740 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n4n6\" (UniqueName: \"kubernetes.io/projected/ee239314-0767-49b7-a95c-dd5d87010d85-kube-api-access-9n4n6\") pod \"mariadb-client-6-default\" (UID: \"ee239314-0767-49b7-a95c-dd5d87010d85\") " pod="openstack/mariadb-client-6-default" Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.731254 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.865347 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4faf26383ea6435d9442c68ee4ee432d7a4faa80bfc21ad586bcffc7a9888317" Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.865441 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Oct 02 12:15:50 crc kubenswrapper[4766]: I1002 12:15:50.867620 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5kgp9" event={"ID":"b55d18ab-6060-4870-84af-498433ff950e","Type":"ContainerStarted","Data":"0166e4f8f01efbaa013a505584021e61749f78bd52a353cb7d1762d632a2eca5"} Oct 02 12:15:51 crc kubenswrapper[4766]: I1002 12:15:51.273716 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 02 12:15:51 crc kubenswrapper[4766]: I1002 12:15:51.878676 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"ee239314-0767-49b7-a95c-dd5d87010d85","Type":"ContainerStarted","Data":"257bf96cd249f58dc74ae8c124eab422fd626a996ce99eb8aa4a4f43449af723"} Oct 02 12:15:51 crc kubenswrapper[4766]: I1002 12:15:51.878760 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"ee239314-0767-49b7-a95c-dd5d87010d85","Type":"ContainerStarted","Data":"20e3c48e6adf20d06c77247d14d1b889768552b198e1c097d0cc0d123ad61fbe"} Oct 02 12:15:51 crc kubenswrapper[4766]: I1002 12:15:51.884374 4766 generic.go:334] "Generic (PLEG): container finished" podID="b55d18ab-6060-4870-84af-498433ff950e" containerID="0166e4f8f01efbaa013a505584021e61749f78bd52a353cb7d1762d632a2eca5" exitCode=0 Oct 02 12:15:51 crc kubenswrapper[4766]: I1002 12:15:51.893674 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76" path="/var/lib/kubelet/pods/b54fb70e-9fe4-4d9c-8822-c21c3a1f1d76/volumes" Oct 02 12:15:51 crc kubenswrapper[4766]: I1002 12:15:51.894745 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5kgp9" event={"ID":"b55d18ab-6060-4870-84af-498433ff950e","Type":"ContainerDied","Data":"0166e4f8f01efbaa013a505584021e61749f78bd52a353cb7d1762d632a2eca5"} Oct 02 12:15:51 crc kubenswrapper[4766]: I1002 12:15:51.903014 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=1.902977546 podStartE2EDuration="1.902977546s" podCreationTimestamp="2025-10-02 12:15:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:15:51.893867335 +0000 UTC m=+5066.836738279" watchObservedRunningTime="2025-10-02 12:15:51.902977546 +0000 UTC m=+5066.845848490" Oct 02 12:15:52 crc kubenswrapper[4766]: I1002 12:15:52.881955 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:15:52 crc kubenswrapper[4766]: E1002 12:15:52.883322 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:15:52 crc kubenswrapper[4766]: I1002 12:15:52.896884 4766 generic.go:334] "Generic (PLEG): container finished" podID="ee239314-0767-49b7-a95c-dd5d87010d85" containerID="257bf96cd249f58dc74ae8c124eab422fd626a996ce99eb8aa4a4f43449af723" exitCode=0 Oct 02 12:15:52 crc kubenswrapper[4766]: I1002 12:15:52.896991 4766 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"ee239314-0767-49b7-a95c-dd5d87010d85","Type":"ContainerDied","Data":"257bf96cd249f58dc74ae8c124eab422fd626a996ce99eb8aa4a4f43449af723"} Oct 02 12:15:52 crc kubenswrapper[4766]: I1002 12:15:52.900148 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5kgp9" event={"ID":"b55d18ab-6060-4870-84af-498433ff950e","Type":"ContainerStarted","Data":"a325f979ed495ab2dd8d9663f086660304a19481f98caead863435c7eaf6ee1c"} Oct 02 12:15:52 crc kubenswrapper[4766]: I1002 12:15:52.945121 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5kgp9" podStartSLOduration=4.33697908 podStartE2EDuration="7.945096201s" podCreationTimestamp="2025-10-02 12:15:45 +0000 UTC" firstStartedPulling="2025-10-02 12:15:48.832634343 +0000 UTC m=+5063.775505287" lastFinishedPulling="2025-10-02 12:15:52.440751464 +0000 UTC m=+5067.383622408" observedRunningTime="2025-10-02 12:15:52.937710005 +0000 UTC m=+5067.880580969" watchObservedRunningTime="2025-10-02 12:15:52.945096201 +0000 UTC m=+5067.887967145" Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.265557 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.304634 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.311255 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.376417 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n4n6\" (UniqueName: \"kubernetes.io/projected/ee239314-0767-49b7-a95c-dd5d87010d85-kube-api-access-9n4n6\") pod \"ee239314-0767-49b7-a95c-dd5d87010d85\" (UID: \"ee239314-0767-49b7-a95c-dd5d87010d85\") " Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.383627 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee239314-0767-49b7-a95c-dd5d87010d85-kube-api-access-9n4n6" (OuterVolumeSpecName: "kube-api-access-9n4n6") pod "ee239314-0767-49b7-a95c-dd5d87010d85" (UID: "ee239314-0767-49b7-a95c-dd5d87010d85"). InnerVolumeSpecName "kube-api-access-9n4n6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.441901 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Oct 02 12:15:54 crc kubenswrapper[4766]: E1002 12:15:54.442255 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee239314-0767-49b7-a95c-dd5d87010d85" containerName="mariadb-client-6-default" Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.442278 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee239314-0767-49b7-a95c-dd5d87010d85" containerName="mariadb-client-6-default" Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.442433 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee239314-0767-49b7-a95c-dd5d87010d85" containerName="mariadb-client-6-default" Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.442968 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.456278 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.478930 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbv2q\" (UniqueName: \"kubernetes.io/projected/5269fe2f-f673-458f-8cbf-c8c7f9466388-kube-api-access-qbv2q\") pod \"mariadb-client-7-default\" (UID: \"5269fe2f-f673-458f-8cbf-c8c7f9466388\") " pod="openstack/mariadb-client-7-default" Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.479232 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n4n6\" (UniqueName: \"kubernetes.io/projected/ee239314-0767-49b7-a95c-dd5d87010d85-kube-api-access-9n4n6\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.580541 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbv2q\" (UniqueName: \"kubernetes.io/projected/5269fe2f-f673-458f-8cbf-c8c7f9466388-kube-api-access-qbv2q\") pod \"mariadb-client-7-default\" (UID: \"5269fe2f-f673-458f-8cbf-c8c7f9466388\") " pod="openstack/mariadb-client-7-default" Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.597084 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbv2q\" (UniqueName: \"kubernetes.io/projected/5269fe2f-f673-458f-8cbf-c8c7f9466388-kube-api-access-qbv2q\") pod \"mariadb-client-7-default\" (UID: \"5269fe2f-f673-458f-8cbf-c8c7f9466388\") " pod="openstack/mariadb-client-7-default" Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.773483 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.926911 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20e3c48e6adf20d06c77247d14d1b889768552b198e1c097d0cc0d123ad61fbe" Oct 02 12:15:54 crc kubenswrapper[4766]: I1002 12:15:54.926993 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Oct 02 12:15:55 crc kubenswrapper[4766]: I1002 12:15:55.316091 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 02 12:15:55 crc kubenswrapper[4766]: I1002 12:15:55.636822 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:15:55 crc kubenswrapper[4766]: I1002 12:15:55.637341 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:15:55 crc kubenswrapper[4766]: I1002 12:15:55.895810 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee239314-0767-49b7-a95c-dd5d87010d85" path="/var/lib/kubelet/pods/ee239314-0767-49b7-a95c-dd5d87010d85/volumes" Oct 02 12:15:55 crc kubenswrapper[4766]: I1002 12:15:55.937033 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"5269fe2f-f673-458f-8cbf-c8c7f9466388","Type":"ContainerDied","Data":"0ed1837e834eba5fd7adc86145d761c5ce6ee988f10c66a64493636dfeebabf3"} Oct 02 12:15:55 crc kubenswrapper[4766]: I1002 12:15:55.936965 4766 generic.go:334] "Generic (PLEG): container finished" podID="5269fe2f-f673-458f-8cbf-c8c7f9466388" containerID="0ed1837e834eba5fd7adc86145d761c5ce6ee988f10c66a64493636dfeebabf3" exitCode=0 Oct 02 12:15:55 crc kubenswrapper[4766]: I1002 12:15:55.937133 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"5269fe2f-f673-458f-8cbf-c8c7f9466388","Type":"ContainerStarted","Data":"757af6732c19d2dfc5dc4926519ba0c0368465372794a688c314616ccc84aec7"} Oct 02 12:15:56 crc kubenswrapper[4766]: I1002 12:15:56.683584 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5kgp9" podUID="b55d18ab-6060-4870-84af-498433ff950e" containerName="registry-server" probeResult="failure" output=< Oct 02 12:15:56 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Oct 02 12:15:56 crc kubenswrapper[4766]: > Oct 02 12:15:57 crc kubenswrapper[4766]: I1002 12:15:57.695078 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 02 12:15:57 crc kubenswrapper[4766]: I1002 12:15:57.717693 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_5269fe2f-f673-458f-8cbf-c8c7f9466388/mariadb-client-7-default/0.log" Oct 02 12:15:57 crc kubenswrapper[4766]: I1002 12:15:57.734780 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbv2q\" (UniqueName: \"kubernetes.io/projected/5269fe2f-f673-458f-8cbf-c8c7f9466388-kube-api-access-qbv2q\") pod \"5269fe2f-f673-458f-8cbf-c8c7f9466388\" (UID: \"5269fe2f-f673-458f-8cbf-c8c7f9466388\") " Oct 02 12:15:57 crc kubenswrapper[4766]: I1002 12:15:57.744277 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5269fe2f-f673-458f-8cbf-c8c7f9466388-kube-api-access-qbv2q" (OuterVolumeSpecName: "kube-api-access-qbv2q") pod "5269fe2f-f673-458f-8cbf-c8c7f9466388" (UID: "5269fe2f-f673-458f-8cbf-c8c7f9466388"). InnerVolumeSpecName "kube-api-access-qbv2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:15:57 crc kubenswrapper[4766]: I1002 12:15:57.769868 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 02 12:15:57 crc kubenswrapper[4766]: I1002 12:15:57.775825 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Oct 02 12:15:57 crc kubenswrapper[4766]: I1002 12:15:57.837350 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbv2q\" (UniqueName: \"kubernetes.io/projected/5269fe2f-f673-458f-8cbf-c8c7f9466388-kube-api-access-qbv2q\") on node \"crc\" DevicePath \"\"" Oct 02 12:15:57 crc kubenswrapper[4766]: I1002 12:15:57.896189 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5269fe2f-f673-458f-8cbf-c8c7f9466388" path="/var/lib/kubelet/pods/5269fe2f-f673-458f-8cbf-c8c7f9466388/volumes" Oct 02 12:15:57 crc kubenswrapper[4766]: I1002 12:15:57.909367 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Oct 02 12:15:57 crc kubenswrapper[4766]: E1002 12:15:57.910008 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5269fe2f-f673-458f-8cbf-c8c7f9466388" containerName="mariadb-client-7-default" Oct 02 12:15:57 crc kubenswrapper[4766]: I1002 12:15:57.910036 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5269fe2f-f673-458f-8cbf-c8c7f9466388" containerName="mariadb-client-7-default" Oct 02 12:15:57 crc kubenswrapper[4766]: I1002 12:15:57.910255 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5269fe2f-f673-458f-8cbf-c8c7f9466388" containerName="mariadb-client-7-default" Oct 02 12:15:57 crc kubenswrapper[4766]: I1002 12:15:57.910917 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 02 12:15:57 crc kubenswrapper[4766]: I1002 12:15:57.915206 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 02 12:15:57 crc kubenswrapper[4766]: I1002 12:15:57.958751 4766 scope.go:117] "RemoveContainer" containerID="0ed1837e834eba5fd7adc86145d761c5ce6ee988f10c66a64493636dfeebabf3" Oct 02 12:15:57 crc kubenswrapper[4766]: I1002 12:15:57.958796 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Oct 02 12:15:58 crc kubenswrapper[4766]: I1002 12:15:58.040166 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdwsm\" (UniqueName: \"kubernetes.io/projected/cafa80de-c8f4-4d11-a1c9-c13b9247988a-kube-api-access-wdwsm\") pod \"mariadb-client-2\" (UID: \"cafa80de-c8f4-4d11-a1c9-c13b9247988a\") " pod="openstack/mariadb-client-2" Oct 02 12:15:58 crc kubenswrapper[4766]: I1002 12:15:58.141726 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdwsm\" (UniqueName: \"kubernetes.io/projected/cafa80de-c8f4-4d11-a1c9-c13b9247988a-kube-api-access-wdwsm\") pod \"mariadb-client-2\" (UID: \"cafa80de-c8f4-4d11-a1c9-c13b9247988a\") " pod="openstack/mariadb-client-2" Oct 02 12:15:58 crc kubenswrapper[4766]: I1002 12:15:58.158903 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdwsm\" (UniqueName: \"kubernetes.io/projected/cafa80de-c8f4-4d11-a1c9-c13b9247988a-kube-api-access-wdwsm\") pod \"mariadb-client-2\" (UID: \"cafa80de-c8f4-4d11-a1c9-c13b9247988a\") " pod="openstack/mariadb-client-2" Oct 02 12:15:58 crc kubenswrapper[4766]: I1002 12:15:58.238340 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Oct 02 12:15:58 crc kubenswrapper[4766]: I1002 12:15:58.787433 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Oct 02 12:15:58 crc kubenswrapper[4766]: W1002 12:15:58.797277 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcafa80de_c8f4_4d11_a1c9_c13b9247988a.slice/crio-7f7424d48f02f3335a72d0d0105995e66e2e27b8b14e5a4de8d3a7beb0c4a626 WatchSource:0}: Error finding container 7f7424d48f02f3335a72d0d0105995e66e2e27b8b14e5a4de8d3a7beb0c4a626: Status 404 returned error can't find the container with id 7f7424d48f02f3335a72d0d0105995e66e2e27b8b14e5a4de8d3a7beb0c4a626 Oct 02 12:15:58 crc kubenswrapper[4766]: I1002 12:15:58.974620 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"cafa80de-c8f4-4d11-a1c9-c13b9247988a","Type":"ContainerStarted","Data":"7f7424d48f02f3335a72d0d0105995e66e2e27b8b14e5a4de8d3a7beb0c4a626"} Oct 02 12:15:59 crc kubenswrapper[4766]: I1002 12:15:59.988269 4766 generic.go:334] "Generic (PLEG): container finished" podID="cafa80de-c8f4-4d11-a1c9-c13b9247988a" containerID="9074f8e893892e38b7e65bdee66076339126175bd0e84fbb9e06af1fbc1a4413" exitCode=0 Oct 02 12:15:59 crc kubenswrapper[4766]: I1002 12:15:59.988322 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"cafa80de-c8f4-4d11-a1c9-c13b9247988a","Type":"ContainerDied","Data":"9074f8e893892e38b7e65bdee66076339126175bd0e84fbb9e06af1fbc1a4413"} Oct 02 12:16:01 crc kubenswrapper[4766]: I1002 12:16:01.385975 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 02 12:16:01 crc kubenswrapper[4766]: I1002 12:16:01.409084 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_cafa80de-c8f4-4d11-a1c9-c13b9247988a/mariadb-client-2/0.log" Oct 02 12:16:01 crc kubenswrapper[4766]: I1002 12:16:01.441351 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Oct 02 12:16:01 crc kubenswrapper[4766]: I1002 12:16:01.446767 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Oct 02 12:16:01 crc kubenswrapper[4766]: I1002 12:16:01.506730 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdwsm\" (UniqueName: \"kubernetes.io/projected/cafa80de-c8f4-4d11-a1c9-c13b9247988a-kube-api-access-wdwsm\") pod \"cafa80de-c8f4-4d11-a1c9-c13b9247988a\" (UID: \"cafa80de-c8f4-4d11-a1c9-c13b9247988a\") " Oct 02 12:16:01 crc kubenswrapper[4766]: I1002 12:16:01.515016 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cafa80de-c8f4-4d11-a1c9-c13b9247988a-kube-api-access-wdwsm" (OuterVolumeSpecName: "kube-api-access-wdwsm") pod "cafa80de-c8f4-4d11-a1c9-c13b9247988a" (UID: "cafa80de-c8f4-4d11-a1c9-c13b9247988a"). InnerVolumeSpecName "kube-api-access-wdwsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:16:01 crc kubenswrapper[4766]: I1002 12:16:01.608548 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdwsm\" (UniqueName: \"kubernetes.io/projected/cafa80de-c8f4-4d11-a1c9-c13b9247988a-kube-api-access-wdwsm\") on node \"crc\" DevicePath \"\"" Oct 02 12:16:01 crc kubenswrapper[4766]: I1002 12:16:01.892162 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cafa80de-c8f4-4d11-a1c9-c13b9247988a" path="/var/lib/kubelet/pods/cafa80de-c8f4-4d11-a1c9-c13b9247988a/volumes" Oct 02 12:16:02 crc kubenswrapper[4766]: I1002 12:16:02.009209 4766 scope.go:117] "RemoveContainer" containerID="9074f8e893892e38b7e65bdee66076339126175bd0e84fbb9e06af1fbc1a4413" Oct 02 12:16:02 crc kubenswrapper[4766]: I1002 12:16:02.009338 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Oct 02 12:16:04 crc kubenswrapper[4766]: I1002 12:16:04.881677 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:16:04 crc kubenswrapper[4766]: E1002 12:16:04.882336 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:16:05 crc kubenswrapper[4766]: I1002 12:16:05.697411 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:16:05 crc kubenswrapper[4766]: I1002 12:16:05.749630 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:16:05 crc kubenswrapper[4766]: I1002 12:16:05.945403 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5kgp9"] Oct 02 12:16:07 crc kubenswrapper[4766]: I1002 12:16:07.068219 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5kgp9" podUID="b55d18ab-6060-4870-84af-498433ff950e" containerName="registry-server" containerID="cri-o://a325f979ed495ab2dd8d9663f086660304a19481f98caead863435c7eaf6ee1c" gracePeriod=2 Oct 02 12:16:07 crc kubenswrapper[4766]: I1002 12:16:07.451424 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:16:07 crc kubenswrapper[4766]: I1002 12:16:07.519410 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55d18ab-6060-4870-84af-498433ff950e-utilities\") pod \"b55d18ab-6060-4870-84af-498433ff950e\" (UID: \"b55d18ab-6060-4870-84af-498433ff950e\") " Oct 02 12:16:07 crc kubenswrapper[4766]: I1002 12:16:07.519603 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsb54\" (UniqueName: \"kubernetes.io/projected/b55d18ab-6060-4870-84af-498433ff950e-kube-api-access-zsb54\") pod \"b55d18ab-6060-4870-84af-498433ff950e\" (UID: \"b55d18ab-6060-4870-84af-498433ff950e\") " Oct 02 12:16:07 crc kubenswrapper[4766]: I1002 12:16:07.519697 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55d18ab-6060-4870-84af-498433ff950e-catalog-content\") pod \"b55d18ab-6060-4870-84af-498433ff950e\" (UID: \"b55d18ab-6060-4870-84af-498433ff950e\") " Oct 02 12:16:07 crc kubenswrapper[4766]: I1002 12:16:07.520945 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b55d18ab-6060-4870-84af-498433ff950e-utilities" (OuterVolumeSpecName: "utilities") pod "b55d18ab-6060-4870-84af-498433ff950e" (UID: "b55d18ab-6060-4870-84af-498433ff950e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:16:07 crc kubenswrapper[4766]: I1002 12:16:07.527720 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b55d18ab-6060-4870-84af-498433ff950e-kube-api-access-zsb54" (OuterVolumeSpecName: "kube-api-access-zsb54") pod "b55d18ab-6060-4870-84af-498433ff950e" (UID: "b55d18ab-6060-4870-84af-498433ff950e"). InnerVolumeSpecName "kube-api-access-zsb54". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:16:07 crc kubenswrapper[4766]: I1002 12:16:07.610660 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b55d18ab-6060-4870-84af-498433ff950e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b55d18ab-6060-4870-84af-498433ff950e" (UID: "b55d18ab-6060-4870-84af-498433ff950e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:16:07 crc kubenswrapper[4766]: I1002 12:16:07.621364 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55d18ab-6060-4870-84af-498433ff950e-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:16:07 crc kubenswrapper[4766]: I1002 12:16:07.621407 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsb54\" (UniqueName: \"kubernetes.io/projected/b55d18ab-6060-4870-84af-498433ff950e-kube-api-access-zsb54\") on node \"crc\" DevicePath \"\"" Oct 02 12:16:07 crc kubenswrapper[4766]: I1002 12:16:07.621419 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55d18ab-6060-4870-84af-498433ff950e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:16:08 crc kubenswrapper[4766]: I1002 12:16:08.078683 4766 generic.go:334] "Generic (PLEG): container finished" podID="b55d18ab-6060-4870-84af-498433ff950e" containerID="a325f979ed495ab2dd8d9663f086660304a19481f98caead863435c7eaf6ee1c" exitCode=0 Oct 02 12:16:08 crc kubenswrapper[4766]: I1002 12:16:08.078757 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5kgp9" Oct 02 12:16:08 crc kubenswrapper[4766]: I1002 12:16:08.078752 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5kgp9" event={"ID":"b55d18ab-6060-4870-84af-498433ff950e","Type":"ContainerDied","Data":"a325f979ed495ab2dd8d9663f086660304a19481f98caead863435c7eaf6ee1c"} Oct 02 12:16:08 crc kubenswrapper[4766]: I1002 12:16:08.078843 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5kgp9" event={"ID":"b55d18ab-6060-4870-84af-498433ff950e","Type":"ContainerDied","Data":"52d22f247ea8f5df3c6030b4a81ba4bbb8496db790ede4c0e8ce464440a6aedc"} Oct 02 12:16:08 crc kubenswrapper[4766]: I1002 12:16:08.078872 4766 scope.go:117] "RemoveContainer" containerID="a325f979ed495ab2dd8d9663f086660304a19481f98caead863435c7eaf6ee1c" Oct 02 12:16:08 crc kubenswrapper[4766]: I1002 12:16:08.110229 4766 scope.go:117] "RemoveContainer" containerID="0166e4f8f01efbaa013a505584021e61749f78bd52a353cb7d1762d632a2eca5" Oct 02 12:16:08 crc kubenswrapper[4766]: I1002 12:16:08.112486 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5kgp9"] Oct 02 12:16:08 crc kubenswrapper[4766]: I1002 12:16:08.121850 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5kgp9"] Oct 02 12:16:08 crc kubenswrapper[4766]: I1002 12:16:08.136127 4766 scope.go:117] "RemoveContainer" containerID="755cf7b928cc324f5cdb67ee919f1e599e60b3dfb94300263200ab61daed114b" Oct 02 12:16:08 crc kubenswrapper[4766]: I1002 12:16:08.169095 4766 scope.go:117] "RemoveContainer" containerID="a325f979ed495ab2dd8d9663f086660304a19481f98caead863435c7eaf6ee1c" Oct 02 12:16:08 crc kubenswrapper[4766]: E1002 12:16:08.169684 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a325f979ed495ab2dd8d9663f086660304a19481f98caead863435c7eaf6ee1c\": container with ID starting with a325f979ed495ab2dd8d9663f086660304a19481f98caead863435c7eaf6ee1c not found: ID does not exist" containerID="a325f979ed495ab2dd8d9663f086660304a19481f98caead863435c7eaf6ee1c" Oct 02 12:16:08 crc kubenswrapper[4766]: I1002 12:16:08.169750 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a325f979ed495ab2dd8d9663f086660304a19481f98caead863435c7eaf6ee1c"} err="failed to get container status \"a325f979ed495ab2dd8d9663f086660304a19481f98caead863435c7eaf6ee1c\": rpc error: code = NotFound desc = could not find container \"a325f979ed495ab2dd8d9663f086660304a19481f98caead863435c7eaf6ee1c\": container with ID starting with a325f979ed495ab2dd8d9663f086660304a19481f98caead863435c7eaf6ee1c not found: ID does not exist" Oct 02 12:16:08 crc kubenswrapper[4766]: I1002 12:16:08.169774 4766 scope.go:117] "RemoveContainer" containerID="0166e4f8f01efbaa013a505584021e61749f78bd52a353cb7d1762d632a2eca5" Oct 02 12:16:08 crc kubenswrapper[4766]: E1002 12:16:08.170241 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0166e4f8f01efbaa013a505584021e61749f78bd52a353cb7d1762d632a2eca5\": container with ID starting with 0166e4f8f01efbaa013a505584021e61749f78bd52a353cb7d1762d632a2eca5 not found: ID does not exist" containerID="0166e4f8f01efbaa013a505584021e61749f78bd52a353cb7d1762d632a2eca5" Oct 02 12:16:08 crc kubenswrapper[4766]: I1002 12:16:08.170266 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0166e4f8f01efbaa013a505584021e61749f78bd52a353cb7d1762d632a2eca5"} err="failed to get container status \"0166e4f8f01efbaa013a505584021e61749f78bd52a353cb7d1762d632a2eca5\": rpc error: code = NotFound desc = could not find container \"0166e4f8f01efbaa013a505584021e61749f78bd52a353cb7d1762d632a2eca5\": container with ID starting with 0166e4f8f01efbaa013a505584021e61749f78bd52a353cb7d1762d632a2eca5 not found: ID does not exist" Oct 02 12:16:08 crc kubenswrapper[4766]: I1002 12:16:08.170301 4766 scope.go:117] "RemoveContainer" containerID="755cf7b928cc324f5cdb67ee919f1e599e60b3dfb94300263200ab61daed114b" Oct 02 12:16:08 crc kubenswrapper[4766]: E1002 12:16:08.170951 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"755cf7b928cc324f5cdb67ee919f1e599e60b3dfb94300263200ab61daed114b\": container with ID starting with 755cf7b928cc324f5cdb67ee919f1e599e60b3dfb94300263200ab61daed114b not found: ID does not exist" containerID="755cf7b928cc324f5cdb67ee919f1e599e60b3dfb94300263200ab61daed114b" Oct 02 12:16:08 crc kubenswrapper[4766]: I1002 12:16:08.170978 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"755cf7b928cc324f5cdb67ee919f1e599e60b3dfb94300263200ab61daed114b"} err="failed to get container status \"755cf7b928cc324f5cdb67ee919f1e599e60b3dfb94300263200ab61daed114b\": rpc error: code = NotFound desc = could not find container \"755cf7b928cc324f5cdb67ee919f1e599e60b3dfb94300263200ab61daed114b\": container with ID starting with 755cf7b928cc324f5cdb67ee919f1e599e60b3dfb94300263200ab61daed114b not found: ID does not exist" Oct 02 12:16:09 crc kubenswrapper[4766]: I1002 12:16:09.892684 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b55d18ab-6060-4870-84af-498433ff950e" path="/var/lib/kubelet/pods/b55d18ab-6060-4870-84af-498433ff950e/volumes" Oct 02 12:16:17 crc kubenswrapper[4766]: I1002 12:16:17.882024 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:16:17 crc kubenswrapper[4766]: E1002 12:16:17.883159 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:16:32 crc kubenswrapper[4766]: I1002 12:16:32.880896 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:16:32 crc kubenswrapper[4766]: E1002 12:16:32.881761 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:16:34 crc kubenswrapper[4766]: I1002 12:16:34.955323 4766 scope.go:117] "RemoveContainer" containerID="27a4033829e97c9864d1d7d2ef5a66362db8f2c09e7a0b22c0abdc34c63ef5aa" Oct 02 12:16:45 crc 
kubenswrapper[4766]: I1002 12:16:45.890099 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:16:45 crc kubenswrapper[4766]: E1002 12:16:45.891173 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:16:57 crc kubenswrapper[4766]: I1002 12:16:57.881406 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:16:57 crc kubenswrapper[4766]: E1002 12:16:57.882292 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:17:12 crc kubenswrapper[4766]: I1002 12:17:12.882805 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:17:12 crc kubenswrapper[4766]: E1002 12:17:12.883981 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:17:27 crc kubenswrapper[4766]: I1002 12:17:27.882309 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:17:27 crc kubenswrapper[4766]: E1002 12:17:27.883233 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:17:41 crc kubenswrapper[4766]: I1002 12:17:41.881300 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:17:41 crc kubenswrapper[4766]: E1002 12:17:41.883656 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:17:56 crc kubenswrapper[4766]: I1002 12:17:56.882664 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:17:56 crc 
kubenswrapper[4766]: E1002 12:17:56.883706 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:18:08 crc kubenswrapper[4766]: I1002 12:18:08.881659 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:18:08 crc kubenswrapper[4766]: E1002 12:18:08.885042 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:18:22 crc kubenswrapper[4766]: I1002 12:18:22.881368 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:18:22 crc kubenswrapper[4766]: E1002 12:18:22.882472 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:18:36 crc kubenswrapper[4766]: I1002 12:18:36.881698 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:18:36 crc kubenswrapper[4766]: E1002 12:18:36.882842 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:18:50 crc kubenswrapper[4766]: I1002 12:18:50.882401 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:18:50 crc kubenswrapper[4766]: E1002 12:18:50.885162 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:19:02 crc kubenswrapper[4766]: I1002 12:19:02.881859 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:19:02 crc kubenswrapper[4766]: E1002 12:19:02.882881 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:19:13 crc kubenswrapper[4766]: I1002 12:19:13.882655 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:19:13 crc kubenswrapper[4766]: E1002 12:19:13.883760 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:19:24 crc kubenswrapper[4766]: I1002 12:19:24.881807 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:19:25 crc kubenswrapper[4766]: I1002 12:19:25.838297 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"7be1eb90dbe4a2beb104498de2e75466d69e548c7f34b0f0f5bbe74fe4681dd4"} Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.478269 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Oct 02 12:19:57 crc kubenswrapper[4766]: E1002 12:19:57.479648 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b55d18ab-6060-4870-84af-498433ff950e" containerName="extract-utilities" Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.479667 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b55d18ab-6060-4870-84af-498433ff950e" containerName="extract-utilities" Oct 02 12:19:57 crc kubenswrapper[4766]: E1002 12:19:57.479717 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b55d18ab-6060-4870-84af-498433ff950e" containerName="extract-content" Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.479727 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b55d18ab-6060-4870-84af-498433ff950e" containerName="extract-content" Oct 02 12:19:57 crc kubenswrapper[4766]: E1002 12:19:57.479749 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b55d18ab-6060-4870-84af-498433ff950e" containerName="registry-server" Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.479757 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b55d18ab-6060-4870-84af-498433ff950e" containerName="registry-server" Oct 02 12:19:57 crc kubenswrapper[4766]: E1002 12:19:57.479766 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cafa80de-c8f4-4d11-a1c9-c13b9247988a" containerName="mariadb-client-2" Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.479773 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="cafa80de-c8f4-4d11-a1c9-c13b9247988a" containerName="mariadb-client-2" Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.480006 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="cafa80de-c8f4-4d11-a1c9-c13b9247988a" containerName="mariadb-client-2" Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.480027 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b55d18ab-6060-4870-84af-498433ff950e" containerName="registry-server" Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.480811 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.483975 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-zvb69" Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.488410 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.540523 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9m6q\" (UniqueName: \"kubernetes.io/projected/c82724de-a001-4f24-83ca-aa7d76bb293f-kube-api-access-d9m6q\") pod \"mariadb-copy-data\" (UID: \"c82724de-a001-4f24-83ca-aa7d76bb293f\") " pod="openstack/mariadb-copy-data" Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.540865 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0a82b6f6-eef7-42b0-8a3d-02ab3b548175\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a82b6f6-eef7-42b0-8a3d-02ab3b548175\") pod \"mariadb-copy-data\" (UID: \"c82724de-a001-4f24-83ca-aa7d76bb293f\") " pod="openstack/mariadb-copy-data" Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.642618 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9m6q\" (UniqueName: \"kubernetes.io/projected/c82724de-a001-4f24-83ca-aa7d76bb293f-kube-api-access-d9m6q\") pod \"mariadb-copy-data\" (UID: \"c82724de-a001-4f24-83ca-aa7d76bb293f\") " pod="openstack/mariadb-copy-data" Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.642675 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0a82b6f6-eef7-42b0-8a3d-02ab3b548175\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a82b6f6-eef7-42b0-8a3d-02ab3b548175\") pod \"mariadb-copy-data\" (UID: \"c82724de-a001-4f24-83ca-aa7d76bb293f\") " pod="openstack/mariadb-copy-data" Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.647382 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.647447 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0a82b6f6-eef7-42b0-8a3d-02ab3b548175\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a82b6f6-eef7-42b0-8a3d-02ab3b548175\") pod \"mariadb-copy-data\" (UID: \"c82724de-a001-4f24-83ca-aa7d76bb293f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c1d5fc6e8302bd68a5bea02867c6422dc4080d69c036e722554bef127ad705b3/globalmount\"" pod="openstack/mariadb-copy-data" Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.671659 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9m6q\" (UniqueName: \"kubernetes.io/projected/c82724de-a001-4f24-83ca-aa7d76bb293f-kube-api-access-d9m6q\") pod \"mariadb-copy-data\" (UID: \"c82724de-a001-4f24-83ca-aa7d76bb293f\") " pod="openstack/mariadb-copy-data" Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.679681 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0a82b6f6-eef7-42b0-8a3d-02ab3b548175\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0a82b6f6-eef7-42b0-8a3d-02ab3b548175\") pod \"mariadb-copy-data\" (UID: \"c82724de-a001-4f24-83ca-aa7d76bb293f\") " pod="openstack/mariadb-copy-data" Oct 02 12:19:57 crc kubenswrapper[4766]: I1002 12:19:57.812131 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Oct 02 12:19:58 crc kubenswrapper[4766]: I1002 12:19:58.327898 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Oct 02 12:19:59 crc kubenswrapper[4766]: I1002 12:19:59.106016 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"c82724de-a001-4f24-83ca-aa7d76bb293f","Type":"ContainerStarted","Data":"b05a51bf407471eca1e38e6d4bbd5b1631389222360f38a63a0a71009c52275b"} Oct 02 12:19:59 crc kubenswrapper[4766]: I1002 12:19:59.106397 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"c82724de-a001-4f24-83ca-aa7d76bb293f","Type":"ContainerStarted","Data":"872e5a70b0b00e775514d1c081c774fd6bff0c43b3e03966e2c4aab8a4748648"} Oct 02 12:19:59 crc kubenswrapper[4766]: I1002 12:19:59.133833 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.133803438 podStartE2EDuration="3.133803438s" podCreationTimestamp="2025-10-02 12:19:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:19:59.124793659 +0000 UTC m=+5314.067664613" watchObservedRunningTime="2025-10-02 12:19:59.133803438 +0000 UTC m=+5314.076674382" Oct 02 12:20:00 crc kubenswrapper[4766]: I1002 12:20:00.727788 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 02 12:20:00 crc kubenswrapper[4766]: I1002 12:20:00.729368 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 02 12:20:00 crc kubenswrapper[4766]: I1002 12:20:00.740469 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 02 12:20:00 crc kubenswrapper[4766]: I1002 12:20:00.900444 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkl48\" (UniqueName: \"kubernetes.io/projected/412cbc56-4865-45fd-b201-41d437ed03c7-kube-api-access-jkl48\") pod \"mariadb-client\" (UID: \"412cbc56-4865-45fd-b201-41d437ed03c7\") " pod="openstack/mariadb-client" Oct 02 12:20:01 crc kubenswrapper[4766]: I1002 12:20:01.002788 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkl48\" (UniqueName: \"kubernetes.io/projected/412cbc56-4865-45fd-b201-41d437ed03c7-kube-api-access-jkl48\") pod \"mariadb-client\" (UID: \"412cbc56-4865-45fd-b201-41d437ed03c7\") " pod="openstack/mariadb-client" Oct 02 12:20:01 crc kubenswrapper[4766]: I1002 12:20:01.027920 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkl48\" (UniqueName: \"kubernetes.io/projected/412cbc56-4865-45fd-b201-41d437ed03c7-kube-api-access-jkl48\") pod \"mariadb-client\" (UID: \"412cbc56-4865-45fd-b201-41d437ed03c7\") " pod="openstack/mariadb-client" Oct 02 12:20:01 crc kubenswrapper[4766]: I1002 12:20:01.057104 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 02 12:20:01 crc kubenswrapper[4766]: I1002 12:20:01.505531 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 02 12:20:01 crc kubenswrapper[4766]: W1002 12:20:01.507727 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod412cbc56_4865_45fd_b201_41d437ed03c7.slice/crio-f75eb19a62136d370a80caa1f0b0259a41ebfafa155b8dcd5e563c4bb4d7fc38 WatchSource:0}: Error finding container f75eb19a62136d370a80caa1f0b0259a41ebfafa155b8dcd5e563c4bb4d7fc38: Status 404 returned error can't find the container with id f75eb19a62136d370a80caa1f0b0259a41ebfafa155b8dcd5e563c4bb4d7fc38 Oct 02 12:20:02 crc kubenswrapper[4766]: I1002 12:20:02.133261 4766 generic.go:334] "Generic (PLEG): container finished" podID="412cbc56-4865-45fd-b201-41d437ed03c7" containerID="50fc7bcc0a9fa87abf66a05b3c98f12a9b8f4103a70963f52033b126432f7377" exitCode=0 Oct 02 12:20:02 crc kubenswrapper[4766]: I1002 12:20:02.133321 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"412cbc56-4865-45fd-b201-41d437ed03c7","Type":"ContainerDied","Data":"50fc7bcc0a9fa87abf66a05b3c98f12a9b8f4103a70963f52033b126432f7377"} Oct 02 12:20:02 crc kubenswrapper[4766]: I1002 12:20:02.133350 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"412cbc56-4865-45fd-b201-41d437ed03c7","Type":"ContainerStarted","Data":"f75eb19a62136d370a80caa1f0b0259a41ebfafa155b8dcd5e563c4bb4d7fc38"} Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.456560 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.487223 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_412cbc56-4865-45fd-b201-41d437ed03c7/mariadb-client/0.log" Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.513054 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.519274 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.555409 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkl48\" (UniqueName: \"kubernetes.io/projected/412cbc56-4865-45fd-b201-41d437ed03c7-kube-api-access-jkl48\") pod \"412cbc56-4865-45fd-b201-41d437ed03c7\" (UID: \"412cbc56-4865-45fd-b201-41d437ed03c7\") " Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.563194 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/412cbc56-4865-45fd-b201-41d437ed03c7-kube-api-access-jkl48" (OuterVolumeSpecName: "kube-api-access-jkl48") pod "412cbc56-4865-45fd-b201-41d437ed03c7" (UID: "412cbc56-4865-45fd-b201-41d437ed03c7"). InnerVolumeSpecName "kube-api-access-jkl48". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.635009 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Oct 02 12:20:03 crc kubenswrapper[4766]: E1002 12:20:03.635913 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412cbc56-4865-45fd-b201-41d437ed03c7" containerName="mariadb-client" Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.635943 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="412cbc56-4865-45fd-b201-41d437ed03c7" containerName="mariadb-client" Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.636152 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="412cbc56-4865-45fd-b201-41d437ed03c7" containerName="mariadb-client" Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.637125 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.643185 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.658117 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkl48\" (UniqueName: \"kubernetes.io/projected/412cbc56-4865-45fd-b201-41d437ed03c7-kube-api-access-jkl48\") on node \"crc\" DevicePath \"\"" Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.759955 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzk6s\" (UniqueName: \"kubernetes.io/projected/d2136dc3-a0bf-41d4-97ad-cf61592f68d7-kube-api-access-rzk6s\") pod \"mariadb-client\" (UID: \"d2136dc3-a0bf-41d4-97ad-cf61592f68d7\") " pod="openstack/mariadb-client" Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.862351 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzk6s\" (UniqueName: \"kubernetes.io/projected/d2136dc3-a0bf-41d4-97ad-cf61592f68d7-kube-api-access-rzk6s\") pod \"mariadb-client\" (UID: \"d2136dc3-a0bf-41d4-97ad-cf61592f68d7\") " pod="openstack/mariadb-client" Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.882433 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzk6s\" (UniqueName: \"kubernetes.io/projected/d2136dc3-a0bf-41d4-97ad-cf61592f68d7-kube-api-access-rzk6s\") pod \"mariadb-client\" (UID: \"d2136dc3-a0bf-41d4-97ad-cf61592f68d7\") " pod="openstack/mariadb-client" Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.899079 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="412cbc56-4865-45fd-b201-41d437ed03c7" path="/var/lib/kubelet/pods/412cbc56-4865-45fd-b201-41d437ed03c7/volumes" Oct 02 12:20:03 crc kubenswrapper[4766]: I1002 12:20:03.963107 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 02 12:20:04 crc kubenswrapper[4766]: I1002 12:20:04.161068 4766 scope.go:117] "RemoveContainer" containerID="50fc7bcc0a9fa87abf66a05b3c98f12a9b8f4103a70963f52033b126432f7377" Oct 02 12:20:04 crc kubenswrapper[4766]: I1002 12:20:04.161211 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 02 12:20:04 crc kubenswrapper[4766]: I1002 12:20:04.436763 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Oct 02 12:20:05 crc kubenswrapper[4766]: I1002 12:20:05.173384 4766 generic.go:334] "Generic (PLEG): container finished" podID="d2136dc3-a0bf-41d4-97ad-cf61592f68d7" containerID="da2ff533e579ba8db4cac2b657386a4458b63aef13724aa11b3b45b1e8648c96" exitCode=0 Oct 02 12:20:05 crc kubenswrapper[4766]: I1002 12:20:05.173590 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"d2136dc3-a0bf-41d4-97ad-cf61592f68d7","Type":"ContainerDied","Data":"da2ff533e579ba8db4cac2b657386a4458b63aef13724aa11b3b45b1e8648c96"} Oct 02 12:20:05 crc kubenswrapper[4766]: I1002 12:20:05.173646 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"d2136dc3-a0bf-41d4-97ad-cf61592f68d7","Type":"ContainerStarted","Data":"a2be7676827d037d98a09a0ee1334f12feff4af542c4d22d4a4cf906cddcaf0a"} Oct 02 12:20:06 crc kubenswrapper[4766]: I1002 12:20:06.496847 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Oct 02 12:20:06 crc kubenswrapper[4766]: I1002 12:20:06.517410 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_d2136dc3-a0bf-41d4-97ad-cf61592f68d7/mariadb-client/0.log" Oct 02 12:20:06 crc kubenswrapper[4766]: I1002 12:20:06.546867 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Oct 02 12:20:06 crc kubenswrapper[4766]: I1002 12:20:06.553617 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Oct 02 12:20:06 crc kubenswrapper[4766]: I1002 12:20:06.628717 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzk6s\" (UniqueName: \"kubernetes.io/projected/d2136dc3-a0bf-41d4-97ad-cf61592f68d7-kube-api-access-rzk6s\") pod \"d2136dc3-a0bf-41d4-97ad-cf61592f68d7\" (UID: \"d2136dc3-a0bf-41d4-97ad-cf61592f68d7\") " Oct 02 12:20:06 crc kubenswrapper[4766]: I1002 12:20:06.636609 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2136dc3-a0bf-41d4-97ad-cf61592f68d7-kube-api-access-rzk6s" (OuterVolumeSpecName: "kube-api-access-rzk6s") pod "d2136dc3-a0bf-41d4-97ad-cf61592f68d7" (UID: "d2136dc3-a0bf-41d4-97ad-cf61592f68d7"). InnerVolumeSpecName "kube-api-access-rzk6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:20:06 crc kubenswrapper[4766]: I1002 12:20:06.730777 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzk6s\" (UniqueName: \"kubernetes.io/projected/d2136dc3-a0bf-41d4-97ad-cf61592f68d7-kube-api-access-rzk6s\") on node \"crc\" DevicePath \"\"" Oct 02 12:20:07 crc kubenswrapper[4766]: I1002 12:20:07.195415 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2be7676827d037d98a09a0ee1334f12feff4af542c4d22d4a4cf906cddcaf0a" Oct 02 12:20:07 crc kubenswrapper[4766]: I1002 12:20:07.195495 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Oct 02 12:20:07 crc kubenswrapper[4766]: I1002 12:20:07.896585 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2136dc3-a0bf-41d4-97ad-cf61592f68d7" path="/var/lib/kubelet/pods/d2136dc3-a0bf-41d4-97ad-cf61592f68d7/volumes" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.503892 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 12:20:47 crc kubenswrapper[4766]: E1002 12:20:47.505069 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2136dc3-a0bf-41d4-97ad-cf61592f68d7" containerName="mariadb-client" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.505088 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2136dc3-a0bf-41d4-97ad-cf61592f68d7" containerName="mariadb-client" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.505302 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2136dc3-a0bf-41d4-97ad-cf61592f68d7" containerName="mariadb-client" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.506671 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.510005 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.510308 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.516340 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-9z9hr" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.525606 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.527225 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.538612 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.565100 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.566653 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.574450 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.583398 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.658725 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dc71b5-2203-47e7-9006-85d5c360d2a7-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.658789 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5dc71b5-2203-47e7-9006-85d5c360d2a7-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.658828 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bcde44b5-40c8-4212-81f5-a140a33d29d4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bcde44b5-40c8-4212-81f5-a140a33d29d4\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.658851 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5dc71b5-2203-47e7-9006-85d5c360d2a7-config\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.658950 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/24e149ea-94ee-4a26-9e9a-900be46fb609-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: 
\"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.658966 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e149ea-94ee-4a26-9e9a-900be46fb609-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.658995 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppbkb\" (UniqueName: \"kubernetes.io/projected/d5dc71b5-2203-47e7-9006-85d5c360d2a7-kube-api-access-ppbkb\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.659030 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/653c6c64-ca0a-46f2-8548-2e9b94dd9f34-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.659063 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e149ea-94ee-4a26-9e9a-900be46fb609-config\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.659096 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-fbfc796b-cf91-4025-bd64-b350e48606ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fbfc796b-cf91-4025-bd64-b350e48606ea\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.659114 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653c6c64-ca0a-46f2-8548-2e9b94dd9f34-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.659752 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5dc71b5-2203-47e7-9006-85d5c360d2a7-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.659786 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/653c6c64-ca0a-46f2-8548-2e9b94dd9f34-config\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.659826 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/653c6c64-ca0a-46f2-8548-2e9b94dd9f34-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 
12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.659849 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d25ef8ac-e685-4752-874b-be66d9b6100e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d25ef8ac-e685-4752-874b-be66d9b6100e\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.659891 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqrkm\" (UniqueName: \"kubernetes.io/projected/653c6c64-ca0a-46f2-8548-2e9b94dd9f34-kube-api-access-rqrkm\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.659911 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v72j6\" (UniqueName: \"kubernetes.io/projected/24e149ea-94ee-4a26-9e9a-900be46fb609-kube-api-access-v72j6\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.659936 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24e149ea-94ee-4a26-9e9a-900be46fb609-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.707812 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.709768 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.714767 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-lt8tf" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.714895 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.715002 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.729095 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.737202 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.739205 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.748359 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.750444 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.757994 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761490 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dc71b5-2203-47e7-9006-85d5c360d2a7-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761552 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5dc71b5-2203-47e7-9006-85d5c360d2a7-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761587 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bcde44b5-40c8-4212-81f5-a140a33d29d4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bcde44b5-40c8-4212-81f5-a140a33d29d4\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761614 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5dc71b5-2203-47e7-9006-85d5c360d2a7-config\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761640 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/24e149ea-94ee-4a26-9e9a-900be46fb609-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761661 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e149ea-94ee-4a26-9e9a-900be46fb609-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761689 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppbkb\" (UniqueName: \"kubernetes.io/projected/d5dc71b5-2203-47e7-9006-85d5c360d2a7-kube-api-access-ppbkb\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761722 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/653c6c64-ca0a-46f2-8548-2e9b94dd9f34-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761752 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e149ea-94ee-4a26-9e9a-900be46fb609-config\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 
12:20:47.761785 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fbfc796b-cf91-4025-bd64-b350e48606ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fbfc796b-cf91-4025-bd64-b350e48606ea\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761812 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653c6c64-ca0a-46f2-8548-2e9b94dd9f34-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761835 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5dc71b5-2203-47e7-9006-85d5c360d2a7-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761859 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/653c6c64-ca0a-46f2-8548-2e9b94dd9f34-config\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761889 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/653c6c64-ca0a-46f2-8548-2e9b94dd9f34-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761908 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d25ef8ac-e685-4752-874b-be66d9b6100e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d25ef8ac-e685-4752-874b-be66d9b6100e\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761935 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqrkm\" (UniqueName: \"kubernetes.io/projected/653c6c64-ca0a-46f2-8548-2e9b94dd9f34-kube-api-access-rqrkm\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761953 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v72j6\" (UniqueName: \"kubernetes.io/projected/24e149ea-94ee-4a26-9e9a-900be46fb609-kube-api-access-v72j6\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.761974 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24e149ea-94ee-4a26-9e9a-900be46fb609-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.763194 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24e149ea-94ee-4a26-9e9a-900be46fb609-scripts\") pod 
\"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.765069 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24e149ea-94ee-4a26-9e9a-900be46fb609-config\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.766068 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5dc71b5-2203-47e7-9006-85d5c360d2a7-config\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.766204 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d5dc71b5-2203-47e7-9006-85d5c360d2a7-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.766751 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/24e149ea-94ee-4a26-9e9a-900be46fb609-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.767030 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.768988 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/653c6c64-ca0a-46f2-8548-2e9b94dd9f34-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.769985 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d5dc71b5-2203-47e7-9006-85d5c360d2a7-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.770390 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/653c6c64-ca0a-46f2-8548-2e9b94dd9f34-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.772708 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5dc71b5-2203-47e7-9006-85d5c360d2a7-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.775003 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.775040 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bcde44b5-40c8-4212-81f5-a140a33d29d4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bcde44b5-40c8-4212-81f5-a140a33d29d4\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/52fc28eb757e6ae5f4dd0f3adc47579bb18d7325858b0648afcc4540870e814c/globalmount\"" pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.775772 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.775803 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fbfc796b-cf91-4025-bd64-b350e48606ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fbfc796b-cf91-4025-bd64-b350e48606ea\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8ed612e3ffa34483f20787e66885c551026282f3f35cff8a01980ca4f1f12f11/globalmount\"" pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.777351 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.777398 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d25ef8ac-e685-4752-874b-be66d9b6100e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d25ef8ac-e685-4752-874b-be66d9b6100e\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/993924ab0f12bcad562d90b57c736e11007fd8869fcbcc1b6f56aaf2e813844b/globalmount\"" pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.786783 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24e149ea-94ee-4a26-9e9a-900be46fb609-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.787856 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/653c6c64-ca0a-46f2-8548-2e9b94dd9f34-config\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.788018 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/653c6c64-ca0a-46f2-8548-2e9b94dd9f34-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.795696 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqrkm\" (UniqueName: \"kubernetes.io/projected/653c6c64-ca0a-46f2-8548-2e9b94dd9f34-kube-api-access-rqrkm\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " 
pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.798848 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v72j6\" (UniqueName: \"kubernetes.io/projected/24e149ea-94ee-4a26-9e9a-900be46fb609-kube-api-access-v72j6\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.800525 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppbkb\" (UniqueName: \"kubernetes.io/projected/d5dc71b5-2203-47e7-9006-85d5c360d2a7-kube-api-access-ppbkb\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.818814 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fbfc796b-cf91-4025-bd64-b350e48606ea\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fbfc796b-cf91-4025-bd64-b350e48606ea\") pod \"ovsdbserver-nb-1\" (UID: \"d5dc71b5-2203-47e7-9006-85d5c360d2a7\") " pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.826762 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d25ef8ac-e685-4752-874b-be66d9b6100e\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d25ef8ac-e685-4752-874b-be66d9b6100e\") pod \"ovsdbserver-nb-0\" (UID: \"653c6c64-ca0a-46f2-8548-2e9b94dd9f34\") " pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.835173 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bcde44b5-40c8-4212-81f5-a140a33d29d4\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bcde44b5-40c8-4212-81f5-a140a33d29d4\") pod \"ovsdbserver-nb-2\" (UID: \"24e149ea-94ee-4a26-9e9a-900be46fb609\") " pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.836407 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.854745 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.863723 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6153a2cd-5c95-43ec-8238-f2a2e63598cb-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.864203 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-421d887f-ec7e-422b-9a2d-b0a3171344cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-421d887f-ec7e-422b-9a2d-b0a3171344cf\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.864317 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6153a2cd-5c95-43ec-8238-f2a2e63598cb-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.864398 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5be0a42a-c47f-4b40-ae00-72f013eaf3cb-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.864454 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-993f34c1-0a29-4c96-be48-1b9ffe53eae2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-993f34c1-0a29-4c96-be48-1b9ffe53eae2\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.864586 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.864612 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.864690 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6153a2cd-5c95-43ec-8238-f2a2e63598cb-config\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.864756 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5be0a42a-c47f-4b40-ae00-72f013eaf3cb-config\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc 
kubenswrapper[4766]: I1002 12:20:47.864801 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmhk6\" (UniqueName: \"kubernetes.io/projected/5be0a42a-c47f-4b40-ae00-72f013eaf3cb-kube-api-access-bmhk6\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.864860 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-61f596d3-eb99-414f-8069-e62fc4c95ca7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61f596d3-eb99-414f-8069-e62fc4c95ca7\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.865125 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5be0a42a-c47f-4b40-ae00-72f013eaf3cb-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.865302 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg75m\" (UniqueName: \"kubernetes.io/projected/02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a-kube-api-access-pg75m\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.865368 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9f2r\" (UniqueName: \"kubernetes.io/projected/6153a2cd-5c95-43ec-8238-f2a2e63598cb-kube-api-access-p9f2r\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.865458 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be0a42a-c47f-4b40-ae00-72f013eaf3cb-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.865560 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6153a2cd-5c95-43ec-8238-f2a2e63598cb-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.865718 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a-config\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.865886 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 
12:20:47.888975 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.967715 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5be0a42a-c47f-4b40-ae00-72f013eaf3cb-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.968337 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg75m\" (UniqueName: \"kubernetes.io/projected/02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a-kube-api-access-pg75m\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.968382 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9f2r\" (UniqueName: \"kubernetes.io/projected/6153a2cd-5c95-43ec-8238-f2a2e63598cb-kube-api-access-p9f2r\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.968426 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be0a42a-c47f-4b40-ae00-72f013eaf3cb-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.968455 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6153a2cd-5c95-43ec-8238-f2a2e63598cb-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.968560 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a-config\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.968616 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.968654 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6153a2cd-5c95-43ec-8238-f2a2e63598cb-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.968750 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-421d887f-ec7e-422b-9a2d-b0a3171344cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-421d887f-ec7e-422b-9a2d-b0a3171344cf\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.968779 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6153a2cd-5c95-43ec-8238-f2a2e63598cb-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.968820 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5be0a42a-c47f-4b40-ae00-72f013eaf3cb-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.968844 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-993f34c1-0a29-4c96-be48-1b9ffe53eae2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-993f34c1-0a29-4c96-be48-1b9ffe53eae2\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.968872 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.968893 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.968924 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6153a2cd-5c95-43ec-8238-f2a2e63598cb-config\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.968951 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5be0a42a-c47f-4b40-ae00-72f013eaf3cb-config\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.968977 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmhk6\" (UniqueName: \"kubernetes.io/projected/5be0a42a-c47f-4b40-ae00-72f013eaf3cb-kube-api-access-bmhk6\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.969016 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-61f596d3-eb99-414f-8069-e62fc4c95ca7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61f596d3-eb99-414f-8069-e62fc4c95ca7\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.971200 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6153a2cd-5c95-43ec-8238-f2a2e63598cb-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " 
pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.971286 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5be0a42a-c47f-4b40-ae00-72f013eaf3cb-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.971973 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6153a2cd-5c95-43ec-8238-f2a2e63598cb-config\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.972263 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a-config\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.972418 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5be0a42a-c47f-4b40-ae00-72f013eaf3cb-config\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.972460 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5be0a42a-c47f-4b40-ae00-72f013eaf3cb-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.976326 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.977151 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.978422 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6153a2cd-5c95-43ec-8238-f2a2e63598cb-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.987796 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6153a2cd-5c95-43ec-8238-f2a2e63598cb-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.987986 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5be0a42a-c47f-4b40-ae00-72f013eaf3cb-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc 
kubenswrapper[4766]: I1002 12:20:47.988538 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.992018 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg75m\" (UniqueName: \"kubernetes.io/projected/02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a-kube-api-access-pg75m\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.992101 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9f2r\" (UniqueName: \"kubernetes.io/projected/6153a2cd-5c95-43ec-8238-f2a2e63598cb-kube-api-access-p9f2r\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.992308 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.992338 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.992339 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-993f34c1-0a29-4c96-be48-1b9ffe53eae2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-993f34c1-0a29-4c96-be48-1b9ffe53eae2\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c68cb369aa969a3160bf39f6d84e73771b6571eb4c14d22eb1bc4998e5c949b8/globalmount\"" pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.992360 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-421d887f-ec7e-422b-9a2d-b0a3171344cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-421d887f-ec7e-422b-9a2d-b0a3171344cf\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/616dc3425c9f625296b6e6b325e21af8f3bb03eaca543b3183870bf2aaca39f1/globalmount\"" pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.997280 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
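[Annotation] The repeated `csi_attacher.go:380` lines above record kubelet checking the CSI node driver's advertised capabilities: kubevirt.io.hostpath-provisioner does not advertise STAGE_UNSTAGE_VOLUME, so kubelet skips the NodeStageVolume RPC and `MountVolume.MountDevice` reduces to recording the computed globalmount path. A minimal Go sketch of that gate — not kubelet's actual code; the type and function names here are illustrative only:

```go
package main

import "fmt"

// nodeCapabilities stands in for the capability set a CSI node plugin
// reports via NodeGetCapabilities.
type nodeCapabilities map[string]bool

// stageIfSupported mirrors the decision logged at csi_attacher.go:380:
// without STAGE_UNSTAGE_VOLUME, staging is a no-op and MountDevice
// "succeeds" with just the global mount path.
func stageIfSupported(caps nodeCapabilities, volumeID, globalMountPath string) error {
	if !caps["STAGE_UNSTAGE_VOLUME"] {
		fmt.Println("STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...")
		return nil // nothing to stage; NodePublishVolume bind-mounts later
	}
	// A staging-capable driver would receive a NodeStageVolume RPC here.
	fmt.Printf("staging %s at %s\n", volumeID, globalMountPath)
	return nil
}

func main() {
	hostpathCaps := nodeCapabilities{} // hostpath provisioner: no staging support
	_ = stageIfSupported(hostpathCaps,
		"pvc-61f596d3-eb99-414f-8069-e62fc4c95ca7",
		"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/.../globalmount")
}
```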
Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.997402 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-61f596d3-eb99-414f-8069-e62fc4c95ca7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61f596d3-eb99-414f-8069-e62fc4c95ca7\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/88f838f9477f5faf9263a32d726f01e83579375a63a47c8896c000cebcbf20a9/globalmount\"" pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:47 crc kubenswrapper[4766]: I1002 12:20:47.995248 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmhk6\" (UniqueName: \"kubernetes.io/projected/5be0a42a-c47f-4b40-ae00-72f013eaf3cb-kube-api-access-bmhk6\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:48 crc kubenswrapper[4766]: I1002 12:20:48.057992 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-61f596d3-eb99-414f-8069-e62fc4c95ca7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-61f596d3-eb99-414f-8069-e62fc4c95ca7\") pod \"ovsdbserver-sb-0\" (UID: \"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a\") " pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:48 crc kubenswrapper[4766]: I1002 12:20:48.069724 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-421d887f-ec7e-422b-9a2d-b0a3171344cf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-421d887f-ec7e-422b-9a2d-b0a3171344cf\") pod \"ovsdbserver-sb-1\" (UID: \"6153a2cd-5c95-43ec-8238-f2a2e63598cb\") " pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:48 crc kubenswrapper[4766]: I1002 12:20:48.071927 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-993f34c1-0a29-4c96-be48-1b9ffe53eae2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-993f34c1-0a29-4c96-be48-1b9ffe53eae2\") pod \"ovsdbserver-sb-2\" (UID: \"5be0a42a-c47f-4b40-ae00-72f013eaf3cb\") " pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:48 crc kubenswrapper[4766]: I1002 12:20:48.269776 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:48 crc kubenswrapper[4766]: I1002 12:20:48.280821 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:48 crc kubenswrapper[4766]: I1002 12:20:48.365241 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:48 crc kubenswrapper[4766]: I1002 12:20:48.433960 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Oct 02 12:20:48 crc kubenswrapper[4766]: I1002 12:20:48.542005 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Oct 02 12:20:48 crc kubenswrapper[4766]: I1002 12:20:48.558238 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"d5dc71b5-2203-47e7-9006-85d5c360d2a7","Type":"ContainerStarted","Data":"ee86576af521a3af92bb7cc5b561d43d49915e9c24efdcaf29272821214e60c1"} Oct 02 12:20:48 crc kubenswrapper[4766]: I1002 12:20:48.562894 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"24e149ea-94ee-4a26-9e9a-900be46fb609","Type":"ContainerStarted","Data":"0b0bb7145668f101c590afd3e5f09d8027ccac30bea3cffa329f06c8d8206412"} Oct 02 12:20:48 crc kubenswrapper[4766]: I1002 12:20:48.649864 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Oct 02 12:20:48 crc kubenswrapper[4766]: I1002 12:20:48.987115 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 02 12:20:49 crc kubenswrapper[4766]: W1002 12:20:49.015593 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02d613a1_8be4_43a0_a8b0_fa8c25fa9d4a.slice/crio-8ddc107d9c38c7f19b183c372351febfd3017855812624cf58be629b5af8dcc7 WatchSource:0}: Error finding container 8ddc107d9c38c7f19b183c372351febfd3017855812624cf58be629b5af8dcc7: Status 404 returned error can't find the container with id 8ddc107d9c38c7f19b183c372351febfd3017855812624cf58be629b5af8dcc7 Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.156312 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 02 12:20:49 crc kubenswrapper[4766]: W1002 12:20:49.163400 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod653c6c64_ca0a_46f2_8548_2e9b94dd9f34.slice/crio-16607a0c17a01cb2d51fcaf514094ed457e9fe48d2b8e7ffc5332c95cab9024c WatchSource:0}: Error finding container 16607a0c17a01cb2d51fcaf514094ed457e9fe48d2b8e7ffc5332c95cab9024c: Status 404 returned error can't find the container with id 16607a0c17a01cb2d51fcaf514094ed457e9fe48d2b8e7ffc5332c95cab9024c Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.572791 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a","Type":"ContainerStarted","Data":"fa2a791f086f2bd6d7d171a8b872001a0da527354131c6840e01af23bb6614d4"} Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.573214 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a","Type":"ContainerStarted","Data":"5eaf628376235f749e108313a8eced77066c105801b2bb19bd99d6d07b906ab1"} Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.573228 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a","Type":"ContainerStarted","Data":"8ddc107d9c38c7f19b183c372351febfd3017855812624cf58be629b5af8dcc7"} Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.574600 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-nb-0" event={"ID":"653c6c64-ca0a-46f2-8548-2e9b94dd9f34","Type":"ContainerStarted","Data":"50440576f19bfcfd6760fdf7548868335ecf3546155169cd4ef2050b80094a43"} Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.574646 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"653c6c64-ca0a-46f2-8548-2e9b94dd9f34","Type":"ContainerStarted","Data":"c0ac6714879c305fedb7bcbc72a29af8aa6160a76d00fae13eea611d0b66a717"} Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.574655 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"653c6c64-ca0a-46f2-8548-2e9b94dd9f34","Type":"ContainerStarted","Data":"16607a0c17a01cb2d51fcaf514094ed457e9fe48d2b8e7ffc5332c95cab9024c"} Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.576684 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"d5dc71b5-2203-47e7-9006-85d5c360d2a7","Type":"ContainerStarted","Data":"add07f1fe76a83c79da990608753185f0098ceb9cf797791f3fbd13dfcab657d"} Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.576738 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"d5dc71b5-2203-47e7-9006-85d5c360d2a7","Type":"ContainerStarted","Data":"16aca6b393fc019a82526cfc5e676613941c22dc171345e2ed051c51c5863048"} Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.578650 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"24e149ea-94ee-4a26-9e9a-900be46fb609","Type":"ContainerStarted","Data":"e6f919b7559edecd117165dcf4ca748595837304dbc6d726b53d05ce821bb927"} Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.578693 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"24e149ea-94ee-4a26-9e9a-900be46fb609","Type":"ContainerStarted","Data":"ca27a1b1961e29f0256684222b587bb11c53ccbe54e047b9f021d2d740e11e64"} Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.581197 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"5be0a42a-c47f-4b40-ae00-72f013eaf3cb","Type":"ContainerStarted","Data":"d3b51743d13a9f2551370dcb3eb889d5f29e8cb9ce3026c883145ec35210b3c1"} Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.581232 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"5be0a42a-c47f-4b40-ae00-72f013eaf3cb","Type":"ContainerStarted","Data":"654759191ab9738231f97dea3c15fd2940a2b76a2a2940114af84cfae7c35ea0"} Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.581248 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"5be0a42a-c47f-4b40-ae00-72f013eaf3cb","Type":"ContainerStarted","Data":"878b6295d3125de886c0f71b6bfa2e77490edeaf8a0bdab00d7e019b381a901f"} Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.597708 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.597685907 podStartE2EDuration="3.597685907s" podCreationTimestamp="2025-10-02 12:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:20:49.59312376 +0000 UTC m=+5364.535994714" watchObservedRunningTime="2025-10-02 12:20:49.597685907 +0000 UTC m=+5364.540556851" Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.641604 4766 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.641577612 podStartE2EDuration="3.641577612s" podCreationTimestamp="2025-10-02 12:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:20:49.634850796 +0000 UTC m=+5364.577721750" watchObservedRunningTime="2025-10-02 12:20:49.641577612 +0000 UTC m=+5364.584448546" Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.643619 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.643610537 podStartE2EDuration="3.643610537s" podCreationTimestamp="2025-10-02 12:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:20:49.619151754 +0000 UTC m=+5364.562022698" watchObservedRunningTime="2025-10-02 12:20:49.643610537 +0000 UTC m=+5364.586481481" Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.660700 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.660683034 podStartE2EDuration="3.660683034s" podCreationTimestamp="2025-10-02 12:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:20:49.655925751 +0000 UTC m=+5364.598796705" watchObservedRunningTime="2025-10-02 12:20:49.660683034 +0000 UTC m=+5364.603553978" Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.685740 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.685723045 podStartE2EDuration="3.685723045s" podCreationTimestamp="2025-10-02 12:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:20:49.68304828 +0000 UTC m=+5364.625919224" watchObservedRunningTime="2025-10-02 12:20:49.685723045 +0000 UTC m=+5364.628593989" Oct 02 12:20:49 crc kubenswrapper[4766]: I1002 12:20:49.950592 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Oct 02 12:20:50 crc kubenswrapper[4766]: I1002 12:20:50.593797 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"6153a2cd-5c95-43ec-8238-f2a2e63598cb","Type":"ContainerStarted","Data":"dfdf0dc78a1c40579c8b5e753b71a505f8e2fd05fec22bcf784e5e7dac494e16"} Oct 02 12:20:50 crc kubenswrapper[4766]: I1002 12:20:50.594383 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"6153a2cd-5c95-43ec-8238-f2a2e63598cb","Type":"ContainerStarted","Data":"a363d87a8daf088d66abb7126c2bd9325d168de494d963cd1458fd2935ef953a"} Oct 02 12:20:50 crc kubenswrapper[4766]: I1002 12:20:50.594399 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"6153a2cd-5c95-43ec-8238-f2a2e63598cb","Type":"ContainerStarted","Data":"d6eea5be7fae1220d86d8de7de1da4fee164d04f3f63fda070a8d266a86aaecb"} Oct 02 12:20:50 crc kubenswrapper[4766]: I1002 12:20:50.618358 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=4.618337515 podStartE2EDuration="4.618337515s" podCreationTimestamp="2025-10-02 12:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:20:50.613843431 +0000 UTC m=+5365.556714405" watchObservedRunningTime="2025-10-02 12:20:50.618337515 +0000 UTC m=+5365.561208459" Oct 02 12:20:50 crc kubenswrapper[4766]: I1002 12:20:50.837998 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:50 crc kubenswrapper[4766]: I1002 12:20:50.856213 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:50 crc kubenswrapper[4766]: I1002 12:20:50.889708 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:51 crc kubenswrapper[4766]: I1002 12:20:51.270060 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:51 crc kubenswrapper[4766]: I1002 12:20:51.282824 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:51 crc kubenswrapper[4766]: I1002 12:20:51.312257 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:51 crc kubenswrapper[4766]: I1002 12:20:51.365691 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:51 crc kubenswrapper[4766]: I1002 12:20:51.604142 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:52 crc kubenswrapper[4766]: I1002 12:20:52.836673 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:52 crc kubenswrapper[4766]: I1002 12:20:52.855835 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:52 crc kubenswrapper[4766]: I1002 12:20:52.889377 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.281680 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.315619 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.366147 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.594794 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g"] Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.596310 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.607016 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g"] Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.608300 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.682036 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-dns-svc\") pod \"dnsmasq-dns-6c9c4b4dfc-jbq5g\" (UID: \"de64eb24-cc59-4239-97cb-f4c08584b96f\") " pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.682421 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c4b4dfc-jbq5g\" (UID: \"de64eb24-cc59-4239-97cb-f4c08584b96f\") " pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.682559 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bmrm\" (UniqueName: \"kubernetes.io/projected/de64eb24-cc59-4239-97cb-f4c08584b96f-kube-api-access-2bmrm\") pod \"dnsmasq-dns-6c9c4b4dfc-jbq5g\" (UID: \"de64eb24-cc59-4239-97cb-f4c08584b96f\") " pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.682677 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-config\") pod \"dnsmasq-dns-6c9c4b4dfc-jbq5g\" (UID: \"de64eb24-cc59-4239-97cb-f4c08584b96f\") " pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.785221 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bmrm\" (UniqueName: \"kubernetes.io/projected/de64eb24-cc59-4239-97cb-f4c08584b96f-kube-api-access-2bmrm\") pod \"dnsmasq-dns-6c9c4b4dfc-jbq5g\" (UID: \"de64eb24-cc59-4239-97cb-f4c08584b96f\") " pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.785285 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c4b4dfc-jbq5g\" (UID: \"de64eb24-cc59-4239-97cb-f4c08584b96f\") " pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.785318 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-config\") pod \"dnsmasq-dns-6c9c4b4dfc-jbq5g\" (UID: \"de64eb24-cc59-4239-97cb-f4c08584b96f\") " pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.785568 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-dns-svc\") pod \"dnsmasq-dns-6c9c4b4dfc-jbq5g\" (UID: \"de64eb24-cc59-4239-97cb-f4c08584b96f\") " pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" 
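[Annotation] Each pod's volumes walk the same three-step trail in these records: `VerifyControllerAttachedVolume started` (reconciler_common.go:245), then `MountVolume started` (reconciler_common.go:218), then `MountVolume.SetUp succeeded` (operation_generator.go:637), with every record carrying a trailing `pod="namespace/name"` field. The interleaving across ovsdbserver-nb-*, ovsdbserver-sb-*, and the dnsmasq pods is easier to follow when grouped per pod; a small, hypothetical Go filter over journal lines on stdin (assumed input format, standard library only):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches the volume-lifecycle phrases seen in this log and the pod= field.
var (
	phaseRE = regexp.MustCompile(`(VerifyControllerAttachedVolume started|MountVolume started|MountVolume\.MountDevice succeeded|MountVolume\.SetUp succeeded)`)
	podRE   = regexp.MustCompile(`pod="([^"]+)"`)
)

func main() {
	counts := map[string]map[string]int{} // pod -> phase -> count
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		line := sc.Text()
		phase := phaseRE.FindString(line)
		m := podRE.FindStringSubmatch(line)
		if phase == "" || m == nil {
			continue
		}
		if counts[m[1]] == nil {
			counts[m[1]] = map[string]int{}
		}
		counts[m[1]][phase]++
	}
	for podName, phases := range counts {
		fmt.Println(podName, phases)
	}
}
```

Run against this window it would show, for example, openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g reaching four `MountVolume.SetUp succeeded` records (config, ovsdbserver-sb, dns-svc, kube-api-access-2bmrm) before its sandbox is started.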
Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.787093 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-config\") pod \"dnsmasq-dns-6c9c4b4dfc-jbq5g\" (UID: \"de64eb24-cc59-4239-97cb-f4c08584b96f\") " pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.787190 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-ovsdbserver-sb\") pod \"dnsmasq-dns-6c9c4b4dfc-jbq5g\" (UID: \"de64eb24-cc59-4239-97cb-f4c08584b96f\") " pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.787326 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-dns-svc\") pod \"dnsmasq-dns-6c9c4b4dfc-jbq5g\" (UID: \"de64eb24-cc59-4239-97cb-f4c08584b96f\") " pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.807602 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bmrm\" (UniqueName: \"kubernetes.io/projected/de64eb24-cc59-4239-97cb-f4c08584b96f-kube-api-access-2bmrm\") pod \"dnsmasq-dns-6c9c4b4dfc-jbq5g\" (UID: \"de64eb24-cc59-4239-97cb-f4c08584b96f\") " pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.905969 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.923357 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.923685 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.941195 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.986398 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Oct 02 12:20:53 crc kubenswrapper[4766]: I1002 12:20:53.992534 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.238935 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g"] Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.271703 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98cbd5ff7-4qtjg"] Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.273626 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.278194 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.299363 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98cbd5ff7-4qtjg"] Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.368746 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.399203 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-ovsdbserver-sb\") pod \"dnsmasq-dns-98cbd5ff7-4qtjg\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.399264 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prnn4\" (UniqueName: \"kubernetes.io/projected/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-kube-api-access-prnn4\") pod \"dnsmasq-dns-98cbd5ff7-4qtjg\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.399319 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-ovsdbserver-nb\") pod \"dnsmasq-dns-98cbd5ff7-4qtjg\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.399559 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-config\") pod \"dnsmasq-dns-98cbd5ff7-4qtjg\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.399901 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-dns-svc\") pod \"dnsmasq-dns-98cbd5ff7-4qtjg\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.419718 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.478278 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.502349 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g"] Oct 02 12:20:54 crc kubenswrapper[4766]: W1002 12:20:54.508937 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde64eb24_cc59_4239_97cb_f4c08584b96f.slice/crio-d202903462d7ed67625b8cf971553fd1369424e387c60922338404fa10047a66 WatchSource:0}: Error finding container d202903462d7ed67625b8cf971553fd1369424e387c60922338404fa10047a66: Status 404 returned error can't find the container with 
id d202903462d7ed67625b8cf971553fd1369424e387c60922338404fa10047a66 Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.512908 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-ovsdbserver-sb\") pod \"dnsmasq-dns-98cbd5ff7-4qtjg\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.512966 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prnn4\" (UniqueName: \"kubernetes.io/projected/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-kube-api-access-prnn4\") pod \"dnsmasq-dns-98cbd5ff7-4qtjg\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.513082 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-ovsdbserver-nb\") pod \"dnsmasq-dns-98cbd5ff7-4qtjg\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.513153 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-config\") pod \"dnsmasq-dns-98cbd5ff7-4qtjg\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.513251 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-dns-svc\") pod \"dnsmasq-dns-98cbd5ff7-4qtjg\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.514265 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-ovsdbserver-nb\") pod \"dnsmasq-dns-98cbd5ff7-4qtjg\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.514926 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-ovsdbserver-sb\") pod \"dnsmasq-dns-98cbd5ff7-4qtjg\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.515627 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-dns-svc\") pod \"dnsmasq-dns-98cbd5ff7-4qtjg\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.516074 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-config\") pod \"dnsmasq-dns-98cbd5ff7-4qtjg\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.537410 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prnn4\" (UniqueName: \"kubernetes.io/projected/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-kube-api-access-prnn4\") pod \"dnsmasq-dns-98cbd5ff7-4qtjg\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.620579 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:54 crc kubenswrapper[4766]: I1002 12:20:54.641750 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" event={"ID":"de64eb24-cc59-4239-97cb-f4c08584b96f","Type":"ContainerStarted","Data":"d202903462d7ed67625b8cf971553fd1369424e387c60922338404fa10047a66"} Oct 02 12:20:55 crc kubenswrapper[4766]: I1002 12:20:55.203377 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98cbd5ff7-4qtjg"] Oct 02 12:20:55 crc kubenswrapper[4766]: I1002 12:20:55.651141 4766 generic.go:334] "Generic (PLEG): container finished" podID="de64eb24-cc59-4239-97cb-f4c08584b96f" containerID="81446c0efc98cecffaa58a5705d5b941f84c78601b100382d33b796f18d0215f" exitCode=0 Oct 02 12:20:55 crc kubenswrapper[4766]: I1002 12:20:55.651220 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" event={"ID":"de64eb24-cc59-4239-97cb-f4c08584b96f","Type":"ContainerDied","Data":"81446c0efc98cecffaa58a5705d5b941f84c78601b100382d33b796f18d0215f"} Oct 02 12:20:55 crc kubenswrapper[4766]: I1002 12:20:55.653517 4766 generic.go:334] "Generic (PLEG): container finished" podID="1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e" containerID="7f9d6aa64912a1f0490b908af1cb430114db9094fd6c749b47906b82ea636561" exitCode=0 Oct 02 12:20:55 crc kubenswrapper[4766]: I1002 12:20:55.653581 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" event={"ID":"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e","Type":"ContainerDied","Data":"7f9d6aa64912a1f0490b908af1cb430114db9094fd6c749b47906b82ea636561"} Oct 02 12:20:55 crc kubenswrapper[4766]: I1002 12:20:55.653606 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" event={"ID":"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e","Type":"ContainerStarted","Data":"4c73cfef327f763a386ee0bbfc0947c935349546e4ea63548a8f3f70e58bd14d"} Oct 02 12:20:55 crc kubenswrapper[4766]: I1002 12:20:55.975575 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.054837 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-config\") pod \"de64eb24-cc59-4239-97cb-f4c08584b96f\" (UID: \"de64eb24-cc59-4239-97cb-f4c08584b96f\") " Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.055000 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bmrm\" (UniqueName: \"kubernetes.io/projected/de64eb24-cc59-4239-97cb-f4c08584b96f-kube-api-access-2bmrm\") pod \"de64eb24-cc59-4239-97cb-f4c08584b96f\" (UID: \"de64eb24-cc59-4239-97cb-f4c08584b96f\") " Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.055134 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-ovsdbserver-sb\") pod \"de64eb24-cc59-4239-97cb-f4c08584b96f\" (UID: \"de64eb24-cc59-4239-97cb-f4c08584b96f\") " Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.055173 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-dns-svc\") pod \"de64eb24-cc59-4239-97cb-f4c08584b96f\" (UID: \"de64eb24-cc59-4239-97cb-f4c08584b96f\") " Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.060799 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de64eb24-cc59-4239-97cb-f4c08584b96f-kube-api-access-2bmrm" (OuterVolumeSpecName: "kube-api-access-2bmrm") pod "de64eb24-cc59-4239-97cb-f4c08584b96f" (UID: "de64eb24-cc59-4239-97cb-f4c08584b96f"). InnerVolumeSpecName "kube-api-access-2bmrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.074711 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-config" (OuterVolumeSpecName: "config") pod "de64eb24-cc59-4239-97cb-f4c08584b96f" (UID: "de64eb24-cc59-4239-97cb-f4c08584b96f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.079600 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "de64eb24-cc59-4239-97cb-f4c08584b96f" (UID: "de64eb24-cc59-4239-97cb-f4c08584b96f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.080392 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "de64eb24-cc59-4239-97cb-f4c08584b96f" (UID: "de64eb24-cc59-4239-97cb-f4c08584b96f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.157984 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.158090 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bmrm\" (UniqueName: \"kubernetes.io/projected/de64eb24-cc59-4239-97cb-f4c08584b96f-kube-api-access-2bmrm\") on node \"crc\" DevicePath \"\"" Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.158106 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.158122 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de64eb24-cc59-4239-97cb-f4c08584b96f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.669136 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" event={"ID":"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e","Type":"ContainerStarted","Data":"963829ec9ae72105303eb74de2dea6160242d4f5c6d2babc8c50942fa2834978"} Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.669892 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.674471 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" event={"ID":"de64eb24-cc59-4239-97cb-f4c08584b96f","Type":"ContainerDied","Data":"d202903462d7ed67625b8cf971553fd1369424e387c60922338404fa10047a66"} Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.674607 4766 scope.go:117] "RemoveContainer" containerID="81446c0efc98cecffaa58a5705d5b941f84c78601b100382d33b796f18d0215f" Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.674545 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g" Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.696227 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" podStartSLOduration=2.696197699 podStartE2EDuration="2.696197699s" podCreationTimestamp="2025-10-02 12:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:20:56.690794206 +0000 UTC m=+5371.633665170" watchObservedRunningTime="2025-10-02 12:20:56.696197699 +0000 UTC m=+5371.639068643" Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.739434 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g"] Oct 02 12:20:56 crc kubenswrapper[4766]: I1002 12:20:56.759719 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c9c4b4dfc-jbq5g"] Oct 02 12:20:57 crc kubenswrapper[4766]: I1002 12:20:57.896409 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de64eb24-cc59-4239-97cb-f4c08584b96f" path="/var/lib/kubelet/pods/de64eb24-cc59-4239-97cb-f4c08584b96f/volumes" Oct 02 12:20:57 crc kubenswrapper[4766]: I1002 12:20:57.900601 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 02 12:20:58 crc kubenswrapper[4766]: I1002 12:20:58.332288 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Oct 02 12:21:00 crc kubenswrapper[4766]: I1002 12:21:00.875789 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Oct 02 12:21:00 crc kubenswrapper[4766]: E1002 12:21:00.876598 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de64eb24-cc59-4239-97cb-f4c08584b96f" containerName="init" Oct 02 12:21:00 crc kubenswrapper[4766]: I1002 12:21:00.876616 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="de64eb24-cc59-4239-97cb-f4c08584b96f" containerName="init" Oct 02 12:21:00 crc kubenswrapper[4766]: I1002 12:21:00.876787 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="de64eb24-cc59-4239-97cb-f4c08584b96f" containerName="init" Oct 02 12:21:00 crc kubenswrapper[4766]: I1002 12:21:00.877620 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Oct 02 12:21:00 crc kubenswrapper[4766]: I1002 12:21:00.880819 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Oct 02 12:21:00 crc kubenswrapper[4766]: I1002 12:21:00.898201 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Oct 02 12:21:00 crc kubenswrapper[4766]: I1002 12:21:00.975310 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f6f16bd8-c8d7-4576-bb54-d3815955b151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6f16bd8-c8d7-4576-bb54-d3815955b151\") pod \"ovn-copy-data\" (UID: \"59964a1b-9dfe-49fc-b2e7-6d6f30959b26\") " pod="openstack/ovn-copy-data" Oct 02 12:21:00 crc kubenswrapper[4766]: I1002 12:21:00.975387 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/59964a1b-9dfe-49fc-b2e7-6d6f30959b26-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"59964a1b-9dfe-49fc-b2e7-6d6f30959b26\") " pod="openstack/ovn-copy-data" Oct 02 12:21:00 crc kubenswrapper[4766]: I1002 12:21:00.975432 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9w8g\" (UniqueName: \"kubernetes.io/projected/59964a1b-9dfe-49fc-b2e7-6d6f30959b26-kube-api-access-j9w8g\") pod \"ovn-copy-data\" (UID: \"59964a1b-9dfe-49fc-b2e7-6d6f30959b26\") " pod="openstack/ovn-copy-data" Oct 02 12:21:01 crc kubenswrapper[4766]: I1002 12:21:01.077232 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f6f16bd8-c8d7-4576-bb54-d3815955b151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6f16bd8-c8d7-4576-bb54-d3815955b151\") pod \"ovn-copy-data\" (UID: \"59964a1b-9dfe-49fc-b2e7-6d6f30959b26\") " pod="openstack/ovn-copy-data" Oct 02 12:21:01 crc kubenswrapper[4766]: I1002 12:21:01.077304 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/59964a1b-9dfe-49fc-b2e7-6d6f30959b26-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"59964a1b-9dfe-49fc-b2e7-6d6f30959b26\") " pod="openstack/ovn-copy-data" Oct 02 12:21:01 crc kubenswrapper[4766]: I1002 12:21:01.077335 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9w8g\" (UniqueName: \"kubernetes.io/projected/59964a1b-9dfe-49fc-b2e7-6d6f30959b26-kube-api-access-j9w8g\") pod \"ovn-copy-data\" (UID: \"59964a1b-9dfe-49fc-b2e7-6d6f30959b26\") " pod="openstack/ovn-copy-data" Oct 02 12:21:01 crc kubenswrapper[4766]: I1002 12:21:01.080075 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 02 12:21:01 crc kubenswrapper[4766]: I1002 12:21:01.080169 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f6f16bd8-c8d7-4576-bb54-d3815955b151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6f16bd8-c8d7-4576-bb54-d3815955b151\") pod \"ovn-copy-data\" (UID: \"59964a1b-9dfe-49fc-b2e7-6d6f30959b26\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d5dfed3d09646afd20553c4504caba6a8ac66b6d909d5a42c6087aeab4d38990/globalmount\"" pod="openstack/ovn-copy-data"
Oct 02 12:21:01 crc kubenswrapper[4766]: I1002 12:21:01.086913 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/59964a1b-9dfe-49fc-b2e7-6d6f30959b26-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"59964a1b-9dfe-49fc-b2e7-6d6f30959b26\") " pod="openstack/ovn-copy-data"
Oct 02 12:21:01 crc kubenswrapper[4766]: I1002 12:21:01.095904 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9w8g\" (UniqueName: \"kubernetes.io/projected/59964a1b-9dfe-49fc-b2e7-6d6f30959b26-kube-api-access-j9w8g\") pod \"ovn-copy-data\" (UID: \"59964a1b-9dfe-49fc-b2e7-6d6f30959b26\") " pod="openstack/ovn-copy-data"
Oct 02 12:21:01 crc kubenswrapper[4766]: I1002 12:21:01.114840 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f6f16bd8-c8d7-4576-bb54-d3815955b151\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f6f16bd8-c8d7-4576-bb54-d3815955b151\") pod \"ovn-copy-data\" (UID: \"59964a1b-9dfe-49fc-b2e7-6d6f30959b26\") " pod="openstack/ovn-copy-data"
Oct 02 12:21:01 crc kubenswrapper[4766]: I1002 12:21:01.207476 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data"
Oct 02 12:21:01 crc kubenswrapper[4766]: I1002 12:21:01.761863 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"]
Oct 02 12:21:01 crc kubenswrapper[4766]: W1002 12:21:01.773712 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59964a1b_9dfe_49fc_b2e7_6d6f30959b26.slice/crio-e34eaaef7d97ea0d353f2cc75e57d5c51d07a4632d1d29487d44adfa995dbb28 WatchSource:0}: Error finding container e34eaaef7d97ea0d353f2cc75e57d5c51d07a4632d1d29487d44adfa995dbb28: Status 404 returned error can't find the container with id e34eaaef7d97ea0d353f2cc75e57d5c51d07a4632d1d29487d44adfa995dbb28
Oct 02 12:21:02 crc kubenswrapper[4766]: I1002 12:21:02.729171 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"59964a1b-9dfe-49fc-b2e7-6d6f30959b26","Type":"ContainerStarted","Data":"5699db6901cb6ca0eab95b4f6c530e58db7bcaa073d11aaa30e5af65ab4f72ee"}
Oct 02 12:21:02 crc kubenswrapper[4766]: I1002 12:21:02.729767 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"59964a1b-9dfe-49fc-b2e7-6d6f30959b26","Type":"ContainerStarted","Data":"e34eaaef7d97ea0d353f2cc75e57d5c51d07a4632d1d29487d44adfa995dbb28"}
Oct 02 12:21:02 crc kubenswrapper[4766]: I1002 12:21:02.749478 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.749452845 podStartE2EDuration="3.749452845s" podCreationTimestamp="2025-10-02 12:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:21:02.748860896 +0000 UTC m=+5377.691731840" watchObservedRunningTime="2025-10-02 12:21:02.749452845 +0000 UTC m=+5377.692323789"
Oct 02 12:21:04 crc kubenswrapper[4766]: I1002 12:21:04.621862 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg"
Oct 02 12:21:04 crc kubenswrapper[4766]: I1002 12:21:04.683715 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-vjg7m"]
Oct 02 12:21:04 crc kubenswrapper[4766]: I1002 12:21:04.695271 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" podUID="b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9" containerName="dnsmasq-dns" containerID="cri-o://9b020ea37c7baf24cbf4c64d013b1d959ae4561cc3241bebff26b7ad136de938" gracePeriod=10
Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.156873 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m"
Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.165249 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-dns-svc\") pod \"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9\" (UID: \"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9\") "
Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.165331 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-config\") pod \"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9\" (UID: \"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9\") "
Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.165385 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh6pk\" (UniqueName: \"kubernetes.io/projected/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-kube-api-access-zh6pk\") pod \"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9\" (UID: \"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9\") "
Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.172769 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-kube-api-access-zh6pk" (OuterVolumeSpecName: "kube-api-access-zh6pk") pod "b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9" (UID: "b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9"). InnerVolumeSpecName "kube-api-access-zh6pk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.216839 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9" (UID: "b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.218932 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-config" (OuterVolumeSpecName: "config") pod "b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9" (UID: "b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.267654 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.267706 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-config\") on node \"crc\" DevicePath \"\""
Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.267721 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh6pk\" (UniqueName: \"kubernetes.io/projected/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9-kube-api-access-zh6pk\") on node \"crc\" DevicePath \"\""
Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.768449 4766 generic.go:334] "Generic (PLEG): container finished" podID="b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9" containerID="9b020ea37c7baf24cbf4c64d013b1d959ae4561cc3241bebff26b7ad136de938" exitCode=0
Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.768552 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" event={"ID":"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9","Type":"ContainerDied","Data":"9b020ea37c7baf24cbf4c64d013b1d959ae4561cc3241bebff26b7ad136de938"}
Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.769709 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" event={"ID":"b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9","Type":"ContainerDied","Data":"ddfde4ed3a78cb0bd6341dbcab24cdba28a982ca7ee8efc751b8d406d7625713"}
Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.768603 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m"
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-vjg7m" Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.769779 4766 scope.go:117] "RemoveContainer" containerID="9b020ea37c7baf24cbf4c64d013b1d959ae4561cc3241bebff26b7ad136de938" Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.790647 4766 scope.go:117] "RemoveContainer" containerID="f9e57669eda5c559501482777482036deab2a93cddb4849de48d7ba9f9d94570" Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.822438 4766 scope.go:117] "RemoveContainer" containerID="9b020ea37c7baf24cbf4c64d013b1d959ae4561cc3241bebff26b7ad136de938" Oct 02 12:21:05 crc kubenswrapper[4766]: E1002 12:21:05.823361 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b020ea37c7baf24cbf4c64d013b1d959ae4561cc3241bebff26b7ad136de938\": container with ID starting with 9b020ea37c7baf24cbf4c64d013b1d959ae4561cc3241bebff26b7ad136de938 not found: ID does not exist" containerID="9b020ea37c7baf24cbf4c64d013b1d959ae4561cc3241bebff26b7ad136de938" Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.823443 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b020ea37c7baf24cbf4c64d013b1d959ae4561cc3241bebff26b7ad136de938"} err="failed to get container status \"9b020ea37c7baf24cbf4c64d013b1d959ae4561cc3241bebff26b7ad136de938\": rpc error: code = NotFound desc = could not find container \"9b020ea37c7baf24cbf4c64d013b1d959ae4561cc3241bebff26b7ad136de938\": container with ID starting with 9b020ea37c7baf24cbf4c64d013b1d959ae4561cc3241bebff26b7ad136de938 not found: ID does not exist" Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.823495 4766 scope.go:117] "RemoveContainer" containerID="f9e57669eda5c559501482777482036deab2a93cddb4849de48d7ba9f9d94570" Oct 02 12:21:05 crc kubenswrapper[4766]: E1002 12:21:05.824055 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9e57669eda5c559501482777482036deab2a93cddb4849de48d7ba9f9d94570\": container with ID starting with f9e57669eda5c559501482777482036deab2a93cddb4849de48d7ba9f9d94570 not found: ID does not exist" containerID="f9e57669eda5c559501482777482036deab2a93cddb4849de48d7ba9f9d94570" Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.824108 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9e57669eda5c559501482777482036deab2a93cddb4849de48d7ba9f9d94570"} err="failed to get container status \"f9e57669eda5c559501482777482036deab2a93cddb4849de48d7ba9f9d94570\": rpc error: code = NotFound desc = could not find container \"f9e57669eda5c559501482777482036deab2a93cddb4849de48d7ba9f9d94570\": container with ID starting with f9e57669eda5c559501482777482036deab2a93cddb4849de48d7ba9f9d94570 not found: ID does not exist" Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.826973 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-vjg7m"] Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.839058 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-vjg7m"] Oct 02 12:21:05 crc kubenswrapper[4766]: I1002 12:21:05.896349 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9" path="/var/lib/kubelet/pods/b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9/volumes" Oct 02 12:21:07 crc kubenswrapper[4766]: I1002 12:21:07.749382 4766 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 02 12:21:07 crc kubenswrapper[4766]: E1002 12:21:07.750446 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9" containerName="dnsmasq-dns" Oct 02 12:21:07 crc kubenswrapper[4766]: I1002 12:21:07.750466 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9" containerName="dnsmasq-dns" Oct 02 12:21:07 crc kubenswrapper[4766]: E1002 12:21:07.750487 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9" containerName="init" Oct 02 12:21:07 crc kubenswrapper[4766]: I1002 12:21:07.750495 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9" containerName="init" Oct 02 12:21:07 crc kubenswrapper[4766]: I1002 12:21:07.750702 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7eb999f-bdc2-43c4-9e88-1d775fa3c5a9" containerName="dnsmasq-dns" Oct 02 12:21:07 crc kubenswrapper[4766]: I1002 12:21:07.751727 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 12:21:07 crc kubenswrapper[4766]: I1002 12:21:07.756233 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-6zg8p" Oct 02 12:21:07 crc kubenswrapper[4766]: I1002 12:21:07.757268 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 02 12:21:07 crc kubenswrapper[4766]: I1002 12:21:07.763558 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 02 12:21:07 crc kubenswrapper[4766]: I1002 12:21:07.783235 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 12:21:07 crc kubenswrapper[4766]: I1002 12:21:07.925805 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d182bc5-db60-4980-8df6-469f2efb5188-scripts\") pod \"ovn-northd-0\" (UID: \"0d182bc5-db60-4980-8df6-469f2efb5188\") " pod="openstack/ovn-northd-0" Oct 02 12:21:07 crc kubenswrapper[4766]: I1002 12:21:07.925971 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d182bc5-db60-4980-8df6-469f2efb5188-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0d182bc5-db60-4980-8df6-469f2efb5188\") " pod="openstack/ovn-northd-0" Oct 02 12:21:07 crc kubenswrapper[4766]: I1002 12:21:07.926027 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d182bc5-db60-4980-8df6-469f2efb5188-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0d182bc5-db60-4980-8df6-469f2efb5188\") " pod="openstack/ovn-northd-0" Oct 02 12:21:07 crc kubenswrapper[4766]: I1002 12:21:07.926080 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d182bc5-db60-4980-8df6-469f2efb5188-config\") pod \"ovn-northd-0\" (UID: \"0d182bc5-db60-4980-8df6-469f2efb5188\") " pod="openstack/ovn-northd-0" Oct 02 12:21:07 crc kubenswrapper[4766]: I1002 12:21:07.926115 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvh78\" (UniqueName: 
\"kubernetes.io/projected/0d182bc5-db60-4980-8df6-469f2efb5188-kube-api-access-fvh78\") pod \"ovn-northd-0\" (UID: \"0d182bc5-db60-4980-8df6-469f2efb5188\") " pod="openstack/ovn-northd-0" Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.027530 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d182bc5-db60-4980-8df6-469f2efb5188-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0d182bc5-db60-4980-8df6-469f2efb5188\") " pod="openstack/ovn-northd-0" Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.027622 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d182bc5-db60-4980-8df6-469f2efb5188-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0d182bc5-db60-4980-8df6-469f2efb5188\") " pod="openstack/ovn-northd-0" Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.027661 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d182bc5-db60-4980-8df6-469f2efb5188-config\") pod \"ovn-northd-0\" (UID: \"0d182bc5-db60-4980-8df6-469f2efb5188\") " pod="openstack/ovn-northd-0" Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.027694 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvh78\" (UniqueName: \"kubernetes.io/projected/0d182bc5-db60-4980-8df6-469f2efb5188-kube-api-access-fvh78\") pod \"ovn-northd-0\" (UID: \"0d182bc5-db60-4980-8df6-469f2efb5188\") " pod="openstack/ovn-northd-0" Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.027730 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d182bc5-db60-4980-8df6-469f2efb5188-scripts\") pod \"ovn-northd-0\" (UID: \"0d182bc5-db60-4980-8df6-469f2efb5188\") " pod="openstack/ovn-northd-0" Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.028463 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0d182bc5-db60-4980-8df6-469f2efb5188-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"0d182bc5-db60-4980-8df6-469f2efb5188\") " pod="openstack/ovn-northd-0" Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.028880 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0d182bc5-db60-4980-8df6-469f2efb5188-scripts\") pod \"ovn-northd-0\" (UID: \"0d182bc5-db60-4980-8df6-469f2efb5188\") " pod="openstack/ovn-northd-0" Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.028968 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d182bc5-db60-4980-8df6-469f2efb5188-config\") pod \"ovn-northd-0\" (UID: \"0d182bc5-db60-4980-8df6-469f2efb5188\") " pod="openstack/ovn-northd-0" Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.037613 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d182bc5-db60-4980-8df6-469f2efb5188-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"0d182bc5-db60-4980-8df6-469f2efb5188\") " pod="openstack/ovn-northd-0" Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.072729 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvh78\" (UniqueName: 
\"kubernetes.io/projected/0d182bc5-db60-4980-8df6-469f2efb5188-kube-api-access-fvh78\") pod \"ovn-northd-0\" (UID: \"0d182bc5-db60-4980-8df6-469f2efb5188\") " pod="openstack/ovn-northd-0" Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.087223 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.416453 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.805448 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0d182bc5-db60-4980-8df6-469f2efb5188","Type":"ContainerStarted","Data":"8267fe96e3e015b1c130603f985a5e307cbb773eddbc0540e94fc22ed8d518b6"} Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.805497 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0d182bc5-db60-4980-8df6-469f2efb5188","Type":"ContainerStarted","Data":"707d7978d3838428cf6ebcb2c81bc19a6f72657bf06a7a951453d979a1e0aec7"} Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.805524 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"0d182bc5-db60-4980-8df6-469f2efb5188","Type":"ContainerStarted","Data":"147b2e5248da36f7c946b1e93ab30a763a536acf6a9bb99ff88370b7e59e19dc"} Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.805632 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 02 12:21:08 crc kubenswrapper[4766]: I1002 12:21:08.829351 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.829326781 podStartE2EDuration="1.829326781s" podCreationTimestamp="2025-10-02 12:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:21:08.82490492 +0000 UTC m=+5383.767775864" watchObservedRunningTime="2025-10-02 12:21:08.829326781 +0000 UTC m=+5383.772197725" Oct 02 12:21:08 crc kubenswrapper[4766]: E1002 12:21:08.939780 4766 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.200:45468->38.129.56.200:32845: write tcp 38.129.56.200:45468->38.129.56.200:32845: write: broken pipe Oct 02 12:21:13 crc kubenswrapper[4766]: I1002 12:21:13.139978 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-rsxk9"] Oct 02 12:21:13 crc kubenswrapper[4766]: I1002 12:21:13.142219 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-rsxk9" Oct 02 12:21:13 crc kubenswrapper[4766]: I1002 12:21:13.154887 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rsxk9"] Oct 02 12:21:13 crc kubenswrapper[4766]: I1002 12:21:13.327713 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7v96\" (UniqueName: \"kubernetes.io/projected/7b184e35-c4de-43ab-afe1-439ce0de43ab-kube-api-access-s7v96\") pod \"keystone-db-create-rsxk9\" (UID: \"7b184e35-c4de-43ab-afe1-439ce0de43ab\") " pod="openstack/keystone-db-create-rsxk9" Oct 02 12:21:13 crc kubenswrapper[4766]: I1002 12:21:13.430152 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7v96\" (UniqueName: \"kubernetes.io/projected/7b184e35-c4de-43ab-afe1-439ce0de43ab-kube-api-access-s7v96\") pod \"keystone-db-create-rsxk9\" (UID: \"7b184e35-c4de-43ab-afe1-439ce0de43ab\") " pod="openstack/keystone-db-create-rsxk9" Oct 02 12:21:13 crc kubenswrapper[4766]: I1002 12:21:13.451010 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7v96\" (UniqueName: \"kubernetes.io/projected/7b184e35-c4de-43ab-afe1-439ce0de43ab-kube-api-access-s7v96\") pod \"keystone-db-create-rsxk9\" (UID: \"7b184e35-c4de-43ab-afe1-439ce0de43ab\") " pod="openstack/keystone-db-create-rsxk9" Oct 02 12:21:13 crc kubenswrapper[4766]: I1002 12:21:13.491035 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rsxk9" Oct 02 12:21:13 crc kubenswrapper[4766]: I1002 12:21:13.940946 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rsxk9"] Oct 02 12:21:13 crc kubenswrapper[4766]: W1002 12:21:13.948619 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b184e35_c4de_43ab_afe1_439ce0de43ab.slice/crio-347b66bc959781a7ebfa2a597f90620aedbe48369475230d8859a62987413515 WatchSource:0}: Error finding container 347b66bc959781a7ebfa2a597f90620aedbe48369475230d8859a62987413515: Status 404 returned error can't find the container with id 347b66bc959781a7ebfa2a597f90620aedbe48369475230d8859a62987413515 Oct 02 12:21:14 crc kubenswrapper[4766]: I1002 12:21:14.863427 4766 generic.go:334] "Generic (PLEG): container finished" podID="7b184e35-c4de-43ab-afe1-439ce0de43ab" containerID="85f9c4808df8eb59856f69020186fb2783f9e1d86173430f018310f898750ac7" exitCode=0 Oct 02 12:21:14 crc kubenswrapper[4766]: I1002 12:21:14.863552 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rsxk9" event={"ID":"7b184e35-c4de-43ab-afe1-439ce0de43ab","Type":"ContainerDied","Data":"85f9c4808df8eb59856f69020186fb2783f9e1d86173430f018310f898750ac7"} Oct 02 12:21:14 crc kubenswrapper[4766]: I1002 12:21:14.865836 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rsxk9" event={"ID":"7b184e35-c4de-43ab-afe1-439ce0de43ab","Type":"ContainerStarted","Data":"347b66bc959781a7ebfa2a597f90620aedbe48369475230d8859a62987413515"} Oct 02 12:21:16 crc kubenswrapper[4766]: I1002 12:21:16.199289 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-rsxk9" Oct 02 12:21:16 crc kubenswrapper[4766]: I1002 12:21:16.384407 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7v96\" (UniqueName: \"kubernetes.io/projected/7b184e35-c4de-43ab-afe1-439ce0de43ab-kube-api-access-s7v96\") pod \"7b184e35-c4de-43ab-afe1-439ce0de43ab\" (UID: \"7b184e35-c4de-43ab-afe1-439ce0de43ab\") " Oct 02 12:21:16 crc kubenswrapper[4766]: I1002 12:21:16.391693 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b184e35-c4de-43ab-afe1-439ce0de43ab-kube-api-access-s7v96" (OuterVolumeSpecName: "kube-api-access-s7v96") pod "7b184e35-c4de-43ab-afe1-439ce0de43ab" (UID: "7b184e35-c4de-43ab-afe1-439ce0de43ab"). InnerVolumeSpecName "kube-api-access-s7v96". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:21:16 crc kubenswrapper[4766]: I1002 12:21:16.487335 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7v96\" (UniqueName: \"kubernetes.io/projected/7b184e35-c4de-43ab-afe1-439ce0de43ab-kube-api-access-s7v96\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:16 crc kubenswrapper[4766]: I1002 12:21:16.886810 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rsxk9" event={"ID":"7b184e35-c4de-43ab-afe1-439ce0de43ab","Type":"ContainerDied","Data":"347b66bc959781a7ebfa2a597f90620aedbe48369475230d8859a62987413515"} Oct 02 12:21:16 crc kubenswrapper[4766]: I1002 12:21:16.886868 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rsxk9" Oct 02 12:21:16 crc kubenswrapper[4766]: I1002 12:21:16.886868 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="347b66bc959781a7ebfa2a597f90620aedbe48369475230d8859a62987413515" Oct 02 12:21:18 crc kubenswrapper[4766]: I1002 12:21:18.149136 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 02 12:21:23 crc kubenswrapper[4766]: I1002 12:21:23.157635 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-a884-account-create-gvlx4"] Oct 02 12:21:23 crc kubenswrapper[4766]: E1002 12:21:23.158850 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b184e35-c4de-43ab-afe1-439ce0de43ab" containerName="mariadb-database-create" Oct 02 12:21:23 crc kubenswrapper[4766]: I1002 12:21:23.158864 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b184e35-c4de-43ab-afe1-439ce0de43ab" containerName="mariadb-database-create" Oct 02 12:21:23 crc kubenswrapper[4766]: I1002 12:21:23.159037 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b184e35-c4de-43ab-afe1-439ce0de43ab" containerName="mariadb-database-create" Oct 02 12:21:23 crc kubenswrapper[4766]: I1002 12:21:23.159748 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a884-account-create-gvlx4" Oct 02 12:21:23 crc kubenswrapper[4766]: I1002 12:21:23.166659 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 02 12:21:23 crc kubenswrapper[4766]: I1002 12:21:23.169876 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a884-account-create-gvlx4"] Oct 02 12:21:23 crc kubenswrapper[4766]: I1002 12:21:23.322635 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcjvl\" (UniqueName: \"kubernetes.io/projected/8a8ac68a-691d-4b52-9f2b-7343fe201c62-kube-api-access-kcjvl\") pod \"keystone-a884-account-create-gvlx4\" (UID: \"8a8ac68a-691d-4b52-9f2b-7343fe201c62\") " pod="openstack/keystone-a884-account-create-gvlx4" Oct 02 12:21:23 crc kubenswrapper[4766]: I1002 12:21:23.424910 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcjvl\" (UniqueName: \"kubernetes.io/projected/8a8ac68a-691d-4b52-9f2b-7343fe201c62-kube-api-access-kcjvl\") pod \"keystone-a884-account-create-gvlx4\" (UID: \"8a8ac68a-691d-4b52-9f2b-7343fe201c62\") " pod="openstack/keystone-a884-account-create-gvlx4" Oct 02 12:21:23 crc kubenswrapper[4766]: I1002 12:21:23.447873 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcjvl\" (UniqueName: \"kubernetes.io/projected/8a8ac68a-691d-4b52-9f2b-7343fe201c62-kube-api-access-kcjvl\") pod \"keystone-a884-account-create-gvlx4\" (UID: \"8a8ac68a-691d-4b52-9f2b-7343fe201c62\") " pod="openstack/keystone-a884-account-create-gvlx4" Oct 02 12:21:23 crc kubenswrapper[4766]: I1002 12:21:23.481826 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a884-account-create-gvlx4" Oct 02 12:21:23 crc kubenswrapper[4766]: I1002 12:21:23.909364 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a884-account-create-gvlx4"] Oct 02 12:21:23 crc kubenswrapper[4766]: W1002 12:21:23.916665 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a8ac68a_691d_4b52_9f2b_7343fe201c62.slice/crio-897ffea7808c955e8ddaa4d770b654c2df3d0b228c799b2ff5b48075af341e7a WatchSource:0}: Error finding container 897ffea7808c955e8ddaa4d770b654c2df3d0b228c799b2ff5b48075af341e7a: Status 404 returned error can't find the container with id 897ffea7808c955e8ddaa4d770b654c2df3d0b228c799b2ff5b48075af341e7a Oct 02 12:21:23 crc kubenswrapper[4766]: I1002 12:21:23.945029 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a884-account-create-gvlx4" event={"ID":"8a8ac68a-691d-4b52-9f2b-7343fe201c62","Type":"ContainerStarted","Data":"897ffea7808c955e8ddaa4d770b654c2df3d0b228c799b2ff5b48075af341e7a"} Oct 02 12:21:24 crc kubenswrapper[4766]: I1002 12:21:24.953536 4766 generic.go:334] "Generic (PLEG): container finished" podID="8a8ac68a-691d-4b52-9f2b-7343fe201c62" containerID="f152fe8f4dd1181b805f35d64d5d7a68433133d0624fa6882386389e9b4e4373" exitCode=0 Oct 02 12:21:24 crc kubenswrapper[4766]: I1002 12:21:24.953580 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a884-account-create-gvlx4" event={"ID":"8a8ac68a-691d-4b52-9f2b-7343fe201c62","Type":"ContainerDied","Data":"f152fe8f4dd1181b805f35d64d5d7a68433133d0624fa6882386389e9b4e4373"} Oct 02 12:21:26 crc kubenswrapper[4766]: I1002 12:21:26.273554 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a884-account-create-gvlx4" Oct 02 12:21:26 crc kubenswrapper[4766]: I1002 12:21:26.378328 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcjvl\" (UniqueName: \"kubernetes.io/projected/8a8ac68a-691d-4b52-9f2b-7343fe201c62-kube-api-access-kcjvl\") pod \"8a8ac68a-691d-4b52-9f2b-7343fe201c62\" (UID: \"8a8ac68a-691d-4b52-9f2b-7343fe201c62\") " Oct 02 12:21:26 crc kubenswrapper[4766]: I1002 12:21:26.384664 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a8ac68a-691d-4b52-9f2b-7343fe201c62-kube-api-access-kcjvl" (OuterVolumeSpecName: "kube-api-access-kcjvl") pod "8a8ac68a-691d-4b52-9f2b-7343fe201c62" (UID: "8a8ac68a-691d-4b52-9f2b-7343fe201c62"). InnerVolumeSpecName "kube-api-access-kcjvl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:21:26 crc kubenswrapper[4766]: I1002 12:21:26.480382 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcjvl\" (UniqueName: \"kubernetes.io/projected/8a8ac68a-691d-4b52-9f2b-7343fe201c62-kube-api-access-kcjvl\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:26 crc kubenswrapper[4766]: I1002 12:21:26.976257 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a884-account-create-gvlx4" event={"ID":"8a8ac68a-691d-4b52-9f2b-7343fe201c62","Type":"ContainerDied","Data":"897ffea7808c955e8ddaa4d770b654c2df3d0b228c799b2ff5b48075af341e7a"} Oct 02 12:21:26 crc kubenswrapper[4766]: I1002 12:21:26.976312 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="897ffea7808c955e8ddaa4d770b654c2df3d0b228c799b2ff5b48075af341e7a" Oct 02 12:21:26 crc kubenswrapper[4766]: I1002 12:21:26.976388 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a884-account-create-gvlx4" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.537895 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-th9pl"] Oct 02 12:21:28 crc kubenswrapper[4766]: E1002 12:21:28.540434 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8ac68a-691d-4b52-9f2b-7343fe201c62" containerName="mariadb-account-create" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.540467 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8ac68a-691d-4b52-9f2b-7343fe201c62" containerName="mariadb-account-create" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.540707 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a8ac68a-691d-4b52-9f2b-7343fe201c62" containerName="mariadb-account-create" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.541463 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-th9pl" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.545046 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.545128 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.545178 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.545394 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zl29b" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.551669 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-th9pl"] Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.724610 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78bcl\" (UniqueName: \"kubernetes.io/projected/85a70b5d-e9b9-4ff9-a10c-33f95a054491-kube-api-access-78bcl\") pod \"keystone-db-sync-th9pl\" (UID: \"85a70b5d-e9b9-4ff9-a10c-33f95a054491\") " pod="openstack/keystone-db-sync-th9pl" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.724880 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a70b5d-e9b9-4ff9-a10c-33f95a054491-combined-ca-bundle\") pod \"keystone-db-sync-th9pl\" (UID: \"85a70b5d-e9b9-4ff9-a10c-33f95a054491\") " pod="openstack/keystone-db-sync-th9pl" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.725061 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a70b5d-e9b9-4ff9-a10c-33f95a054491-config-data\") pod \"keystone-db-sync-th9pl\" (UID: \"85a70b5d-e9b9-4ff9-a10c-33f95a054491\") " pod="openstack/keystone-db-sync-th9pl" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.826824 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78bcl\" (UniqueName: \"kubernetes.io/projected/85a70b5d-e9b9-4ff9-a10c-33f95a054491-kube-api-access-78bcl\") pod \"keystone-db-sync-th9pl\" (UID: \"85a70b5d-e9b9-4ff9-a10c-33f95a054491\") " pod="openstack/keystone-db-sync-th9pl" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.826949 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a70b5d-e9b9-4ff9-a10c-33f95a054491-combined-ca-bundle\") pod \"keystone-db-sync-th9pl\" (UID: \"85a70b5d-e9b9-4ff9-a10c-33f95a054491\") " pod="openstack/keystone-db-sync-th9pl" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.826993 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a70b5d-e9b9-4ff9-a10c-33f95a054491-config-data\") pod \"keystone-db-sync-th9pl\" (UID: \"85a70b5d-e9b9-4ff9-a10c-33f95a054491\") " pod="openstack/keystone-db-sync-th9pl" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.833949 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a70b5d-e9b9-4ff9-a10c-33f95a054491-config-data\") pod \"keystone-db-sync-th9pl\" (UID: \"85a70b5d-e9b9-4ff9-a10c-33f95a054491\") " 
pod="openstack/keystone-db-sync-th9pl" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.836157 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a70b5d-e9b9-4ff9-a10c-33f95a054491-combined-ca-bundle\") pod \"keystone-db-sync-th9pl\" (UID: \"85a70b5d-e9b9-4ff9-a10c-33f95a054491\") " pod="openstack/keystone-db-sync-th9pl" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.847224 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78bcl\" (UniqueName: \"kubernetes.io/projected/85a70b5d-e9b9-4ff9-a10c-33f95a054491-kube-api-access-78bcl\") pod \"keystone-db-sync-th9pl\" (UID: \"85a70b5d-e9b9-4ff9-a10c-33f95a054491\") " pod="openstack/keystone-db-sync-th9pl" Oct 02 12:21:28 crc kubenswrapper[4766]: I1002 12:21:28.883340 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-th9pl" Oct 02 12:21:29 crc kubenswrapper[4766]: I1002 12:21:29.368691 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-th9pl"] Oct 02 12:21:29 crc kubenswrapper[4766]: W1002 12:21:29.374654 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85a70b5d_e9b9_4ff9_a10c_33f95a054491.slice/crio-fecf233bbedb351d9b97a0f6a96e1df9a21058840d5fb967f72f5bf60b0308c1 WatchSource:0}: Error finding container fecf233bbedb351d9b97a0f6a96e1df9a21058840d5fb967f72f5bf60b0308c1: Status 404 returned error can't find the container with id fecf233bbedb351d9b97a0f6a96e1df9a21058840d5fb967f72f5bf60b0308c1 Oct 02 12:21:30 crc kubenswrapper[4766]: I1002 12:21:30.004528 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-th9pl" event={"ID":"85a70b5d-e9b9-4ff9-a10c-33f95a054491","Type":"ContainerStarted","Data":"711e5bfa63fa16d5c228a12b4a0e9ca42cf3d441d86f57ea0f5694c80bda342e"} Oct 02 12:21:30 crc kubenswrapper[4766]: I1002 12:21:30.005106 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-th9pl" event={"ID":"85a70b5d-e9b9-4ff9-a10c-33f95a054491","Type":"ContainerStarted","Data":"fecf233bbedb351d9b97a0f6a96e1df9a21058840d5fb967f72f5bf60b0308c1"} Oct 02 12:21:30 crc kubenswrapper[4766]: I1002 12:21:30.027790 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-th9pl" podStartSLOduration=2.027759437 podStartE2EDuration="2.027759437s" podCreationTimestamp="2025-10-02 12:21:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:21:30.024472342 +0000 UTC m=+5404.967343286" watchObservedRunningTime="2025-10-02 12:21:30.027759437 +0000 UTC m=+5404.970630391" Oct 02 12:21:32 crc kubenswrapper[4766]: I1002 12:21:32.029209 4766 generic.go:334] "Generic (PLEG): container finished" podID="85a70b5d-e9b9-4ff9-a10c-33f95a054491" containerID="711e5bfa63fa16d5c228a12b4a0e9ca42cf3d441d86f57ea0f5694c80bda342e" exitCode=0 Oct 02 12:21:32 crc kubenswrapper[4766]: I1002 12:21:32.029253 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-th9pl" event={"ID":"85a70b5d-e9b9-4ff9-a10c-33f95a054491","Type":"ContainerDied","Data":"711e5bfa63fa16d5c228a12b4a0e9ca42cf3d441d86f57ea0f5694c80bda342e"} Oct 02 12:21:33 crc kubenswrapper[4766]: I1002 12:21:33.376984 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-th9pl" Oct 02 12:21:33 crc kubenswrapper[4766]: I1002 12:21:33.513561 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a70b5d-e9b9-4ff9-a10c-33f95a054491-combined-ca-bundle\") pod \"85a70b5d-e9b9-4ff9-a10c-33f95a054491\" (UID: \"85a70b5d-e9b9-4ff9-a10c-33f95a054491\") " Oct 02 12:21:33 crc kubenswrapper[4766]: I1002 12:21:33.513749 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78bcl\" (UniqueName: \"kubernetes.io/projected/85a70b5d-e9b9-4ff9-a10c-33f95a054491-kube-api-access-78bcl\") pod \"85a70b5d-e9b9-4ff9-a10c-33f95a054491\" (UID: \"85a70b5d-e9b9-4ff9-a10c-33f95a054491\") " Oct 02 12:21:33 crc kubenswrapper[4766]: I1002 12:21:33.513802 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a70b5d-e9b9-4ff9-a10c-33f95a054491-config-data\") pod \"85a70b5d-e9b9-4ff9-a10c-33f95a054491\" (UID: \"85a70b5d-e9b9-4ff9-a10c-33f95a054491\") " Oct 02 12:21:33 crc kubenswrapper[4766]: I1002 12:21:33.519470 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85a70b5d-e9b9-4ff9-a10c-33f95a054491-kube-api-access-78bcl" (OuterVolumeSpecName: "kube-api-access-78bcl") pod "85a70b5d-e9b9-4ff9-a10c-33f95a054491" (UID: "85a70b5d-e9b9-4ff9-a10c-33f95a054491"). InnerVolumeSpecName "kube-api-access-78bcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:21:33 crc kubenswrapper[4766]: I1002 12:21:33.538972 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85a70b5d-e9b9-4ff9-a10c-33f95a054491-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85a70b5d-e9b9-4ff9-a10c-33f95a054491" (UID: "85a70b5d-e9b9-4ff9-a10c-33f95a054491"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:21:33 crc kubenswrapper[4766]: I1002 12:21:33.559837 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85a70b5d-e9b9-4ff9-a10c-33f95a054491-config-data" (OuterVolumeSpecName: "config-data") pod "85a70b5d-e9b9-4ff9-a10c-33f95a054491" (UID: "85a70b5d-e9b9-4ff9-a10c-33f95a054491"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:21:33 crc kubenswrapper[4766]: I1002 12:21:33.616451 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78bcl\" (UniqueName: \"kubernetes.io/projected/85a70b5d-e9b9-4ff9-a10c-33f95a054491-kube-api-access-78bcl\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:33 crc kubenswrapper[4766]: I1002 12:21:33.616558 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a70b5d-e9b9-4ff9-a10c-33f95a054491-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:33 crc kubenswrapper[4766]: I1002 12:21:33.616577 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a70b5d-e9b9-4ff9-a10c-33f95a054491-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.047552 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-th9pl" event={"ID":"85a70b5d-e9b9-4ff9-a10c-33f95a054491","Type":"ContainerDied","Data":"fecf233bbedb351d9b97a0f6a96e1df9a21058840d5fb967f72f5bf60b0308c1"} Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.047606 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-th9pl" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.047615 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fecf233bbedb351d9b97a0f6a96e1df9a21058840d5fb967f72f5bf60b0308c1" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.267304 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c568f7c8f-8x8gb"] Oct 02 12:21:34 crc kubenswrapper[4766]: E1002 12:21:34.267670 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a70b5d-e9b9-4ff9-a10c-33f95a054491" containerName="keystone-db-sync" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.267689 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a70b5d-e9b9-4ff9-a10c-33f95a054491" containerName="keystone-db-sync" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.267873 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="85a70b5d-e9b9-4ff9-a10c-33f95a054491" containerName="keystone-db-sync" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.268745 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.296809 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c568f7c8f-8x8gb"] Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.347927 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7bszm"] Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.349186 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.352413 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.352846 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.352967 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zl29b" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.353015 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.360612 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7bszm"] Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.430448 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-combined-ca-bundle\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.430582 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxg8h\" (UniqueName: \"kubernetes.io/projected/82a33696-df5c-4529-ae67-a7b78bd20819-kube-api-access-vxg8h\") pod \"dnsmasq-dns-6c568f7c8f-8x8gb\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") " pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.430611 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-config\") pod \"dnsmasq-dns-6c568f7c8f-8x8gb\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") " pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.430650 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-config-data\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.430671 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-credential-keys\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.430725 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-fernet-keys\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.430766 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-scripts\") pod 
\"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.430788 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-dns-svc\") pod \"dnsmasq-dns-6c568f7c8f-8x8gb\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") " pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.430838 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-ovsdbserver-nb\") pod \"dnsmasq-dns-6c568f7c8f-8x8gb\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") " pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.430914 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-ovsdbserver-sb\") pod \"dnsmasq-dns-6c568f7c8f-8x8gb\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") " pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.431000 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54lk5\" (UniqueName: \"kubernetes.io/projected/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-kube-api-access-54lk5\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.532757 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-fernet-keys\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.533114 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-scripts\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.533151 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-dns-svc\") pod \"dnsmasq-dns-6c568f7c8f-8x8gb\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") " pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.533188 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-ovsdbserver-nb\") pod \"dnsmasq-dns-6c568f7c8f-8x8gb\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") " pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.533238 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-ovsdbserver-sb\") pod \"dnsmasq-dns-6c568f7c8f-8x8gb\" (UID: 
\"82a33696-df5c-4529-ae67-a7b78bd20819\") " pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.533290 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54lk5\" (UniqueName: \"kubernetes.io/projected/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-kube-api-access-54lk5\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.533318 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-combined-ca-bundle\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.533369 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxg8h\" (UniqueName: \"kubernetes.io/projected/82a33696-df5c-4529-ae67-a7b78bd20819-kube-api-access-vxg8h\") pod \"dnsmasq-dns-6c568f7c8f-8x8gb\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") " pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.533429 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-config\") pod \"dnsmasq-dns-6c568f7c8f-8x8gb\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") " pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.533469 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-config-data\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.533496 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-credential-keys\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.535020 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-dns-svc\") pod \"dnsmasq-dns-6c568f7c8f-8x8gb\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") " pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.535662 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-ovsdbserver-sb\") pod \"dnsmasq-dns-6c568f7c8f-8x8gb\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") " pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.535680 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-config\") pod \"dnsmasq-dns-6c568f7c8f-8x8gb\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") " pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: 
I1002 12:21:34.536065 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-ovsdbserver-nb\") pod \"dnsmasq-dns-6c568f7c8f-8x8gb\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") " pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.538700 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-scripts\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.538699 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-fernet-keys\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.539056 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-config-data\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.539419 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-combined-ca-bundle\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.539916 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-credential-keys\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.554441 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54lk5\" (UniqueName: \"kubernetes.io/projected/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-kube-api-access-54lk5\") pod \"keystone-bootstrap-7bszm\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.554491 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxg8h\" (UniqueName: \"kubernetes.io/projected/82a33696-df5c-4529-ae67-a7b78bd20819-kube-api-access-vxg8h\") pod \"dnsmasq-dns-6c568f7c8f-8x8gb\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") " pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.594586 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:34 crc kubenswrapper[4766]: I1002 12:21:34.667100 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:35 crc kubenswrapper[4766]: I1002 12:21:35.117188 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c568f7c8f-8x8gb"] Oct 02 12:21:35 crc kubenswrapper[4766]: I1002 12:21:35.180894 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7bszm"] Oct 02 12:21:35 crc kubenswrapper[4766]: I1002 12:21:35.198737 4766 scope.go:117] "RemoveContainer" containerID="c1dd8fd73a1d6e341f575513e3287e186f929214855a53c7c7cc4f9488222795" Oct 02 12:21:35 crc kubenswrapper[4766]: I1002 12:21:35.227677 4766 scope.go:117] "RemoveContainer" containerID="eed742e9bb0bbc991e9b8aa57e14717cdf90bf437717ecf42cd6d2afcd0386b9" Oct 02 12:21:36 crc kubenswrapper[4766]: I1002 12:21:36.065779 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7bszm" event={"ID":"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec","Type":"ContainerStarted","Data":"99fa510c7eea63d8fb340958cb18f6ca7fcbb65cf925fc8b16c3f5214a18c806"} Oct 02 12:21:36 crc kubenswrapper[4766]: I1002 12:21:36.066255 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7bszm" event={"ID":"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec","Type":"ContainerStarted","Data":"6bbcb215c85a5f073645fda19b5fcc599faf169fd12b207e49295375bb5230d2"} Oct 02 12:21:36 crc kubenswrapper[4766]: I1002 12:21:36.068401 4766 generic.go:334] "Generic (PLEG): container finished" podID="82a33696-df5c-4529-ae67-a7b78bd20819" containerID="8fd7eee6f8257c95f3196e47492951ec33e9abb744d4234d572227644c576b88" exitCode=0 Oct 02 12:21:36 crc kubenswrapper[4766]: I1002 12:21:36.068461 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" event={"ID":"82a33696-df5c-4529-ae67-a7b78bd20819","Type":"ContainerDied","Data":"8fd7eee6f8257c95f3196e47492951ec33e9abb744d4234d572227644c576b88"} Oct 02 12:21:36 crc kubenswrapper[4766]: I1002 12:21:36.068496 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" event={"ID":"82a33696-df5c-4529-ae67-a7b78bd20819","Type":"ContainerStarted","Data":"42eab045605f5334806d3d0cf0d16fd6e00f2444168b26af254dfeb0f44a8a5e"} Oct 02 12:21:36 crc kubenswrapper[4766]: I1002 12:21:36.087469 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7bszm" podStartSLOduration=2.087441 podStartE2EDuration="2.087441s" podCreationTimestamp="2025-10-02 12:21:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:21:36.083961708 +0000 UTC m=+5411.026832652" watchObservedRunningTime="2025-10-02 12:21:36.087441 +0000 UTC m=+5411.030311944" Oct 02 12:21:37 crc kubenswrapper[4766]: I1002 12:21:37.085993 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" event={"ID":"82a33696-df5c-4529-ae67-a7b78bd20819","Type":"ContainerStarted","Data":"8d3312ef5427078d677b8e92dee643b76437d3f40f10f0597ef740cb6427a7d6"} Oct 02 12:21:37 crc kubenswrapper[4766]: I1002 12:21:37.117253 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" podStartSLOduration=3.11723111 podStartE2EDuration="3.11723111s" podCreationTimestamp="2025-10-02 12:21:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:21:37.109781881 +0000 UTC m=+5412.052652835" watchObservedRunningTime="2025-10-02 12:21:37.11723111 +0000 UTC m=+5412.060102054"
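
The two "Observed pod startup duration" entries above expose the startup SLO figure as a plain key=value field (podStartSLOduration). A minimal Python sketch for pulling those figures out of a journal that has been exported to a text file; the file name kubelet.log and the helper name startup_durations are illustrative, not part of any kubelet tooling:

    import re
    import sys

    # Matches: "Observed pod startup duration" pod="<ns>/<name>" podStartSLOduration=<seconds>
    PATTERN = re.compile(
        r'"Observed pod startup duration" pod="(?P<pod>[^"]+)" podStartSLOduration=(?P<slo>[0-9.]+)'
    )

    def startup_durations(path):
        """Yield (pod, seconds) for every startup-latency record in the exported journal."""
        with open(path, encoding="utf-8", errors="replace") as f:
            for line in f:
                for m in PATTERN.finditer(line):  # finditer tolerates several entries per line
                    yield m["pod"], float(m["slo"])

    if __name__ == "__main__":
        for pod, seconds in startup_durations(sys.argv[1]):  # e.g. kubelet.log
            print(f"{pod}\t{seconds:.3f}s")

Run against the entries above, it would report roughly 2.087s for openstack/keystone-bootstrap-7bszm and 3.117s for openstack/dnsmasq-dns-6c568f7c8f-8x8gb.
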
observedRunningTime="2025-10-02 12:21:37.109781881 +0000 UTC m=+5412.052652835" watchObservedRunningTime="2025-10-02 12:21:37.11723111 +0000 UTC m=+5412.060102054" Oct 02 12:21:38 crc kubenswrapper[4766]: I1002 12:21:38.094797 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:40 crc kubenswrapper[4766]: I1002 12:21:40.117230 4766 generic.go:334] "Generic (PLEG): container finished" podID="6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec" containerID="99fa510c7eea63d8fb340958cb18f6ca7fcbb65cf925fc8b16c3f5214a18c806" exitCode=0 Oct 02 12:21:40 crc kubenswrapper[4766]: I1002 12:21:40.117405 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7bszm" event={"ID":"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec","Type":"ContainerDied","Data":"99fa510c7eea63d8fb340958cb18f6ca7fcbb65cf925fc8b16c3f5214a18c806"} Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.498317 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.670635 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54lk5\" (UniqueName: \"kubernetes.io/projected/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-kube-api-access-54lk5\") pod \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.670717 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-fernet-keys\") pod \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.670747 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-combined-ca-bundle\") pod \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.670854 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-credential-keys\") pod \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.670894 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-config-data\") pod \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.670940 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-scripts\") pod \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\" (UID: \"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec\") " Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.677363 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec" (UID: "6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec"). 
InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.677650 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-scripts" (OuterVolumeSpecName: "scripts") pod "6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec" (UID: "6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.677805 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec" (UID: "6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.677968 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-kube-api-access-54lk5" (OuterVolumeSpecName: "kube-api-access-54lk5") pod "6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec" (UID: "6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec"). InnerVolumeSpecName "kube-api-access-54lk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.700603 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec" (UID: "6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.701143 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-config-data" (OuterVolumeSpecName: "config-data") pod "6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec" (UID: "6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.774157 4766 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.774191 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.774200 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.774209 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54lk5\" (UniqueName: \"kubernetes.io/projected/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-kube-api-access-54lk5\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.774219 4766 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:41 crc kubenswrapper[4766]: I1002 12:21:41.774230 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.147672 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7bszm" event={"ID":"6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec","Type":"ContainerDied","Data":"6bbcb215c85a5f073645fda19b5fcc599faf169fd12b207e49295375bb5230d2"} Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.147720 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bbcb215c85a5f073645fda19b5fcc599faf169fd12b207e49295375bb5230d2" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.147790 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-7bszm" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.230855 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7bszm"] Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.237397 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7bszm"] Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.316992 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-qsk9z"] Oct 02 12:21:42 crc kubenswrapper[4766]: E1002 12:21:42.317413 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec" containerName="keystone-bootstrap" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.317437 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec" containerName="keystone-bootstrap" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.317627 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec" containerName="keystone-bootstrap" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.318296 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.321571 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.321768 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.321915 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zl29b" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.322114 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.343658 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qsk9z"] Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.489241 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-config-data\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.489369 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-combined-ca-bundle\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.489443 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-credential-keys\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.489521 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-fernet-keys\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.489594 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-scripts\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.489776 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4glq\" (UniqueName: \"kubernetes.io/projected/20ea34ce-4dc9-4338-b967-8b43085a24e5-kube-api-access-v4glq\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.591165 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-scripts\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.591239 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4glq\" (UniqueName: \"kubernetes.io/projected/20ea34ce-4dc9-4338-b967-8b43085a24e5-kube-api-access-v4glq\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.591306 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-config-data\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.591351 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-combined-ca-bundle\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.591373 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-credential-keys\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.591398 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-fernet-keys\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.609979 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-scripts\") pod \"keystone-bootstrap-qsk9z\" (UID: 
\"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.610100 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-config-data\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.610172 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-fernet-keys\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.614020 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-combined-ca-bundle\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.622266 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4glq\" (UniqueName: \"kubernetes.io/projected/20ea34ce-4dc9-4338-b967-8b43085a24e5-kube-api-access-v4glq\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.623031 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-credential-keys\") pod \"keystone-bootstrap-qsk9z\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:42 crc kubenswrapper[4766]: I1002 12:21:42.638551 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:43 crc kubenswrapper[4766]: I1002 12:21:43.090200 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-qsk9z"] Oct 02 12:21:43 crc kubenswrapper[4766]: I1002 12:21:43.157008 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qsk9z" event={"ID":"20ea34ce-4dc9-4338-b967-8b43085a24e5","Type":"ContainerStarted","Data":"35a11445478782a971a1f27007225076ba8feb8a651e420fabb91047c9af4bd7"} Oct 02 12:21:43 crc kubenswrapper[4766]: I1002 12:21:43.892363 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec" path="/var/lib/kubelet/pods/6ac7f7ff-85e4-4af4-bdf9-98b97fade6ec/volumes" Oct 02 12:21:44 crc kubenswrapper[4766]: I1002 12:21:44.165971 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qsk9z" event={"ID":"20ea34ce-4dc9-4338-b967-8b43085a24e5","Type":"ContainerStarted","Data":"361cc80e8fc351e0092760f1e57526b19f060f043dc42c4124362a8155e2c2de"} Oct 02 12:21:44 crc kubenswrapper[4766]: I1002 12:21:44.199130 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-qsk9z" podStartSLOduration=2.199104238 podStartE2EDuration="2.199104238s" podCreationTimestamp="2025-10-02 12:21:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:21:44.190154951 +0000 UTC m=+5419.133025895" watchObservedRunningTime="2025-10-02 12:21:44.199104238 +0000 UTC m=+5419.141975182" Oct 02 12:21:44 crc kubenswrapper[4766]: I1002 12:21:44.595745 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" Oct 02 12:21:44 crc kubenswrapper[4766]: I1002 12:21:44.657018 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98cbd5ff7-4qtjg"] Oct 02 12:21:44 crc kubenswrapper[4766]: I1002 12:21:44.657406 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" podUID="1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e" containerName="dnsmasq-dns" containerID="cri-o://963829ec9ae72105303eb74de2dea6160242d4f5c6d2babc8c50942fa2834978" gracePeriod=10 Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.178468 4766 generic.go:334] "Generic (PLEG): container finished" podID="1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e" containerID="963829ec9ae72105303eb74de2dea6160242d4f5c6d2babc8c50942fa2834978" exitCode=0 Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.178620 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" event={"ID":"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e","Type":"ContainerDied","Data":"963829ec9ae72105303eb74de2dea6160242d4f5c6d2babc8c50942fa2834978"} Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.178937 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" event={"ID":"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e","Type":"ContainerDied","Data":"4c73cfef327f763a386ee0bbfc0947c935349546e4ea63548a8f3f70e58bd14d"} Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.178957 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c73cfef327f763a386ee0bbfc0947c935349546e4ea63548a8f3f70e58bd14d" Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.187172 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg"
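
The "Killing container with a grace period" entry above (gracePeriod=10, issued at 12:21:44.657406) and the PLEG "container finished" entry for the same container ID (12:21:45.178468) can be correlated mechanically to see how much of the grace period a container actually used. A sketch under the same assumptions as before (journal exported to text, names illustrative); klog timestamps carry no year, so one is supplied explicitly:

    import re
    from datetime import datetime

    # Timestamp and message are kept adjacent (pid + file:line between them),
    # so the captured time belongs to the matched entry even on wrapped lines.
    KILL = re.compile(
        r'I(?P<ts>\d{4} \d{2}:\d{2}:\d{2}\.\d+) +\d+ \S+\] "Killing container with a grace period"'
        r'.*?containerID="cri-o://(?P<cid>[0-9a-f]+)".*?gracePeriod=(?P<grace>\d+)'
    )
    DIED = re.compile(
        r'I(?P<ts>\d{4} \d{2}:\d{2}:\d{2}\.\d+) +\d+ \S+\] "Generic \(PLEG\): container finished"'
        r'.*?containerID="(?P<cid>[0-9a-f]+)"'
    )

    def klog_time(stamp, year=2025):
        # klog stamps look like "1002 12:21:44.657406" (MMDD HH:MM:SS.ffffff, no year).
        return datetime.strptime(f"{year}{stamp}", "%Y%m%d %H:%M:%S.%f")

    def termination_latencies(path):
        """Map container ID -> (seconds from kill to observed exit, grace period)."""
        pending, done = {}, {}
        with open(path, encoding="utf-8", errors="replace") as f:
            for line in f:
                for m in KILL.finditer(line):
                    pending[m["cid"]] = (klog_time(m["ts"]), int(m["grace"]))
                for m in DIED.finditer(line):
                    if m["cid"] in pending:
                        started, grace = pending.pop(m["cid"])
                        done[m["cid"]] = ((klog_time(m["ts"]) - started).total_seconds(), grace)
        return done

On the pair above this yields about 0.52 s against the 10 s grace period for container 963829ec…, i.e. the old dnsmasq-dns container exited well before a forced kill would have been needed.
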
Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.350469 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-dns-svc\") pod \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.350574 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-ovsdbserver-sb\") pod \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.350762 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prnn4\" (UniqueName: \"kubernetes.io/projected/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-kube-api-access-prnn4\") pod \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.350796 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-config\") pod \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.350818 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-ovsdbserver-nb\") pod \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\" (UID: \"1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e\") " Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.356751 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-kube-api-access-prnn4" (OuterVolumeSpecName: "kube-api-access-prnn4") pod "1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e" (UID: "1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e"). InnerVolumeSpecName "kube-api-access-prnn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.395233 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e" (UID: "1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.396477 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e" (UID: "1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.400878 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-config" (OuterVolumeSpecName: "config") pod "1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e" (UID: "1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e").
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.411522 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e" (UID: "1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.453254 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.453297 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prnn4\" (UniqueName: \"kubernetes.io/projected/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-kube-api-access-prnn4\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.453315 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.453328 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:45 crc kubenswrapper[4766]: I1002 12:21:45.453696 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:46 crc kubenswrapper[4766]: I1002 12:21:46.188200 4766 generic.go:334] "Generic (PLEG): container finished" podID="20ea34ce-4dc9-4338-b967-8b43085a24e5" containerID="361cc80e8fc351e0092760f1e57526b19f060f043dc42c4124362a8155e2c2de" exitCode=0 Oct 02 12:21:46 crc kubenswrapper[4766]: I1002 12:21:46.188291 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qsk9z" event={"ID":"20ea34ce-4dc9-4338-b967-8b43085a24e5","Type":"ContainerDied","Data":"361cc80e8fc351e0092760f1e57526b19f060f043dc42c4124362a8155e2c2de"} Oct 02 12:21:46 crc kubenswrapper[4766]: I1002 12:21:46.188657 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98cbd5ff7-4qtjg" Oct 02 12:21:46 crc kubenswrapper[4766]: I1002 12:21:46.211673 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98cbd5ff7-4qtjg"] Oct 02 12:21:46 crc kubenswrapper[4766]: I1002 12:21:46.217798 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98cbd5ff7-4qtjg"] Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.502298 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.589928 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4glq\" (UniqueName: \"kubernetes.io/projected/20ea34ce-4dc9-4338-b967-8b43085a24e5-kube-api-access-v4glq\") pod \"20ea34ce-4dc9-4338-b967-8b43085a24e5\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.590083 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-fernet-keys\") pod \"20ea34ce-4dc9-4338-b967-8b43085a24e5\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.590154 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-config-data\") pod \"20ea34ce-4dc9-4338-b967-8b43085a24e5\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.590181 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-credential-keys\") pod \"20ea34ce-4dc9-4338-b967-8b43085a24e5\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.590220 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-combined-ca-bundle\") pod \"20ea34ce-4dc9-4338-b967-8b43085a24e5\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.590298 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-scripts\") pod \"20ea34ce-4dc9-4338-b967-8b43085a24e5\" (UID: \"20ea34ce-4dc9-4338-b967-8b43085a24e5\") " Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.598163 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "20ea34ce-4dc9-4338-b967-8b43085a24e5" (UID: "20ea34ce-4dc9-4338-b967-8b43085a24e5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.598275 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ea34ce-4dc9-4338-b967-8b43085a24e5-kube-api-access-v4glq" (OuterVolumeSpecName: "kube-api-access-v4glq") pod "20ea34ce-4dc9-4338-b967-8b43085a24e5" (UID: "20ea34ce-4dc9-4338-b967-8b43085a24e5"). InnerVolumeSpecName "kube-api-access-v4glq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.602000 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "20ea34ce-4dc9-4338-b967-8b43085a24e5" (UID: "20ea34ce-4dc9-4338-b967-8b43085a24e5"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.606664 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-scripts" (OuterVolumeSpecName: "scripts") pod "20ea34ce-4dc9-4338-b967-8b43085a24e5" (UID: "20ea34ce-4dc9-4338-b967-8b43085a24e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.621457 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20ea34ce-4dc9-4338-b967-8b43085a24e5" (UID: "20ea34ce-4dc9-4338-b967-8b43085a24e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.626153 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-config-data" (OuterVolumeSpecName: "config-data") pod "20ea34ce-4dc9-4338-b967-8b43085a24e5" (UID: "20ea34ce-4dc9-4338-b967-8b43085a24e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.692778 4766 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.692832 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.692847 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.692857 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4glq\" (UniqueName: \"kubernetes.io/projected/20ea34ce-4dc9-4338-b967-8b43085a24e5-kube-api-access-v4glq\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.692867 4766 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.692875 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20ea34ce-4dc9-4338-b967-8b43085a24e5-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:21:47 crc kubenswrapper[4766]: I1002 12:21:47.898137 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e" path="/var/lib/kubelet/pods/1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e/volumes" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.207166 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-qsk9z" event={"ID":"20ea34ce-4dc9-4338-b967-8b43085a24e5","Type":"ContainerDied","Data":"35a11445478782a971a1f27007225076ba8feb8a651e420fabb91047c9af4bd7"} Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.207221 
4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-qsk9z" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.207249 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35a11445478782a971a1f27007225076ba8feb8a651e420fabb91047c9af4bd7" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.324378 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5f56bd4789-wt4gd"] Oct 02 12:21:48 crc kubenswrapper[4766]: E1002 12:21:48.324980 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e" containerName="init" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.325007 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e" containerName="init" Oct 02 12:21:48 crc kubenswrapper[4766]: E1002 12:21:48.325033 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ea34ce-4dc9-4338-b967-8b43085a24e5" containerName="keystone-bootstrap" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.325042 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ea34ce-4dc9-4338-b967-8b43085a24e5" containerName="keystone-bootstrap" Oct 02 12:21:48 crc kubenswrapper[4766]: E1002 12:21:48.325056 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e" containerName="dnsmasq-dns" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.325066 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e" containerName="dnsmasq-dns" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.329770 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e81fcf2-3d4d-4bfb-8fe6-76c3c26bb69e" containerName="dnsmasq-dns" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.329815 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ea34ce-4dc9-4338-b967-8b43085a24e5" containerName="keystone-bootstrap" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.330673 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.335242 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.335482 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.335661 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zl29b" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.336332 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.345030 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5f56bd4789-wt4gd"] Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.405491 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-config-data\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.405566 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-scripts\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.405626 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcp4x\" (UniqueName: \"kubernetes.io/projected/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-kube-api-access-kcp4x\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.405869 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-credential-keys\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.405987 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-combined-ca-bundle\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.406018 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-fernet-keys\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.508576 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-credential-keys\") pod 
\"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.509931 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-combined-ca-bundle\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.509982 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-fernet-keys\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.510139 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-config-data\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.510189 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-scripts\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.510306 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcp4x\" (UniqueName: \"kubernetes.io/projected/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-kube-api-access-kcp4x\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.514236 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-credential-keys\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.515151 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-combined-ca-bundle\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.516168 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-config-data\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.525000 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-scripts\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 
12:21:48.527647 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-fernet-keys\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.532633 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcp4x\" (UniqueName: \"kubernetes.io/projected/d6827aee-b0ec-4d7a-a38c-31cb39c3679d-kube-api-access-kcp4x\") pod \"keystone-5f56bd4789-wt4gd\" (UID: \"d6827aee-b0ec-4d7a-a38c-31cb39c3679d\") " pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:48 crc kubenswrapper[4766]: I1002 12:21:48.665391 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:49 crc kubenswrapper[4766]: I1002 12:21:49.174490 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5f56bd4789-wt4gd"] Oct 02 12:21:49 crc kubenswrapper[4766]: W1002 12:21:49.181958 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6827aee_b0ec_4d7a_a38c_31cb39c3679d.slice/crio-27cc64f7aebed4578be1de79a6e66f2c51aa03b5dac90992b0cc74f6fdd0b2c5 WatchSource:0}: Error finding container 27cc64f7aebed4578be1de79a6e66f2c51aa03b5dac90992b0cc74f6fdd0b2c5: Status 404 returned error can't find the container with id 27cc64f7aebed4578be1de79a6e66f2c51aa03b5dac90992b0cc74f6fdd0b2c5 Oct 02 12:21:49 crc kubenswrapper[4766]: I1002 12:21:49.217722 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f56bd4789-wt4gd" event={"ID":"d6827aee-b0ec-4d7a-a38c-31cb39c3679d","Type":"ContainerStarted","Data":"27cc64f7aebed4578be1de79a6e66f2c51aa03b5dac90992b0cc74f6fdd0b2c5"} Oct 02 12:21:50 crc kubenswrapper[4766]: I1002 12:21:50.229460 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5f56bd4789-wt4gd" event={"ID":"d6827aee-b0ec-4d7a-a38c-31cb39c3679d","Type":"ContainerStarted","Data":"446a5a866c206247dbebac2620b9642dacc5b07dde8570de44ff6ea7f02ed764"} Oct 02 12:21:50 crc kubenswrapper[4766]: I1002 12:21:50.231733 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:21:50 crc kubenswrapper[4766]: I1002 12:21:50.257470 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5f56bd4789-wt4gd" podStartSLOduration=2.257441076 podStartE2EDuration="2.257441076s" podCreationTimestamp="2025-10-02 12:21:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:21:50.249268764 +0000 UTC m=+5425.192139718" watchObservedRunningTime="2025-10-02 12:21:50.257441076 +0000 UTC m=+5425.200312020" Oct 02 12:21:54 crc kubenswrapper[4766]: I1002 12:21:54.432149 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:21:54 crc kubenswrapper[4766]: I1002 12:21:54.432677 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
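
Probe failures like the "Probe failed" Liveness entry above can be tallied per pod from the same exported journal; a short sketch, under the same assumptions as the earlier ones:

    import re
    from collections import Counter

    # Matches: "Probe failed" probeType="<Liveness|Readiness|Startup>" pod="<ns>/<name>"
    PROBE_FAILED = re.compile(r'"Probe failed" probeType="(?P<kind>\w+)" pod="(?P<pod>[^"]+)"')

    def probe_failure_counts(path):
        """Count probe failures keyed by (probe type, pod)."""
        counts = Counter()
        with open(path, encoding="utf-8", errors="replace") as f:
            for line in f:
                for m in PROBE_FAILED.finditer(line):
                    counts[m["kind"], m["pod"]] += 1
        return counts

On this excerpt it would return a single entry, ('Liveness', 'openshift-machine-config-operator/machine-config-daemon-l99lx') with count 1, matching the connection-refused output logged by patch_prober just before it.
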
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:21:58 crc kubenswrapper[4766]: I1002 12:21:58.509928 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-58tgc"] Oct 02 12:21:58 crc kubenswrapper[4766]: I1002 12:21:58.512403 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:21:58 crc kubenswrapper[4766]: I1002 12:21:58.519106 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-58tgc"] Oct 02 12:21:58 crc kubenswrapper[4766]: I1002 12:21:58.601586 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c8e6cf-ff8f-4028-a754-0852cc37109d-catalog-content\") pod \"community-operators-58tgc\" (UID: \"36c8e6cf-ff8f-4028-a754-0852cc37109d\") " pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:21:58 crc kubenswrapper[4766]: I1002 12:21:58.601665 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c8e6cf-ff8f-4028-a754-0852cc37109d-utilities\") pod \"community-operators-58tgc\" (UID: \"36c8e6cf-ff8f-4028-a754-0852cc37109d\") " pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:21:58 crc kubenswrapper[4766]: I1002 12:21:58.601689 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5ksp\" (UniqueName: \"kubernetes.io/projected/36c8e6cf-ff8f-4028-a754-0852cc37109d-kube-api-access-b5ksp\") pod \"community-operators-58tgc\" (UID: \"36c8e6cf-ff8f-4028-a754-0852cc37109d\") " pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:21:58 crc kubenswrapper[4766]: I1002 12:21:58.703130 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c8e6cf-ff8f-4028-a754-0852cc37109d-utilities\") pod \"community-operators-58tgc\" (UID: \"36c8e6cf-ff8f-4028-a754-0852cc37109d\") " pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:21:58 crc kubenswrapper[4766]: I1002 12:21:58.703189 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5ksp\" (UniqueName: \"kubernetes.io/projected/36c8e6cf-ff8f-4028-a754-0852cc37109d-kube-api-access-b5ksp\") pod \"community-operators-58tgc\" (UID: \"36c8e6cf-ff8f-4028-a754-0852cc37109d\") " pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:21:58 crc kubenswrapper[4766]: I1002 12:21:58.703299 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c8e6cf-ff8f-4028-a754-0852cc37109d-catalog-content\") pod \"community-operators-58tgc\" (UID: \"36c8e6cf-ff8f-4028-a754-0852cc37109d\") " pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:21:58 crc kubenswrapper[4766]: I1002 12:21:58.703759 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c8e6cf-ff8f-4028-a754-0852cc37109d-utilities\") pod \"community-operators-58tgc\" (UID: \"36c8e6cf-ff8f-4028-a754-0852cc37109d\") " pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:21:58 crc kubenswrapper[4766]: 
I1002 12:21:58.703806 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c8e6cf-ff8f-4028-a754-0852cc37109d-catalog-content\") pod \"community-operators-58tgc\" (UID: \"36c8e6cf-ff8f-4028-a754-0852cc37109d\") " pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:21:58 crc kubenswrapper[4766]: I1002 12:21:58.724230 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5ksp\" (UniqueName: \"kubernetes.io/projected/36c8e6cf-ff8f-4028-a754-0852cc37109d-kube-api-access-b5ksp\") pod \"community-operators-58tgc\" (UID: \"36c8e6cf-ff8f-4028-a754-0852cc37109d\") " pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:21:58 crc kubenswrapper[4766]: I1002 12:21:58.876738 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:21:59 crc kubenswrapper[4766]: I1002 12:21:59.360418 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-58tgc"] Oct 02 12:21:59 crc kubenswrapper[4766]: W1002 12:21:59.370744 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36c8e6cf_ff8f_4028_a754_0852cc37109d.slice/crio-4f52e8d07ece0b1e98d3a638dd0454c8d3ebd80b5202d9f981cf6453da02ecb2 WatchSource:0}: Error finding container 4f52e8d07ece0b1e98d3a638dd0454c8d3ebd80b5202d9f981cf6453da02ecb2: Status 404 returned error can't find the container with id 4f52e8d07ece0b1e98d3a638dd0454c8d3ebd80b5202d9f981cf6453da02ecb2 Oct 02 12:22:00 crc kubenswrapper[4766]: I1002 12:22:00.314414 4766 generic.go:334] "Generic (PLEG): container finished" podID="36c8e6cf-ff8f-4028-a754-0852cc37109d" containerID="8f1bf13d2285fd52e481a92f629e99b6e54210d8dfa78606bf4cd062ac7d88d8" exitCode=0 Oct 02 12:22:00 crc kubenswrapper[4766]: I1002 12:22:00.314562 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58tgc" event={"ID":"36c8e6cf-ff8f-4028-a754-0852cc37109d","Type":"ContainerDied","Data":"8f1bf13d2285fd52e481a92f629e99b6e54210d8dfa78606bf4cd062ac7d88d8"} Oct 02 12:22:00 crc kubenswrapper[4766]: I1002 12:22:00.314747 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58tgc" event={"ID":"36c8e6cf-ff8f-4028-a754-0852cc37109d","Type":"ContainerStarted","Data":"4f52e8d07ece0b1e98d3a638dd0454c8d3ebd80b5202d9f981cf6453da02ecb2"} Oct 02 12:22:00 crc kubenswrapper[4766]: I1002 12:22:00.316654 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:22:01 crc kubenswrapper[4766]: I1002 12:22:01.327260 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58tgc" event={"ID":"36c8e6cf-ff8f-4028-a754-0852cc37109d","Type":"ContainerStarted","Data":"851d271d3b577d2cd05e809b22f883abbdd3a463bfb55a2ccaf867cb76580e8e"} Oct 02 12:22:02 crc kubenswrapper[4766]: I1002 12:22:02.339341 4766 generic.go:334] "Generic (PLEG): container finished" podID="36c8e6cf-ff8f-4028-a754-0852cc37109d" containerID="851d271d3b577d2cd05e809b22f883abbdd3a463bfb55a2ccaf867cb76580e8e" exitCode=0 Oct 02 12:22:02 crc kubenswrapper[4766]: I1002 12:22:02.339479 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58tgc" 
event={"ID":"36c8e6cf-ff8f-4028-a754-0852cc37109d","Type":"ContainerDied","Data":"851d271d3b577d2cd05e809b22f883abbdd3a463bfb55a2ccaf867cb76580e8e"} Oct 02 12:22:03 crc kubenswrapper[4766]: I1002 12:22:03.353013 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58tgc" event={"ID":"36c8e6cf-ff8f-4028-a754-0852cc37109d","Type":"ContainerStarted","Data":"e5aee622939845c48fb4972ca00ea09e0d15d693759974e33fe4b8b9f83162cb"} Oct 02 12:22:03 crc kubenswrapper[4766]: I1002 12:22:03.372671 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-58tgc" podStartSLOduration=2.7774474319999998 podStartE2EDuration="5.372652102s" podCreationTimestamp="2025-10-02 12:21:58 +0000 UTC" firstStartedPulling="2025-10-02 12:22:00.316352269 +0000 UTC m=+5435.259223203" lastFinishedPulling="2025-10-02 12:22:02.911556929 +0000 UTC m=+5437.854427873" observedRunningTime="2025-10-02 12:22:03.37103458 +0000 UTC m=+5438.313905524" watchObservedRunningTime="2025-10-02 12:22:03.372652102 +0000 UTC m=+5438.315523046" Oct 02 12:22:08 crc kubenswrapper[4766]: I1002 12:22:08.878399 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:22:08 crc kubenswrapper[4766]: I1002 12:22:08.878819 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:22:08 crc kubenswrapper[4766]: I1002 12:22:08.929914 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:22:09 crc kubenswrapper[4766]: I1002 12:22:09.457837 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:22:09 crc kubenswrapper[4766]: I1002 12:22:09.507906 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-58tgc"] Oct 02 12:22:11 crc kubenswrapper[4766]: I1002 12:22:11.434932 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-58tgc" podUID="36c8e6cf-ff8f-4028-a754-0852cc37109d" containerName="registry-server" containerID="cri-o://e5aee622939845c48fb4972ca00ea09e0d15d693759974e33fe4b8b9f83162cb" gracePeriod=2 Oct 02 12:22:12 crc kubenswrapper[4766]: I1002 12:22:12.444436 4766 generic.go:334] "Generic (PLEG): container finished" podID="36c8e6cf-ff8f-4028-a754-0852cc37109d" containerID="e5aee622939845c48fb4972ca00ea09e0d15d693759974e33fe4b8b9f83162cb" exitCode=0 Oct 02 12:22:12 crc kubenswrapper[4766]: I1002 12:22:12.444513 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58tgc" event={"ID":"36c8e6cf-ff8f-4028-a754-0852cc37109d","Type":"ContainerDied","Data":"e5aee622939845c48fb4972ca00ea09e0d15d693759974e33fe4b8b9f83162cb"} Oct 02 12:22:12 crc kubenswrapper[4766]: I1002 12:22:12.444829 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-58tgc" event={"ID":"36c8e6cf-ff8f-4028-a754-0852cc37109d","Type":"ContainerDied","Data":"4f52e8d07ece0b1e98d3a638dd0454c8d3ebd80b5202d9f981cf6453da02ecb2"} Oct 02 12:22:12 crc kubenswrapper[4766]: I1002 12:22:12.444851 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f52e8d07ece0b1e98d3a638dd0454c8d3ebd80b5202d9f981cf6453da02ecb2" Oct 02 12:22:12 crc 
kubenswrapper[4766]: I1002 12:22:12.454275 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:22:12 crc kubenswrapper[4766]: I1002 12:22:12.651714 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5ksp\" (UniqueName: \"kubernetes.io/projected/36c8e6cf-ff8f-4028-a754-0852cc37109d-kube-api-access-b5ksp\") pod \"36c8e6cf-ff8f-4028-a754-0852cc37109d\" (UID: \"36c8e6cf-ff8f-4028-a754-0852cc37109d\") " Oct 02 12:22:12 crc kubenswrapper[4766]: I1002 12:22:12.651945 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c8e6cf-ff8f-4028-a754-0852cc37109d-utilities\") pod \"36c8e6cf-ff8f-4028-a754-0852cc37109d\" (UID: \"36c8e6cf-ff8f-4028-a754-0852cc37109d\") " Oct 02 12:22:12 crc kubenswrapper[4766]: I1002 12:22:12.651995 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c8e6cf-ff8f-4028-a754-0852cc37109d-catalog-content\") pod \"36c8e6cf-ff8f-4028-a754-0852cc37109d\" (UID: \"36c8e6cf-ff8f-4028-a754-0852cc37109d\") " Oct 02 12:22:12 crc kubenswrapper[4766]: I1002 12:22:12.652966 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36c8e6cf-ff8f-4028-a754-0852cc37109d-utilities" (OuterVolumeSpecName: "utilities") pod "36c8e6cf-ff8f-4028-a754-0852cc37109d" (UID: "36c8e6cf-ff8f-4028-a754-0852cc37109d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:22:12 crc kubenswrapper[4766]: I1002 12:22:12.666631 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c8e6cf-ff8f-4028-a754-0852cc37109d-kube-api-access-b5ksp" (OuterVolumeSpecName: "kube-api-access-b5ksp") pod "36c8e6cf-ff8f-4028-a754-0852cc37109d" (UID: "36c8e6cf-ff8f-4028-a754-0852cc37109d"). InnerVolumeSpecName "kube-api-access-b5ksp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:22:12 crc kubenswrapper[4766]: I1002 12:22:12.703745 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36c8e6cf-ff8f-4028-a754-0852cc37109d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36c8e6cf-ff8f-4028-a754-0852cc37109d" (UID: "36c8e6cf-ff8f-4028-a754-0852cc37109d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:22:12 crc kubenswrapper[4766]: I1002 12:22:12.754117 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36c8e6cf-ff8f-4028-a754-0852cc37109d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:22:12 crc kubenswrapper[4766]: I1002 12:22:12.754168 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5ksp\" (UniqueName: \"kubernetes.io/projected/36c8e6cf-ff8f-4028-a754-0852cc37109d-kube-api-access-b5ksp\") on node \"crc\" DevicePath \"\"" Oct 02 12:22:12 crc kubenswrapper[4766]: I1002 12:22:12.754180 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36c8e6cf-ff8f-4028-a754-0852cc37109d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:22:13 crc kubenswrapper[4766]: I1002 12:22:13.453963 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-58tgc" Oct 02 12:22:13 crc kubenswrapper[4766]: I1002 12:22:13.489098 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-58tgc"] Oct 02 12:22:13 crc kubenswrapper[4766]: I1002 12:22:13.502562 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-58tgc"] Oct 02 12:22:13 crc kubenswrapper[4766]: I1002 12:22:13.891128 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c8e6cf-ff8f-4028-a754-0852cc37109d" path="/var/lib/kubelet/pods/36c8e6cf-ff8f-4028-a754-0852cc37109d/volumes" Oct 02 12:22:20 crc kubenswrapper[4766]: I1002 12:22:20.197901 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5f56bd4789-wt4gd" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.189295 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 02 12:22:23 crc kubenswrapper[4766]: E1002 12:22:23.190274 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c8e6cf-ff8f-4028-a754-0852cc37109d" containerName="extract-utilities" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.191715 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c8e6cf-ff8f-4028-a754-0852cc37109d" containerName="extract-utilities" Oct 02 12:22:23 crc kubenswrapper[4766]: E1002 12:22:23.191849 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c8e6cf-ff8f-4028-a754-0852cc37109d" containerName="extract-content" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.191865 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c8e6cf-ff8f-4028-a754-0852cc37109d" containerName="extract-content" Oct 02 12:22:23 crc kubenswrapper[4766]: E1002 12:22:23.191897 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c8e6cf-ff8f-4028-a754-0852cc37109d" containerName="registry-server" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.191905 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c8e6cf-ff8f-4028-a754-0852cc37109d" containerName="registry-server" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.192380 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c8e6cf-ff8f-4028-a754-0852cc37109d" containerName="registry-server" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.193285 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.195702 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.196689 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.196989 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-tl9vt" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.197410 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.342700 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/34f0b55d-1a54-413b-8131-71b5816277c4-openstack-config\") pod \"openstackclient\" (UID: \"34f0b55d-1a54-413b-8131-71b5816277c4\") " pod="openstack/openstackclient" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.343068 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/34f0b55d-1a54-413b-8131-71b5816277c4-openstack-config-secret\") pod \"openstackclient\" (UID: \"34f0b55d-1a54-413b-8131-71b5816277c4\") " pod="openstack/openstackclient" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.343175 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v4ld\" (UniqueName: \"kubernetes.io/projected/34f0b55d-1a54-413b-8131-71b5816277c4-kube-api-access-8v4ld\") pod \"openstackclient\" (UID: \"34f0b55d-1a54-413b-8131-71b5816277c4\") " pod="openstack/openstackclient" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.444407 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/34f0b55d-1a54-413b-8131-71b5816277c4-openstack-config-secret\") pod \"openstackclient\" (UID: \"34f0b55d-1a54-413b-8131-71b5816277c4\") " pod="openstack/openstackclient" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.444471 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v4ld\" (UniqueName: \"kubernetes.io/projected/34f0b55d-1a54-413b-8131-71b5816277c4-kube-api-access-8v4ld\") pod \"openstackclient\" (UID: \"34f0b55d-1a54-413b-8131-71b5816277c4\") " pod="openstack/openstackclient" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.444535 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/34f0b55d-1a54-413b-8131-71b5816277c4-openstack-config\") pod \"openstackclient\" (UID: \"34f0b55d-1a54-413b-8131-71b5816277c4\") " pod="openstack/openstackclient" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.445471 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/34f0b55d-1a54-413b-8131-71b5816277c4-openstack-config\") pod \"openstackclient\" (UID: \"34f0b55d-1a54-413b-8131-71b5816277c4\") " pod="openstack/openstackclient" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.451009 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/34f0b55d-1a54-413b-8131-71b5816277c4-openstack-config-secret\") pod \"openstackclient\" (UID: \"34f0b55d-1a54-413b-8131-71b5816277c4\") " pod="openstack/openstackclient" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.469083 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v4ld\" (UniqueName: \"kubernetes.io/projected/34f0b55d-1a54-413b-8131-71b5816277c4-kube-api-access-8v4ld\") pod \"openstackclient\" (UID: \"34f0b55d-1a54-413b-8131-71b5816277c4\") " pod="openstack/openstackclient" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.524163 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 12:22:23 crc kubenswrapper[4766]: I1002 12:22:23.984253 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 12:22:24 crc kubenswrapper[4766]: I1002 12:22:24.431914 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:22:24 crc kubenswrapper[4766]: I1002 12:22:24.432319 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:22:24 crc kubenswrapper[4766]: I1002 12:22:24.543039 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"34f0b55d-1a54-413b-8131-71b5816277c4","Type":"ContainerStarted","Data":"a1a03a1fcf20e0b259fd245736580119ca7cb59fac730795381c654f08dfdbca"} Oct 02 12:22:24 crc kubenswrapper[4766]: I1002 12:22:24.543109 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"34f0b55d-1a54-413b-8131-71b5816277c4","Type":"ContainerStarted","Data":"5f092805a9d179564d3c0274530667768f86e383c4800b92b41acad31dbdf96d"} Oct 02 12:22:24 crc kubenswrapper[4766]: I1002 12:22:24.560784 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.560762076 podStartE2EDuration="1.560762076s" podCreationTimestamp="2025-10-02 12:22:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:22:24.558126402 +0000 UTC m=+5459.500997366" watchObservedRunningTime="2025-10-02 12:22:24.560762076 +0000 UTC m=+5459.503633020" Oct 02 12:22:35 crc kubenswrapper[4766]: I1002 12:22:35.328147 4766 scope.go:117] "RemoveContainer" containerID="257bf96cd249f58dc74ae8c124eab422fd626a996ce99eb8aa4a4f43449af723" Oct 02 12:22:35 crc kubenswrapper[4766]: I1002 12:22:35.357460 4766 scope.go:117] "RemoveContainer" containerID="a5d2a45b808b06fbd3a0d9821d2c5bb16074dade85d55825f64a36ee300bc34a" Oct 02 12:22:35 crc kubenswrapper[4766]: I1002 12:22:35.402624 4766 scope.go:117] "RemoveContainer" containerID="28937aa03a29015a781675cd13c20a1db60c6f2617adf9efb5672ba9c794cbcb" Oct 02 12:22:35 crc kubenswrapper[4766]: I1002 12:22:35.436352 4766 scope.go:117] "RemoveContainer" containerID="7b5729f02b6a5a60fb4aead817aba78fa7ff1174d49db85abafb46f886f856eb" Oct 
02 12:22:54 crc kubenswrapper[4766]: I1002 12:22:54.432064 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:22:54 crc kubenswrapper[4766]: I1002 12:22:54.432726 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:22:54 crc kubenswrapper[4766]: I1002 12:22:54.432776 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 12:22:54 crc kubenswrapper[4766]: I1002 12:22:54.433388 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7be1eb90dbe4a2beb104498de2e75466d69e548c7f34b0f0f5bbe74fe4681dd4"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:22:54 crc kubenswrapper[4766]: I1002 12:22:54.433445 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://7be1eb90dbe4a2beb104498de2e75466d69e548c7f34b0f0f5bbe74fe4681dd4" gracePeriod=600 Oct 02 12:22:54 crc kubenswrapper[4766]: I1002 12:22:54.826283 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="7be1eb90dbe4a2beb104498de2e75466d69e548c7f34b0f0f5bbe74fe4681dd4" exitCode=0 Oct 02 12:22:54 crc kubenswrapper[4766]: I1002 12:22:54.826364 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"7be1eb90dbe4a2beb104498de2e75466d69e548c7f34b0f0f5bbe74fe4681dd4"} Oct 02 12:22:54 crc kubenswrapper[4766]: I1002 12:22:54.826793 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984"} Oct 02 12:22:54 crc kubenswrapper[4766]: I1002 12:22:54.826825 4766 scope.go:117] "RemoveContainer" containerID="87a9f167d4ef11245d11d4b5371d1728d24d28f8839e0811aa60cdd393d62465" Oct 02 12:24:12 crc kubenswrapper[4766]: I1002 12:24:12.766970 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-9sl5k"] Oct 02 12:24:12 crc kubenswrapper[4766]: I1002 12:24:12.768891 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-9sl5k" Oct 02 12:24:12 crc kubenswrapper[4766]: I1002 12:24:12.784015 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-9sl5k"] Oct 02 12:24:12 crc kubenswrapper[4766]: I1002 12:24:12.867016 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5q4j\" (UniqueName: \"kubernetes.io/projected/fcf42f52-167c-4156-81a6-931d97a25017-kube-api-access-r5q4j\") pod \"barbican-db-create-9sl5k\" (UID: \"fcf42f52-167c-4156-81a6-931d97a25017\") " pod="openstack/barbican-db-create-9sl5k" Oct 02 12:24:12 crc kubenswrapper[4766]: I1002 12:24:12.969545 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5q4j\" (UniqueName: \"kubernetes.io/projected/fcf42f52-167c-4156-81a6-931d97a25017-kube-api-access-r5q4j\") pod \"barbican-db-create-9sl5k\" (UID: \"fcf42f52-167c-4156-81a6-931d97a25017\") " pod="openstack/barbican-db-create-9sl5k" Oct 02 12:24:12 crc kubenswrapper[4766]: I1002 12:24:12.995276 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5q4j\" (UniqueName: \"kubernetes.io/projected/fcf42f52-167c-4156-81a6-931d97a25017-kube-api-access-r5q4j\") pod \"barbican-db-create-9sl5k\" (UID: \"fcf42f52-167c-4156-81a6-931d97a25017\") " pod="openstack/barbican-db-create-9sl5k" Oct 02 12:24:13 crc kubenswrapper[4766]: I1002 12:24:13.090876 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9sl5k" Oct 02 12:24:13 crc kubenswrapper[4766]: I1002 12:24:13.533719 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-9sl5k"] Oct 02 12:24:13 crc kubenswrapper[4766]: I1002 12:24:13.612127 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9sl5k" event={"ID":"fcf42f52-167c-4156-81a6-931d97a25017","Type":"ContainerStarted","Data":"1f66485e775aee76c33f36e676be823d031c4d9a2dba9e5021f8953144226964"} Oct 02 12:24:14 crc kubenswrapper[4766]: I1002 12:24:14.623108 4766 generic.go:334] "Generic (PLEG): container finished" podID="fcf42f52-167c-4156-81a6-931d97a25017" containerID="c62bec45e41c683cb15e79bf201131637cef78af02e507b6817e786cb4106d29" exitCode=0 Oct 02 12:24:14 crc kubenswrapper[4766]: I1002 12:24:14.623253 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9sl5k" event={"ID":"fcf42f52-167c-4156-81a6-931d97a25017","Type":"ContainerDied","Data":"c62bec45e41c683cb15e79bf201131637cef78af02e507b6817e786cb4106d29"} Oct 02 12:24:15 crc kubenswrapper[4766]: I1002 12:24:15.936275 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9sl5k" Oct 02 12:24:16 crc kubenswrapper[4766]: I1002 12:24:16.022100 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5q4j\" (UniqueName: \"kubernetes.io/projected/fcf42f52-167c-4156-81a6-931d97a25017-kube-api-access-r5q4j\") pod \"fcf42f52-167c-4156-81a6-931d97a25017\" (UID: \"fcf42f52-167c-4156-81a6-931d97a25017\") " Oct 02 12:24:16 crc kubenswrapper[4766]: I1002 12:24:16.028361 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcf42f52-167c-4156-81a6-931d97a25017-kube-api-access-r5q4j" (OuterVolumeSpecName: "kube-api-access-r5q4j") pod "fcf42f52-167c-4156-81a6-931d97a25017" (UID: "fcf42f52-167c-4156-81a6-931d97a25017"). 
InnerVolumeSpecName "kube-api-access-r5q4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:24:16 crc kubenswrapper[4766]: I1002 12:24:16.124733 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5q4j\" (UniqueName: \"kubernetes.io/projected/fcf42f52-167c-4156-81a6-931d97a25017-kube-api-access-r5q4j\") on node \"crc\" DevicePath \"\"" Oct 02 12:24:16 crc kubenswrapper[4766]: I1002 12:24:16.643269 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-9sl5k" event={"ID":"fcf42f52-167c-4156-81a6-931d97a25017","Type":"ContainerDied","Data":"1f66485e775aee76c33f36e676be823d031c4d9a2dba9e5021f8953144226964"} Oct 02 12:24:16 crc kubenswrapper[4766]: I1002 12:24:16.643341 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f66485e775aee76c33f36e676be823d031c4d9a2dba9e5021f8953144226964" Oct 02 12:24:16 crc kubenswrapper[4766]: I1002 12:24:16.643475 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-9sl5k" Oct 02 12:24:22 crc kubenswrapper[4766]: I1002 12:24:22.762301 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-7a42-account-create-m774k"] Oct 02 12:24:22 crc kubenswrapper[4766]: E1002 12:24:22.763164 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf42f52-167c-4156-81a6-931d97a25017" containerName="mariadb-database-create" Oct 02 12:24:22 crc kubenswrapper[4766]: I1002 12:24:22.763180 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf42f52-167c-4156-81a6-931d97a25017" containerName="mariadb-database-create" Oct 02 12:24:22 crc kubenswrapper[4766]: I1002 12:24:22.763336 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcf42f52-167c-4156-81a6-931d97a25017" containerName="mariadb-database-create" Oct 02 12:24:22 crc kubenswrapper[4766]: I1002 12:24:22.764034 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-7a42-account-create-m774k" Oct 02 12:24:22 crc kubenswrapper[4766]: I1002 12:24:22.766106 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 02 12:24:22 crc kubenswrapper[4766]: I1002 12:24:22.778878 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-7a42-account-create-m774k"] Oct 02 12:24:22 crc kubenswrapper[4766]: I1002 12:24:22.839798 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lvlf\" (UniqueName: \"kubernetes.io/projected/723650c5-e986-47e1-adc6-bf11bb38b84c-kube-api-access-4lvlf\") pod \"barbican-7a42-account-create-m774k\" (UID: \"723650c5-e986-47e1-adc6-bf11bb38b84c\") " pod="openstack/barbican-7a42-account-create-m774k" Oct 02 12:24:22 crc kubenswrapper[4766]: I1002 12:24:22.941906 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lvlf\" (UniqueName: \"kubernetes.io/projected/723650c5-e986-47e1-adc6-bf11bb38b84c-kube-api-access-4lvlf\") pod \"barbican-7a42-account-create-m774k\" (UID: \"723650c5-e986-47e1-adc6-bf11bb38b84c\") " pod="openstack/barbican-7a42-account-create-m774k" Oct 02 12:24:22 crc kubenswrapper[4766]: I1002 12:24:22.960428 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lvlf\" (UniqueName: \"kubernetes.io/projected/723650c5-e986-47e1-adc6-bf11bb38b84c-kube-api-access-4lvlf\") pod \"barbican-7a42-account-create-m774k\" (UID: \"723650c5-e986-47e1-adc6-bf11bb38b84c\") " pod="openstack/barbican-7a42-account-create-m774k" Oct 02 12:24:23 crc kubenswrapper[4766]: I1002 12:24:23.080552 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7a42-account-create-m774k" Oct 02 12:24:23 crc kubenswrapper[4766]: I1002 12:24:23.535895 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-7a42-account-create-m774k"] Oct 02 12:24:23 crc kubenswrapper[4766]: I1002 12:24:23.699903 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7a42-account-create-m774k" event={"ID":"723650c5-e986-47e1-adc6-bf11bb38b84c","Type":"ContainerStarted","Data":"952b0b32193a830b4c2cc8c2d3b8b03e77229520f006dc67c57eae7b0c6468f1"} Oct 02 12:24:24 crc kubenswrapper[4766]: I1002 12:24:24.709375 4766 generic.go:334] "Generic (PLEG): container finished" podID="723650c5-e986-47e1-adc6-bf11bb38b84c" containerID="73f38845f3f0c4f3c2a10872d1205ad9ce3d06f4bb05ffeb25129dcc07cc73d7" exitCode=0 Oct 02 12:24:24 crc kubenswrapper[4766]: I1002 12:24:24.709472 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7a42-account-create-m774k" event={"ID":"723650c5-e986-47e1-adc6-bf11bb38b84c","Type":"ContainerDied","Data":"73f38845f3f0c4f3c2a10872d1205ad9ce3d06f4bb05ffeb25129dcc07cc73d7"} Oct 02 12:24:26 crc kubenswrapper[4766]: I1002 12:24:26.038962 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-7a42-account-create-m774k" Oct 02 12:24:26 crc kubenswrapper[4766]: I1002 12:24:26.091137 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lvlf\" (UniqueName: \"kubernetes.io/projected/723650c5-e986-47e1-adc6-bf11bb38b84c-kube-api-access-4lvlf\") pod \"723650c5-e986-47e1-adc6-bf11bb38b84c\" (UID: \"723650c5-e986-47e1-adc6-bf11bb38b84c\") " Oct 02 12:24:26 crc kubenswrapper[4766]: I1002 12:24:26.096526 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/723650c5-e986-47e1-adc6-bf11bb38b84c-kube-api-access-4lvlf" (OuterVolumeSpecName: "kube-api-access-4lvlf") pod "723650c5-e986-47e1-adc6-bf11bb38b84c" (UID: "723650c5-e986-47e1-adc6-bf11bb38b84c"). InnerVolumeSpecName "kube-api-access-4lvlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:24:26 crc kubenswrapper[4766]: I1002 12:24:26.192778 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lvlf\" (UniqueName: \"kubernetes.io/projected/723650c5-e986-47e1-adc6-bf11bb38b84c-kube-api-access-4lvlf\") on node \"crc\" DevicePath \"\"" Oct 02 12:24:26 crc kubenswrapper[4766]: I1002 12:24:26.725147 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-7a42-account-create-m774k" event={"ID":"723650c5-e986-47e1-adc6-bf11bb38b84c","Type":"ContainerDied","Data":"952b0b32193a830b4c2cc8c2d3b8b03e77229520f006dc67c57eae7b0c6468f1"} Oct 02 12:24:26 crc kubenswrapper[4766]: I1002 12:24:26.725636 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="952b0b32193a830b4c2cc8c2d3b8b03e77229520f006dc67c57eae7b0c6468f1" Oct 02 12:24:26 crc kubenswrapper[4766]: I1002 12:24:26.725253 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-7a42-account-create-m774k" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.063246 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-wq959"] Oct 02 12:24:28 crc kubenswrapper[4766]: E1002 12:24:28.063664 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723650c5-e986-47e1-adc6-bf11bb38b84c" containerName="mariadb-account-create" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.063679 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="723650c5-e986-47e1-adc6-bf11bb38b84c" containerName="mariadb-account-create" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.063874 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="723650c5-e986-47e1-adc6-bf11bb38b84c" containerName="mariadb-account-create" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.064424 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wq959" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.066613 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-jfnf5" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.067463 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.076087 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wq959"] Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.123437 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a01af80c-f331-4cac-8f4e-7653f4b4f296-db-sync-config-data\") pod \"barbican-db-sync-wq959\" (UID: \"a01af80c-f331-4cac-8f4e-7653f4b4f296\") " pod="openstack/barbican-db-sync-wq959" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.123616 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a01af80c-f331-4cac-8f4e-7653f4b4f296-combined-ca-bundle\") pod \"barbican-db-sync-wq959\" (UID: \"a01af80c-f331-4cac-8f4e-7653f4b4f296\") " pod="openstack/barbican-db-sync-wq959" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.123674 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7drr\" (UniqueName: \"kubernetes.io/projected/a01af80c-f331-4cac-8f4e-7653f4b4f296-kube-api-access-j7drr\") pod \"barbican-db-sync-wq959\" (UID: \"a01af80c-f331-4cac-8f4e-7653f4b4f296\") " pod="openstack/barbican-db-sync-wq959" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.225076 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a01af80c-f331-4cac-8f4e-7653f4b4f296-combined-ca-bundle\") pod \"barbican-db-sync-wq959\" (UID: \"a01af80c-f331-4cac-8f4e-7653f4b4f296\") " pod="openstack/barbican-db-sync-wq959" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.225368 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7drr\" (UniqueName: \"kubernetes.io/projected/a01af80c-f331-4cac-8f4e-7653f4b4f296-kube-api-access-j7drr\") pod \"barbican-db-sync-wq959\" (UID: \"a01af80c-f331-4cac-8f4e-7653f4b4f296\") " pod="openstack/barbican-db-sync-wq959" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.225430 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a01af80c-f331-4cac-8f4e-7653f4b4f296-db-sync-config-data\") pod \"barbican-db-sync-wq959\" (UID: \"a01af80c-f331-4cac-8f4e-7653f4b4f296\") " pod="openstack/barbican-db-sync-wq959" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.230564 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a01af80c-f331-4cac-8f4e-7653f4b4f296-combined-ca-bundle\") pod \"barbican-db-sync-wq959\" (UID: \"a01af80c-f331-4cac-8f4e-7653f4b4f296\") " pod="openstack/barbican-db-sync-wq959" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.231454 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/a01af80c-f331-4cac-8f4e-7653f4b4f296-db-sync-config-data\") pod \"barbican-db-sync-wq959\" (UID: \"a01af80c-f331-4cac-8f4e-7653f4b4f296\") " pod="openstack/barbican-db-sync-wq959" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.242934 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7drr\" (UniqueName: \"kubernetes.io/projected/a01af80c-f331-4cac-8f4e-7653f4b4f296-kube-api-access-j7drr\") pod \"barbican-db-sync-wq959\" (UID: \"a01af80c-f331-4cac-8f4e-7653f4b4f296\") " pod="openstack/barbican-db-sync-wq959" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.388754 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wq959" Oct 02 12:24:28 crc kubenswrapper[4766]: I1002 12:24:28.854779 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-wq959"] Oct 02 12:24:29 crc kubenswrapper[4766]: I1002 12:24:29.750144 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wq959" event={"ID":"a01af80c-f331-4cac-8f4e-7653f4b4f296","Type":"ContainerStarted","Data":"d6a55397fb39c5e4d77dc188c1335d17402b7ee6dc6da06eb032277a50320162"} Oct 02 12:24:29 crc kubenswrapper[4766]: I1002 12:24:29.750476 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wq959" event={"ID":"a01af80c-f331-4cac-8f4e-7653f4b4f296","Type":"ContainerStarted","Data":"cab5f07642e90798f05db78403b5da39dbe95bfec4b201f3401b22f50d00434d"} Oct 02 12:24:29 crc kubenswrapper[4766]: I1002 12:24:29.765618 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-wq959" podStartSLOduration=1.765601816 podStartE2EDuration="1.765601816s" podCreationTimestamp="2025-10-02 12:24:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:24:29.763687665 +0000 UTC m=+5584.706558609" watchObservedRunningTime="2025-10-02 12:24:29.765601816 +0000 UTC m=+5584.708472760" Oct 02 12:24:30 crc kubenswrapper[4766]: I1002 12:24:30.775806 4766 generic.go:334] "Generic (PLEG): container finished" podID="a01af80c-f331-4cac-8f4e-7653f4b4f296" containerID="d6a55397fb39c5e4d77dc188c1335d17402b7ee6dc6da06eb032277a50320162" exitCode=0 Oct 02 12:24:30 crc kubenswrapper[4766]: I1002 12:24:30.775900 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wq959" event={"ID":"a01af80c-f331-4cac-8f4e-7653f4b4f296","Type":"ContainerDied","Data":"d6a55397fb39c5e4d77dc188c1335d17402b7ee6dc6da06eb032277a50320162"} Oct 02 12:24:32 crc kubenswrapper[4766]: I1002 12:24:32.086421 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-wq959" Oct 02 12:24:32 crc kubenswrapper[4766]: I1002 12:24:32.192120 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a01af80c-f331-4cac-8f4e-7653f4b4f296-combined-ca-bundle\") pod \"a01af80c-f331-4cac-8f4e-7653f4b4f296\" (UID: \"a01af80c-f331-4cac-8f4e-7653f4b4f296\") " Oct 02 12:24:32 crc kubenswrapper[4766]: I1002 12:24:32.192274 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a01af80c-f331-4cac-8f4e-7653f4b4f296-db-sync-config-data\") pod \"a01af80c-f331-4cac-8f4e-7653f4b4f296\" (UID: \"a01af80c-f331-4cac-8f4e-7653f4b4f296\") " Oct 02 12:24:32 crc kubenswrapper[4766]: I1002 12:24:32.192352 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7drr\" (UniqueName: \"kubernetes.io/projected/a01af80c-f331-4cac-8f4e-7653f4b4f296-kube-api-access-j7drr\") pod \"a01af80c-f331-4cac-8f4e-7653f4b4f296\" (UID: \"a01af80c-f331-4cac-8f4e-7653f4b4f296\") " Oct 02 12:24:32 crc kubenswrapper[4766]: I1002 12:24:32.197083 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a01af80c-f331-4cac-8f4e-7653f4b4f296-kube-api-access-j7drr" (OuterVolumeSpecName: "kube-api-access-j7drr") pod "a01af80c-f331-4cac-8f4e-7653f4b4f296" (UID: "a01af80c-f331-4cac-8f4e-7653f4b4f296"). InnerVolumeSpecName "kube-api-access-j7drr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:24:32 crc kubenswrapper[4766]: I1002 12:24:32.197908 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a01af80c-f331-4cac-8f4e-7653f4b4f296-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a01af80c-f331-4cac-8f4e-7653f4b4f296" (UID: "a01af80c-f331-4cac-8f4e-7653f4b4f296"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:24:32 crc kubenswrapper[4766]: I1002 12:24:32.215758 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a01af80c-f331-4cac-8f4e-7653f4b4f296-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a01af80c-f331-4cac-8f4e-7653f4b4f296" (UID: "a01af80c-f331-4cac-8f4e-7653f4b4f296"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:24:32 crc kubenswrapper[4766]: I1002 12:24:32.294770 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a01af80c-f331-4cac-8f4e-7653f4b4f296-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:24:32 crc kubenswrapper[4766]: I1002 12:24:32.294806 4766 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a01af80c-f331-4cac-8f4e-7653f4b4f296-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:24:32 crc kubenswrapper[4766]: I1002 12:24:32.294816 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7drr\" (UniqueName: \"kubernetes.io/projected/a01af80c-f331-4cac-8f4e-7653f4b4f296-kube-api-access-j7drr\") on node \"crc\" DevicePath \"\"" Oct 02 12:24:32 crc kubenswrapper[4766]: I1002 12:24:32.803874 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-wq959" event={"ID":"a01af80c-f331-4cac-8f4e-7653f4b4f296","Type":"ContainerDied","Data":"cab5f07642e90798f05db78403b5da39dbe95bfec4b201f3401b22f50d00434d"} Oct 02 12:24:32 crc kubenswrapper[4766]: I1002 12:24:32.803918 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cab5f07642e90798f05db78403b5da39dbe95bfec4b201f3401b22f50d00434d" Oct 02 12:24:32 crc kubenswrapper[4766]: I1002 12:24:32.803979 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-wq959" Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.100916 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"] Oct 02 12:24:33 crc kubenswrapper[4766]: E1002 12:24:33.101342 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a01af80c-f331-4cac-8f4e-7653f4b4f296" containerName="barbican-db-sync" Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.101359 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a01af80c-f331-4cac-8f4e-7653f4b4f296" containerName="barbican-db-sync" Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.101582 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a01af80c-f331-4cac-8f4e-7653f4b4f296" containerName="barbican-db-sync" Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.102522 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d" Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.114166 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-jfnf5" Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.123435 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.127810 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.133217 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-f95f7bf9c-29dkx"] Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.135067 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.149847 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.154363 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"]
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.176650 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-f95f7bf9c-29dkx"]
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.197908 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54df6f9497-rfpjr"]
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.200993 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.211036 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65be6ba-47fe-4928-b461-53031fd0e5eb-logs\") pod \"barbican-keystone-listener-5b6ffc6db8-wgm5d\" (UID: \"e65be6ba-47fe-4928-b461-53031fd0e5eb\") " pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.211104 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e65be6ba-47fe-4928-b461-53031fd0e5eb-config-data-custom\") pod \"barbican-keystone-listener-5b6ffc6db8-wgm5d\" (UID: \"e65be6ba-47fe-4928-b461-53031fd0e5eb\") " pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.211143 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65be6ba-47fe-4928-b461-53031fd0e5eb-combined-ca-bundle\") pod \"barbican-keystone-listener-5b6ffc6db8-wgm5d\" (UID: \"e65be6ba-47fe-4928-b461-53031fd0e5eb\") " pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.211168 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/385a22c3-88e6-49f7-8e51-147925a9baef-combined-ca-bundle\") pod \"barbican-worker-f95f7bf9c-29dkx\" (UID: \"385a22c3-88e6-49f7-8e51-147925a9baef\") " pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.211193 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e65be6ba-47fe-4928-b461-53031fd0e5eb-config-data\") pod \"barbican-keystone-listener-5b6ffc6db8-wgm5d\" (UID: \"e65be6ba-47fe-4928-b461-53031fd0e5eb\") " pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.211266 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndg5s\" (UniqueName: \"kubernetes.io/projected/e65be6ba-47fe-4928-b461-53031fd0e5eb-kube-api-access-ndg5s\") pod \"barbican-keystone-listener-5b6ffc6db8-wgm5d\" (UID: \"e65be6ba-47fe-4928-b461-53031fd0e5eb\") " pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.211291 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl94s\" (UniqueName: \"kubernetes.io/projected/385a22c3-88e6-49f7-8e51-147925a9baef-kube-api-access-tl94s\") pod \"barbican-worker-f95f7bf9c-29dkx\" (UID: \"385a22c3-88e6-49f7-8e51-147925a9baef\") " pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.212301 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/385a22c3-88e6-49f7-8e51-147925a9baef-config-data-custom\") pod \"barbican-worker-f95f7bf9c-29dkx\" (UID: \"385a22c3-88e6-49f7-8e51-147925a9baef\") " pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.212381 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/385a22c3-88e6-49f7-8e51-147925a9baef-logs\") pod \"barbican-worker-f95f7bf9c-29dkx\" (UID: \"385a22c3-88e6-49f7-8e51-147925a9baef\") " pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.212412 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/385a22c3-88e6-49f7-8e51-147925a9baef-config-data\") pod \"barbican-worker-f95f7bf9c-29dkx\" (UID: \"385a22c3-88e6-49f7-8e51-147925a9baef\") " pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.245261 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54df6f9497-rfpjr"]
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.313899 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65be6ba-47fe-4928-b461-53031fd0e5eb-logs\") pod \"barbican-keystone-listener-5b6ffc6db8-wgm5d\" (UID: \"e65be6ba-47fe-4928-b461-53031fd0e5eb\") " pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.313958 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e65be6ba-47fe-4928-b461-53031fd0e5eb-config-data-custom\") pod \"barbican-keystone-listener-5b6ffc6db8-wgm5d\" (UID: \"e65be6ba-47fe-4928-b461-53031fd0e5eb\") " pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.313984 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65be6ba-47fe-4928-b461-53031fd0e5eb-combined-ca-bundle\") pod \"barbican-keystone-listener-5b6ffc6db8-wgm5d\" (UID: \"e65be6ba-47fe-4928-b461-53031fd0e5eb\") " pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.314002 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/385a22c3-88e6-49f7-8e51-147925a9baef-combined-ca-bundle\") pod \"barbican-worker-f95f7bf9c-29dkx\" (UID: \"385a22c3-88e6-49f7-8e51-147925a9baef\") " pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.314022 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e65be6ba-47fe-4928-b461-53031fd0e5eb-config-data\") pod \"barbican-keystone-listener-5b6ffc6db8-wgm5d\" (UID: \"e65be6ba-47fe-4928-b461-53031fd0e5eb\") " pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.314067 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndg5s\" (UniqueName: \"kubernetes.io/projected/e65be6ba-47fe-4928-b461-53031fd0e5eb-kube-api-access-ndg5s\") pod \"barbican-keystone-listener-5b6ffc6db8-wgm5d\" (UID: \"e65be6ba-47fe-4928-b461-53031fd0e5eb\") " pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.314086 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl94s\" (UniqueName: \"kubernetes.io/projected/385a22c3-88e6-49f7-8e51-147925a9baef-kube-api-access-tl94s\") pod \"barbican-worker-f95f7bf9c-29dkx\" (UID: \"385a22c3-88e6-49f7-8e51-147925a9baef\") " pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.314126 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-dns-svc\") pod \"dnsmasq-dns-54df6f9497-rfpjr\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.314147 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/385a22c3-88e6-49f7-8e51-147925a9baef-config-data-custom\") pod \"barbican-worker-f95f7bf9c-29dkx\" (UID: \"385a22c3-88e6-49f7-8e51-147925a9baef\") " pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.314164 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7msf9\" (UniqueName: \"kubernetes.io/projected/5298592c-b60f-4916-9c04-ae15d5dd3236-kube-api-access-7msf9\") pod \"dnsmasq-dns-54df6f9497-rfpjr\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.314184 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/385a22c3-88e6-49f7-8e51-147925a9baef-logs\") pod \"barbican-worker-f95f7bf9c-29dkx\" (UID: \"385a22c3-88e6-49f7-8e51-147925a9baef\") " pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.314204 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/385a22c3-88e6-49f7-8e51-147925a9baef-config-data\") pod \"barbican-worker-f95f7bf9c-29dkx\" (UID: \"385a22c3-88e6-49f7-8e51-147925a9baef\") " pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.314225 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-ovsdbserver-sb\") pod \"dnsmasq-dns-54df6f9497-rfpjr\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.314250 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-ovsdbserver-nb\") pod \"dnsmasq-dns-54df6f9497-rfpjr\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.314268 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-config\") pod \"dnsmasq-dns-54df6f9497-rfpjr\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.314802 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65be6ba-47fe-4928-b461-53031fd0e5eb-logs\") pod \"barbican-keystone-listener-5b6ffc6db8-wgm5d\" (UID: \"e65be6ba-47fe-4928-b461-53031fd0e5eb\") " pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.318563 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/385a22c3-88e6-49f7-8e51-147925a9baef-logs\") pod \"barbican-worker-f95f7bf9c-29dkx\" (UID: \"385a22c3-88e6-49f7-8e51-147925a9baef\") " pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.330762 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/385a22c3-88e6-49f7-8e51-147925a9baef-config-data\") pod \"barbican-worker-f95f7bf9c-29dkx\" (UID: \"385a22c3-88e6-49f7-8e51-147925a9baef\") " pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.330903 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65be6ba-47fe-4928-b461-53031fd0e5eb-combined-ca-bundle\") pod \"barbican-keystone-listener-5b6ffc6db8-wgm5d\" (UID: \"e65be6ba-47fe-4928-b461-53031fd0e5eb\") " pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.331244 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e65be6ba-47fe-4928-b461-53031fd0e5eb-config-data-custom\") pod \"barbican-keystone-listener-5b6ffc6db8-wgm5d\" (UID: \"e65be6ba-47fe-4928-b461-53031fd0e5eb\") " pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.332461 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e65be6ba-47fe-4928-b461-53031fd0e5eb-config-data\") pod \"barbican-keystone-listener-5b6ffc6db8-wgm5d\" (UID: \"e65be6ba-47fe-4928-b461-53031fd0e5eb\") " pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.341852 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/385a22c3-88e6-49f7-8e51-147925a9baef-combined-ca-bundle\") pod \"barbican-worker-f95f7bf9c-29dkx\" (UID: \"385a22c3-88e6-49f7-8e51-147925a9baef\") " pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.342331 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/385a22c3-88e6-49f7-8e51-147925a9baef-config-data-custom\") pod \"barbican-worker-f95f7bf9c-29dkx\" (UID: \"385a22c3-88e6-49f7-8e51-147925a9baef\") " pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.350098 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl94s\" (UniqueName: \"kubernetes.io/projected/385a22c3-88e6-49f7-8e51-147925a9baef-kube-api-access-tl94s\") pod \"barbican-worker-f95f7bf9c-29dkx\" (UID: \"385a22c3-88e6-49f7-8e51-147925a9baef\") " pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.374319 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndg5s\" (UniqueName: \"kubernetes.io/projected/e65be6ba-47fe-4928-b461-53031fd0e5eb-kube-api-access-ndg5s\") pod \"barbican-keystone-listener-5b6ffc6db8-wgm5d\" (UID: \"e65be6ba-47fe-4928-b461-53031fd0e5eb\") " pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.413978 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5c58d8d958-sws7w"]
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.415734 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.416533 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-dns-svc\") pod \"dnsmasq-dns-54df6f9497-rfpjr\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.416599 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7msf9\" (UniqueName: \"kubernetes.io/projected/5298592c-b60f-4916-9c04-ae15d5dd3236-kube-api-access-7msf9\") pod \"dnsmasq-dns-54df6f9497-rfpjr\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.416836 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-ovsdbserver-sb\") pod \"dnsmasq-dns-54df6f9497-rfpjr\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.416865 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-ovsdbserver-nb\") pod \"dnsmasq-dns-54df6f9497-rfpjr\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.416883 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-config\") pod \"dnsmasq-dns-54df6f9497-rfpjr\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.417865 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-config\") pod \"dnsmasq-dns-54df6f9497-rfpjr\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.418463 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-ovsdbserver-sb\") pod \"dnsmasq-dns-54df6f9497-rfpjr\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.426857 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.427888 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c58d8d958-sws7w"]
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.441970 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7msf9\" (UniqueName: \"kubernetes.io/projected/5298592c-b60f-4916-9c04-ae15d5dd3236-kube-api-access-7msf9\") pod \"dnsmasq-dns-54df6f9497-rfpjr\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.453055 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-ovsdbserver-nb\") pod \"dnsmasq-dns-54df6f9497-rfpjr\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.453900 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-dns-svc\") pod \"dnsmasq-dns-54df6f9497-rfpjr\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.456089 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.498571 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-f95f7bf9c-29dkx"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.518385 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcb2376e-df6b-448f-8b2a-3a8bfc8e7638-logs\") pod \"barbican-api-5c58d8d958-sws7w\" (UID: \"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638\") " pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.518906 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jm29\" (UniqueName: \"kubernetes.io/projected/fcb2376e-df6b-448f-8b2a-3a8bfc8e7638-kube-api-access-2jm29\") pod \"barbican-api-5c58d8d958-sws7w\" (UID: \"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638\") " pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.519037 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcb2376e-df6b-448f-8b2a-3a8bfc8e7638-config-data\") pod \"barbican-api-5c58d8d958-sws7w\" (UID: \"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638\") " pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.519141 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb2376e-df6b-448f-8b2a-3a8bfc8e7638-combined-ca-bundle\") pod \"barbican-api-5c58d8d958-sws7w\" (UID: \"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638\") " pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.519216 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fcb2376e-df6b-448f-8b2a-3a8bfc8e7638-config-data-custom\") pod \"barbican-api-5c58d8d958-sws7w\" (UID: \"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638\") " pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.529017 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.622843 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb2376e-df6b-448f-8b2a-3a8bfc8e7638-combined-ca-bundle\") pod \"barbican-api-5c58d8d958-sws7w\" (UID: \"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638\") " pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.623155 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fcb2376e-df6b-448f-8b2a-3a8bfc8e7638-config-data-custom\") pod \"barbican-api-5c58d8d958-sws7w\" (UID: \"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638\") " pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.623193 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcb2376e-df6b-448f-8b2a-3a8bfc8e7638-logs\") pod \"barbican-api-5c58d8d958-sws7w\" (UID: \"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638\") " pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.623254 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jm29\" (UniqueName: \"kubernetes.io/projected/fcb2376e-df6b-448f-8b2a-3a8bfc8e7638-kube-api-access-2jm29\") pod \"barbican-api-5c58d8d958-sws7w\" (UID: \"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638\") " pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.623346 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcb2376e-df6b-448f-8b2a-3a8bfc8e7638-config-data\") pod \"barbican-api-5c58d8d958-sws7w\" (UID: \"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638\") " pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.623861 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcb2376e-df6b-448f-8b2a-3a8bfc8e7638-logs\") pod \"barbican-api-5c58d8d958-sws7w\" (UID: \"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638\") " pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.632819 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fcb2376e-df6b-448f-8b2a-3a8bfc8e7638-config-data-custom\") pod \"barbican-api-5c58d8d958-sws7w\" (UID: \"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638\") " pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.633288 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcb2376e-df6b-448f-8b2a-3a8bfc8e7638-config-data\") pod \"barbican-api-5c58d8d958-sws7w\" (UID: \"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638\") " pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.638229 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcb2376e-df6b-448f-8b2a-3a8bfc8e7638-combined-ca-bundle\") pod \"barbican-api-5c58d8d958-sws7w\" (UID: \"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638\") " pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.643379 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jm29\" (UniqueName: \"kubernetes.io/projected/fcb2376e-df6b-448f-8b2a-3a8bfc8e7638-kube-api-access-2jm29\") pod \"barbican-api-5c58d8d958-sws7w\" (UID: \"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638\") " pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.775760 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d"]
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.812158 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d" event={"ID":"e65be6ba-47fe-4928-b461-53031fd0e5eb","Type":"ContainerStarted","Data":"4e33ca9266ae98984e32383727056357befcec5a900767757073cb54b9ba2bfa"}
Oct 02 12:24:33 crc kubenswrapper[4766]: I1002 12:24:33.826906 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.066748 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-f95f7bf9c-29dkx"]
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.177435 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54df6f9497-rfpjr"]
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.383312 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c58d8d958-sws7w"]
Oct 02 12:24:34 crc kubenswrapper[4766]: W1002 12:24:34.391688 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcb2376e_df6b_448f_8b2a_3a8bfc8e7638.slice/crio-e72d76b120cb933a3dbb8d5242689c1125b2306253f09f6b6b726a15d7d99cf2 WatchSource:0}: Error finding container e72d76b120cb933a3dbb8d5242689c1125b2306253f09f6b6b726a15d7d99cf2: Status 404 returned error can't find the container with id e72d76b120cb933a3dbb8d5242689c1125b2306253f09f6b6b726a15d7d99cf2
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.825386 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d" event={"ID":"e65be6ba-47fe-4928-b461-53031fd0e5eb","Type":"ContainerStarted","Data":"4c0f257fc17781d33cda4f2993f6a51778b9d83ddc9552bc3db6d27f11a47163"}
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.825441 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d" event={"ID":"e65be6ba-47fe-4928-b461-53031fd0e5eb","Type":"ContainerStarted","Data":"116f9b85cd7250115459a66f936ab4de748ea031178df264ad945df753be4b99"}
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.834236 4766 generic.go:334] "Generic (PLEG): container finished" podID="5298592c-b60f-4916-9c04-ae15d5dd3236" containerID="b92559a1b95fb8c7a39e870f97e5dcaaa85f3185ed3d3b62be310f66fff4c9a9" exitCode=0
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.834314 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54df6f9497-rfpjr" event={"ID":"5298592c-b60f-4916-9c04-ae15d5dd3236","Type":"ContainerDied","Data":"b92559a1b95fb8c7a39e870f97e5dcaaa85f3185ed3d3b62be310f66fff4c9a9"}
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.834346 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54df6f9497-rfpjr" event={"ID":"5298592c-b60f-4916-9c04-ae15d5dd3236","Type":"ContainerStarted","Data":"ef3dd720b5c2b69ee040be67772cb846592912a9ae6bcb956775820b6bb7a932"}
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.839470 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f95f7bf9c-29dkx" event={"ID":"385a22c3-88e6-49f7-8e51-147925a9baef","Type":"ContainerStarted","Data":"e1f2f06f61a19a0c4e1d3f552645aab7fcfc22b59306c7a9b1ecb69974e8b2a4"}
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.839552 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f95f7bf9c-29dkx" event={"ID":"385a22c3-88e6-49f7-8e51-147925a9baef","Type":"ContainerStarted","Data":"8f760ddb77dc7ad8412db767b7d188a68505b1de1a893a433cb9013e3b3d76db"}
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.839563 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-f95f7bf9c-29dkx" event={"ID":"385a22c3-88e6-49f7-8e51-147925a9baef","Type":"ContainerStarted","Data":"c19f7ba68cb8908b452b53ca533bb6004bfafa526b791f0b8346cfc4aa8c4979"}
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.842105 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c58d8d958-sws7w" event={"ID":"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638","Type":"ContainerStarted","Data":"bf07d474c4c65e361b1d2a1e701d095217bf329cb88cb28a98e5813cba1488f5"}
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.842155 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c58d8d958-sws7w" event={"ID":"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638","Type":"ContainerStarted","Data":"4de5a9b20d361d635b5e3a4eeac5b2a42c20396ebb36649726e68ab840e6f2f4"}
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.842168 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c58d8d958-sws7w" event={"ID":"fcb2376e-df6b-448f-8b2a-3a8bfc8e7638","Type":"ContainerStarted","Data":"e72d76b120cb933a3dbb8d5242689c1125b2306253f09f6b6b726a15d7d99cf2"}
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.842887 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.842924 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.880210 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5c58d8d958-sws7w" podStartSLOduration=1.8801870790000001 podStartE2EDuration="1.880187079s" podCreationTimestamp="2025-10-02 12:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:24:34.874144495 +0000 UTC m=+5589.817015459" watchObservedRunningTime="2025-10-02 12:24:34.880187079 +0000 UTC m=+5589.823058033"
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.883074 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5b6ffc6db8-wgm5d" podStartSLOduration=1.883057811 podStartE2EDuration="1.883057811s" podCreationTimestamp="2025-10-02 12:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:24:34.855419406 +0000 UTC m=+5589.798290350" watchObservedRunningTime="2025-10-02 12:24:34.883057811 +0000 UTC m=+5589.825928755"
Oct 02 12:24:34 crc kubenswrapper[4766]: I1002 12:24:34.904368 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-f95f7bf9c-29dkx" podStartSLOduration=1.904346833 podStartE2EDuration="1.904346833s" podCreationTimestamp="2025-10-02 12:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:24:34.898887448 +0000 UTC m=+5589.841758392" watchObservedRunningTime="2025-10-02 12:24:34.904346833 +0000 UTC m=+5589.847217777"
Oct 02 12:24:35 crc kubenswrapper[4766]: I1002 12:24:35.853118 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54df6f9497-rfpjr" event={"ID":"5298592c-b60f-4916-9c04-ae15d5dd3236","Type":"ContainerStarted","Data":"5fa36e67b39ce376323eb3772ee2fa1b2c34eb2037a906bdb806fdc25c7ed0d7"}
Oct 02 12:24:35 crc kubenswrapper[4766]: I1002 12:24:35.883413 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54df6f9497-rfpjr" podStartSLOduration=2.883390849 podStartE2EDuration="2.883390849s" podCreationTimestamp="2025-10-02 12:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:24:35.873169462 +0000 UTC m=+5590.816040406" watchObservedRunningTime="2025-10-02 12:24:35.883390849 +0000 UTC m=+5590.826261803"
Oct 02 12:24:36 crc kubenswrapper[4766]: I1002 12:24:36.860977 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:43 crc kubenswrapper[4766]: I1002 12:24:43.531875 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54df6f9497-rfpjr"
Oct 02 12:24:43 crc kubenswrapper[4766]: I1002 12:24:43.614275 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c568f7c8f-8x8gb"]
Oct 02 12:24:43 crc kubenswrapper[4766]: I1002 12:24:43.614615 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" podUID="82a33696-df5c-4529-ae67-a7b78bd20819" containerName="dnsmasq-dns" containerID="cri-o://8d3312ef5427078d677b8e92dee643b76437d3f40f10f0597ef740cb6427a7d6" gracePeriod=10
Oct 02 12:24:43 crc kubenswrapper[4766]: I1002 12:24:43.951790 4766 generic.go:334] "Generic (PLEG): container finished" podID="82a33696-df5c-4529-ae67-a7b78bd20819" containerID="8d3312ef5427078d677b8e92dee643b76437d3f40f10f0597ef740cb6427a7d6" exitCode=0
Oct 02 12:24:43 crc kubenswrapper[4766]: I1002 12:24:43.951994 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" event={"ID":"82a33696-df5c-4529-ae67-a7b78bd20819","Type":"ContainerDied","Data":"8d3312ef5427078d677b8e92dee643b76437d3f40f10f0597ef740cb6427a7d6"}
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.649963 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb"
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.744286 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-dns-svc\") pod \"82a33696-df5c-4529-ae67-a7b78bd20819\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") "
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.744453 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxg8h\" (UniqueName: \"kubernetes.io/projected/82a33696-df5c-4529-ae67-a7b78bd20819-kube-api-access-vxg8h\") pod \"82a33696-df5c-4529-ae67-a7b78bd20819\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") "
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.744693 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-config\") pod \"82a33696-df5c-4529-ae67-a7b78bd20819\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") "
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.744758 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-ovsdbserver-sb\") pod \"82a33696-df5c-4529-ae67-a7b78bd20819\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") "
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.745771 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-ovsdbserver-nb\") pod \"82a33696-df5c-4529-ae67-a7b78bd20819\" (UID: \"82a33696-df5c-4529-ae67-a7b78bd20819\") "
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.752766 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a33696-df5c-4529-ae67-a7b78bd20819-kube-api-access-vxg8h" (OuterVolumeSpecName: "kube-api-access-vxg8h") pod "82a33696-df5c-4529-ae67-a7b78bd20819" (UID: "82a33696-df5c-4529-ae67-a7b78bd20819"). InnerVolumeSpecName "kube-api-access-vxg8h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.794427 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "82a33696-df5c-4529-ae67-a7b78bd20819" (UID: "82a33696-df5c-4529-ae67-a7b78bd20819"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.796727 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "82a33696-df5c-4529-ae67-a7b78bd20819" (UID: "82a33696-df5c-4529-ae67-a7b78bd20819"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.797946 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-config" (OuterVolumeSpecName: "config") pod "82a33696-df5c-4529-ae67-a7b78bd20819" (UID: "82a33696-df5c-4529-ae67-a7b78bd20819"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.812594 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "82a33696-df5c-4529-ae67-a7b78bd20819" (UID: "82a33696-df5c-4529-ae67-a7b78bd20819"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.848984 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxg8h\" (UniqueName: \"kubernetes.io/projected/82a33696-df5c-4529-ae67-a7b78bd20819-kube-api-access-vxg8h\") on node \"crc\" DevicePath \"\""
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.849028 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-config\") on node \"crc\" DevicePath \"\""
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.849045 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.849058 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.849068 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/82a33696-df5c-4529-ae67-a7b78bd20819-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.982918 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" event={"ID":"82a33696-df5c-4529-ae67-a7b78bd20819","Type":"ContainerDied","Data":"42eab045605f5334806d3d0cf0d16fd6e00f2444168b26af254dfeb0f44a8a5e"}
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.983005 4766 scope.go:117] "RemoveContainer" containerID="8d3312ef5427078d677b8e92dee643b76437d3f40f10f0597ef740cb6427a7d6"
Oct 02 12:24:44 crc kubenswrapper[4766]: I1002 12:24:44.983236 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb"
Oct 02 12:24:45 crc kubenswrapper[4766]: I1002 12:24:45.025345 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c568f7c8f-8x8gb"]
Oct 02 12:24:45 crc kubenswrapper[4766]: I1002 12:24:45.030890 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c568f7c8f-8x8gb"]
Oct 02 12:24:45 crc kubenswrapper[4766]: I1002 12:24:45.036089 4766 scope.go:117] "RemoveContainer" containerID="8fd7eee6f8257c95f3196e47492951ec33e9abb744d4234d572227644c576b88"
Oct 02 12:24:45 crc kubenswrapper[4766]: I1002 12:24:45.328973 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:45 crc kubenswrapper[4766]: I1002 12:24:45.472389 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c58d8d958-sws7w"
Oct 02 12:24:45 crc kubenswrapper[4766]: I1002 12:24:45.891231 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82a33696-df5c-4529-ae67-a7b78bd20819" path="/var/lib/kubelet/pods/82a33696-df5c-4529-ae67-a7b78bd20819/volumes"
Oct 02 12:24:49 crc kubenswrapper[4766]: I1002 12:24:49.595224 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6c568f7c8f-8x8gb" podUID="82a33696-df5c-4529-ae67-a7b78bd20819" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.16:5353: i/o timeout"
Oct 02 12:24:54 crc kubenswrapper[4766]: I1002 12:24:54.432340 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:24:54 crc kubenswrapper[4766]: I1002 12:24:54.433492 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:24:59 crc kubenswrapper[4766]: I1002 12:24:59.693120 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-nbgf2"]
Oct 02 12:24:59 crc kubenswrapper[4766]: E1002 12:24:59.694010 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a33696-df5c-4529-ae67-a7b78bd20819" containerName="dnsmasq-dns"
Oct 02 12:24:59 crc kubenswrapper[4766]: I1002 12:24:59.694041 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a33696-df5c-4529-ae67-a7b78bd20819" containerName="dnsmasq-dns"
Oct 02 12:24:59 crc kubenswrapper[4766]: E1002 12:24:59.694060 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a33696-df5c-4529-ae67-a7b78bd20819" containerName="init"
Oct 02 12:24:59 crc kubenswrapper[4766]: I1002 12:24:59.694065 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a33696-df5c-4529-ae67-a7b78bd20819" containerName="init"
Oct 02 12:24:59 crc kubenswrapper[4766]: I1002 12:24:59.694255 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="82a33696-df5c-4529-ae67-a7b78bd20819" containerName="dnsmasq-dns"
Oct 02 12:24:59 crc kubenswrapper[4766]: I1002 12:24:59.694875 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nbgf2"
Oct 02 12:24:59 crc kubenswrapper[4766]: I1002 12:24:59.714292 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-nbgf2"]
Oct 02 12:24:59 crc kubenswrapper[4766]: I1002 12:24:59.776281 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg6lf\" (UniqueName: \"kubernetes.io/projected/7999e98f-03cb-469a-8187-af8698bb975e-kube-api-access-dg6lf\") pod \"neutron-db-create-nbgf2\" (UID: \"7999e98f-03cb-469a-8187-af8698bb975e\") " pod="openstack/neutron-db-create-nbgf2"
Oct 02 12:24:59 crc kubenswrapper[4766]: I1002 12:24:59.878119 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg6lf\" (UniqueName: \"kubernetes.io/projected/7999e98f-03cb-469a-8187-af8698bb975e-kube-api-access-dg6lf\") pod \"neutron-db-create-nbgf2\" (UID: \"7999e98f-03cb-469a-8187-af8698bb975e\") " pod="openstack/neutron-db-create-nbgf2"
Oct 02 12:24:59 crc kubenswrapper[4766]: I1002 12:24:59.898869 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg6lf\" (UniqueName: \"kubernetes.io/projected/7999e98f-03cb-469a-8187-af8698bb975e-kube-api-access-dg6lf\") pod \"neutron-db-create-nbgf2\" (UID: \"7999e98f-03cb-469a-8187-af8698bb975e\") " pod="openstack/neutron-db-create-nbgf2"
Oct 02 12:25:00 crc kubenswrapper[4766]: I1002 12:25:00.015231 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nbgf2"
Oct 02 12:25:00 crc kubenswrapper[4766]: I1002 12:25:00.497758 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-nbgf2"]
Oct 02 12:25:01 crc kubenswrapper[4766]: I1002 12:25:01.145075 4766 generic.go:334] "Generic (PLEG): container finished" podID="7999e98f-03cb-469a-8187-af8698bb975e" containerID="3c9e772d95b10749e2def625cd13c4517be08a6f113711d143aeb81cb0286e99" exitCode=0
Oct 02 12:25:01 crc kubenswrapper[4766]: I1002 12:25:01.145174 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nbgf2" event={"ID":"7999e98f-03cb-469a-8187-af8698bb975e","Type":"ContainerDied","Data":"3c9e772d95b10749e2def625cd13c4517be08a6f113711d143aeb81cb0286e99"}
Oct 02 12:25:01 crc kubenswrapper[4766]: I1002 12:25:01.145370 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nbgf2" event={"ID":"7999e98f-03cb-469a-8187-af8698bb975e","Type":"ContainerStarted","Data":"40b1f74fd1455a718c41fe596923d89f21ed996ab0400cf01ec41a60329eb753"}
Oct 02 12:25:02 crc kubenswrapper[4766]: I1002 12:25:02.505341 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nbgf2"
Oct 02 12:25:02 crc kubenswrapper[4766]: I1002 12:25:02.669439 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg6lf\" (UniqueName: \"kubernetes.io/projected/7999e98f-03cb-469a-8187-af8698bb975e-kube-api-access-dg6lf\") pod \"7999e98f-03cb-469a-8187-af8698bb975e\" (UID: \"7999e98f-03cb-469a-8187-af8698bb975e\") "
Oct 02 12:25:02 crc kubenswrapper[4766]: I1002 12:25:02.677808 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7999e98f-03cb-469a-8187-af8698bb975e-kube-api-access-dg6lf" (OuterVolumeSpecName: "kube-api-access-dg6lf") pod "7999e98f-03cb-469a-8187-af8698bb975e" (UID: "7999e98f-03cb-469a-8187-af8698bb975e"). InnerVolumeSpecName "kube-api-access-dg6lf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:25:02 crc kubenswrapper[4766]: I1002 12:25:02.772623 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg6lf\" (UniqueName: \"kubernetes.io/projected/7999e98f-03cb-469a-8187-af8698bb975e-kube-api-access-dg6lf\") on node \"crc\" DevicePath \"\""
Oct 02 12:25:03 crc kubenswrapper[4766]: I1002 12:25:03.169000 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-nbgf2" event={"ID":"7999e98f-03cb-469a-8187-af8698bb975e","Type":"ContainerDied","Data":"40b1f74fd1455a718c41fe596923d89f21ed996ab0400cf01ec41a60329eb753"}
Oct 02 12:25:03 crc kubenswrapper[4766]: I1002 12:25:03.169073 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40b1f74fd1455a718c41fe596923d89f21ed996ab0400cf01ec41a60329eb753"
Oct 02 12:25:03 crc kubenswrapper[4766]: I1002 12:25:03.169078 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-nbgf2"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.307116 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wc2r7"]
Oct 02 12:25:09 crc kubenswrapper[4766]: E1002 12:25:09.307938 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7999e98f-03cb-469a-8187-af8698bb975e" containerName="mariadb-database-create"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.307953 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7999e98f-03cb-469a-8187-af8698bb975e" containerName="mariadb-database-create"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.308130 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7999e98f-03cb-469a-8187-af8698bb975e" containerName="mariadb-database-create"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.309342 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wc2r7"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.316914 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wc2r7"]
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.488301 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699911dd-95da-451d-8ea1-731fe880bbfb-utilities\") pod \"certified-operators-wc2r7\" (UID: \"699911dd-95da-451d-8ea1-731fe880bbfb\") " pod="openshift-marketplace/certified-operators-wc2r7"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.488475 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699911dd-95da-451d-8ea1-731fe880bbfb-catalog-content\") pod \"certified-operators-wc2r7\" (UID: \"699911dd-95da-451d-8ea1-731fe880bbfb\") " pod="openshift-marketplace/certified-operators-wc2r7"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.488540 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp7xz\" (UniqueName: \"kubernetes.io/projected/699911dd-95da-451d-8ea1-731fe880bbfb-kube-api-access-rp7xz\") pod \"certified-operators-wc2r7\" (UID: \"699911dd-95da-451d-8ea1-731fe880bbfb\") " pod="openshift-marketplace/certified-operators-wc2r7"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.590373 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699911dd-95da-451d-8ea1-731fe880bbfb-utilities\") pod \"certified-operators-wc2r7\" (UID: \"699911dd-95da-451d-8ea1-731fe880bbfb\") " pod="openshift-marketplace/certified-operators-wc2r7"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.590502 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699911dd-95da-451d-8ea1-731fe880bbfb-catalog-content\") pod \"certified-operators-wc2r7\" (UID: \"699911dd-95da-451d-8ea1-731fe880bbfb\") " pod="openshift-marketplace/certified-operators-wc2r7"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.590559 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp7xz\" (UniqueName: \"kubernetes.io/projected/699911dd-95da-451d-8ea1-731fe880bbfb-kube-api-access-rp7xz\") pod \"certified-operators-wc2r7\" (UID: \"699911dd-95da-451d-8ea1-731fe880bbfb\") " pod="openshift-marketplace/certified-operators-wc2r7"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.591095 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699911dd-95da-451d-8ea1-731fe880bbfb-utilities\") pod \"certified-operators-wc2r7\" (UID: \"699911dd-95da-451d-8ea1-731fe880bbfb\") " pod="openshift-marketplace/certified-operators-wc2r7"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.591136 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699911dd-95da-451d-8ea1-731fe880bbfb-catalog-content\") pod \"certified-operators-wc2r7\" (UID: \"699911dd-95da-451d-8ea1-731fe880bbfb\") " pod="openshift-marketplace/certified-operators-wc2r7"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.613671 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp7xz\" (UniqueName: \"kubernetes.io/projected/699911dd-95da-451d-8ea1-731fe880bbfb-kube-api-access-rp7xz\") pod \"certified-operators-wc2r7\" (UID: \"699911dd-95da-451d-8ea1-731fe880bbfb\") " pod="openshift-marketplace/certified-operators-wc2r7"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.635229 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wc2r7"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.793665 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-3461-account-create-lc4f5"]
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.795418 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3461-account-create-lc4f5"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.799872 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.814820 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3461-account-create-lc4f5"]
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.897651 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmfvn\" (UniqueName: \"kubernetes.io/projected/789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8-kube-api-access-qmfvn\") pod \"neutron-3461-account-create-lc4f5\" (UID: \"789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8\") " pod="openstack/neutron-3461-account-create-lc4f5"
Oct 02 12:25:09 crc kubenswrapper[4766]: I1002 12:25:09.998815 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmfvn\" (UniqueName: \"kubernetes.io/projected/789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8-kube-api-access-qmfvn\") pod \"neutron-3461-account-create-lc4f5\" (UID: \"789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8\") " pod="openstack/neutron-3461-account-create-lc4f5"
Oct 02 12:25:10 crc kubenswrapper[4766]: I1002 12:25:10.022777 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmfvn\" (UniqueName: \"kubernetes.io/projected/789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8-kube-api-access-qmfvn\") pod \"neutron-3461-account-create-lc4f5\" (UID: \"789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8\") " pod="openstack/neutron-3461-account-create-lc4f5"
Oct 02 12:25:10 crc kubenswrapper[4766]: I1002 12:25:10.158938 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3461-account-create-lc4f5"
Oct 02 12:25:10 crc kubenswrapper[4766]: I1002 12:25:10.167242 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wc2r7"]
Oct 02 12:25:10 crc kubenswrapper[4766]: I1002 12:25:10.232320 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wc2r7" event={"ID":"699911dd-95da-451d-8ea1-731fe880bbfb","Type":"ContainerStarted","Data":"e90928a37d9b34ee15eaab35a7b77e93a35cf2e4cdc40b6704289a256fbcc459"}
Oct 02 12:25:10 crc kubenswrapper[4766]: I1002 12:25:10.583706 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-3461-account-create-lc4f5"]
Oct 02 12:25:10 crc kubenswrapper[4766]: W1002 12:25:10.592258 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod789f25c6_bb9c_43cd_8ad2_b0a4c1d8b1a8.slice/crio-f3b3ca33301fb0c9053fc25854a076a51638067a88ac28dd58e72e5efe2c7185 WatchSource:0}: Error finding container f3b3ca33301fb0c9053fc25854a076a51638067a88ac28dd58e72e5efe2c7185: Status 404 returned error can't find the container with id f3b3ca33301fb0c9053fc25854a076a51638067a88ac28dd58e72e5efe2c7185
Oct 02 12:25:11 crc kubenswrapper[4766]: I1002 12:25:11.242594 4766 generic.go:334] "Generic (PLEG): container finished" podID="789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8" containerID="419f3956430be7c0fac39d4d7d5a0f16f372f51caec32beca986d71cbd8e40b7" exitCode=0
Oct 02 12:25:11 crc kubenswrapper[4766]: I1002 12:25:11.242664 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3461-account-create-lc4f5" event={"ID":"789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8","Type":"ContainerDied","Data":"419f3956430be7c0fac39d4d7d5a0f16f372f51caec32beca986d71cbd8e40b7"}
Oct 02 12:25:11 crc kubenswrapper[4766]: I1002 12:25:11.243218 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3461-account-create-lc4f5" event={"ID":"789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8","Type":"ContainerStarted","Data":"f3b3ca33301fb0c9053fc25854a076a51638067a88ac28dd58e72e5efe2c7185"}
Oct 02 12:25:11 crc kubenswrapper[4766]: I1002 12:25:11.246148 4766 generic.go:334] "Generic (PLEG): container finished" podID="699911dd-95da-451d-8ea1-731fe880bbfb" containerID="341ceee5836884ea5ac12efb9fb2dceea436dddadde50e3c96d6996f7b162c5c" exitCode=0
Oct 02 12:25:11 crc kubenswrapper[4766]: I1002 12:25:11.246214 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wc2r7" event={"ID":"699911dd-95da-451d-8ea1-731fe880bbfb","Type":"ContainerDied","Data":"341ceee5836884ea5ac12efb9fb2dceea436dddadde50e3c96d6996f7b162c5c"}
Oct 02 12:25:12 crc kubenswrapper[4766]: I1002 12:25:12.561530 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3461-account-create-lc4f5"
Oct 02 12:25:12 crc kubenswrapper[4766]: I1002 12:25:12.750857 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmfvn\" (UniqueName: \"kubernetes.io/projected/789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8-kube-api-access-qmfvn\") pod \"789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8\" (UID: \"789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8\") "
Oct 02 12:25:12 crc kubenswrapper[4766]: I1002 12:25:12.780862 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8-kube-api-access-qmfvn" (OuterVolumeSpecName: "kube-api-access-qmfvn") pod "789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8" (UID: "789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8"). InnerVolumeSpecName "kube-api-access-qmfvn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:25:12 crc kubenswrapper[4766]: I1002 12:25:12.853596 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmfvn\" (UniqueName: \"kubernetes.io/projected/789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8-kube-api-access-qmfvn\") on node \"crc\" DevicePath \"\""
Oct 02 12:25:13 crc kubenswrapper[4766]: I1002 12:25:13.264645 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-3461-account-create-lc4f5" event={"ID":"789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8","Type":"ContainerDied","Data":"f3b3ca33301fb0c9053fc25854a076a51638067a88ac28dd58e72e5efe2c7185"}
Oct 02 12:25:13 crc kubenswrapper[4766]: I1002 12:25:13.264710 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-3461-account-create-lc4f5"
Oct 02 12:25:13 crc kubenswrapper[4766]: I1002 12:25:13.264689 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3b3ca33301fb0c9053fc25854a076a51638067a88ac28dd58e72e5efe2c7185"
Oct 02 12:25:14 crc kubenswrapper[4766]: I1002 12:25:14.947994 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-8hdnj"]
Oct 02 12:25:14 crc kubenswrapper[4766]: E1002 12:25:14.948446 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8" containerName="mariadb-account-create"
Oct 02 12:25:14 crc kubenswrapper[4766]: I1002 12:25:14.948463 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8" containerName="mariadb-account-create"
Oct 02 12:25:14 crc kubenswrapper[4766]: I1002 12:25:14.948661 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8" containerName="mariadb-account-create"
Oct 02 12:25:14 crc kubenswrapper[4766]: I1002 12:25:14.949227 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8hdnj"
Oct 02 12:25:14 crc kubenswrapper[4766]: I1002 12:25:14.954148 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Oct 02 12:25:14 crc kubenswrapper[4766]: I1002 12:25:14.954167 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Oct 02 12:25:14 crc kubenswrapper[4766]: I1002 12:25:14.958941 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8hdnj"]
Oct 02 12:25:14 crc kubenswrapper[4766]: I1002 12:25:14.965488 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-68kd9"
Oct 02 12:25:15 crc kubenswrapper[4766]: I1002 12:25:15.099880 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/afda36fa-07fe-43b6-82a3-5ec9788fec1e-config\") pod \"neutron-db-sync-8hdnj\" (UID: \"afda36fa-07fe-43b6-82a3-5ec9788fec1e\") " pod="openstack/neutron-db-sync-8hdnj"
Oct 02 12:25:15 crc kubenswrapper[4766]: I1002 12:25:15.100471 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afda36fa-07fe-43b6-82a3-5ec9788fec1e-combined-ca-bundle\") pod \"neutron-db-sync-8hdnj\" (UID: \"afda36fa-07fe-43b6-82a3-5ec9788fec1e\") " pod="openstack/neutron-db-sync-8hdnj"
Oct 02 12:25:15 crc kubenswrapper[4766]: I1002 12:25:15.100582 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqjdj\" (UniqueName: \"kubernetes.io/projected/afda36fa-07fe-43b6-82a3-5ec9788fec1e-kube-api-access-sqjdj\") pod \"neutron-db-sync-8hdnj\" (UID: \"afda36fa-07fe-43b6-82a3-5ec9788fec1e\") " pod="openstack/neutron-db-sync-8hdnj"
Oct 02 12:25:15 crc kubenswrapper[4766]: I1002 12:25:15.201763 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/afda36fa-07fe-43b6-82a3-5ec9788fec1e-config\") pod \"neutron-db-sync-8hdnj\" (UID: \"afda36fa-07fe-43b6-82a3-5ec9788fec1e\") " pod="openstack/neutron-db-sync-8hdnj"
Oct 02 12:25:15 crc kubenswrapper[4766]: I1002 12:25:15.201857 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afda36fa-07fe-43b6-82a3-5ec9788fec1e-combined-ca-bundle\") pod \"neutron-db-sync-8hdnj\" (UID: \"afda36fa-07fe-43b6-82a3-5ec9788fec1e\") " pod="openstack/neutron-db-sync-8hdnj"
Oct 02 12:25:15 crc kubenswrapper[4766]: I1002 12:25:15.201927 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqjdj\" (UniqueName: \"kubernetes.io/projected/afda36fa-07fe-43b6-82a3-5ec9788fec1e-kube-api-access-sqjdj\") pod \"neutron-db-sync-8hdnj\" (UID: \"afda36fa-07fe-43b6-82a3-5ec9788fec1e\") " pod="openstack/neutron-db-sync-8hdnj"
Oct 02 12:25:15 crc kubenswrapper[4766]: I1002 12:25:15.212686 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afda36fa-07fe-43b6-82a3-5ec9788fec1e-combined-ca-bundle\") pod \"neutron-db-sync-8hdnj\" (UID: \"afda36fa-07fe-43b6-82a3-5ec9788fec1e\") " pod="openstack/neutron-db-sync-8hdnj"
Oct 02 12:25:15 crc kubenswrapper[4766]: I1002 12:25:15.213977 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/afda36fa-07fe-43b6-82a3-5ec9788fec1e-config\") pod \"neutron-db-sync-8hdnj\" (UID: \"afda36fa-07fe-43b6-82a3-5ec9788fec1e\") " pod="openstack/neutron-db-sync-8hdnj"
Oct 02 12:25:15 crc kubenswrapper[4766]: I1002 12:25:15.224294 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqjdj\" (UniqueName: \"kubernetes.io/projected/afda36fa-07fe-43b6-82a3-5ec9788fec1e-kube-api-access-sqjdj\") pod \"neutron-db-sync-8hdnj\" (UID: \"afda36fa-07fe-43b6-82a3-5ec9788fec1e\") " pod="openstack/neutron-db-sync-8hdnj"
Oct 02 12:25:15 crc kubenswrapper[4766]: I1002 12:25:15.283860 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8hdnj"
Oct 02 12:25:15 crc kubenswrapper[4766]: I1002 12:25:15.284211 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wc2r7" event={"ID":"699911dd-95da-451d-8ea1-731fe880bbfb","Type":"ContainerStarted","Data":"14942141e5a5552042f478dbaa39c31f130fd384b72e8473549f92e7cb6187f5"}
Oct 02 12:25:15 crc kubenswrapper[4766]: I1002 12:25:15.729641 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8hdnj"]
Oct 02 12:25:15 crc kubenswrapper[4766]: W1002 12:25:15.734378 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafda36fa_07fe_43b6_82a3_5ec9788fec1e.slice/crio-f79b41934735a44aa0ff831fd38d05b08a15f252fb8eb9c4ecaf201c758a5b34 WatchSource:0}: Error finding container f79b41934735a44aa0ff831fd38d05b08a15f252fb8eb9c4ecaf201c758a5b34: Status 404 returned error can't find the container with id f79b41934735a44aa0ff831fd38d05b08a15f252fb8eb9c4ecaf201c758a5b34
Oct 02 12:25:16 crc kubenswrapper[4766]: I1002 12:25:16.305962 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8hdnj" event={"ID":"afda36fa-07fe-43b6-82a3-5ec9788fec1e","Type":"ContainerStarted","Data":"cb7fda821febdc3b8dcd6397e43cbb18f177d4be100c11f44700f5869791dac3"}
Oct 02 12:25:16 crc kubenswrapper[4766]: I1002 12:25:16.306395 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8hdnj" event={"ID":"afda36fa-07fe-43b6-82a3-5ec9788fec1e","Type":"ContainerStarted","Data":"f79b41934735a44aa0ff831fd38d05b08a15f252fb8eb9c4ecaf201c758a5b34"}
Oct 02 12:25:16 crc kubenswrapper[4766]: I1002 12:25:16.309290 4766 generic.go:334] "Generic (PLEG): container finished" podID="699911dd-95da-451d-8ea1-731fe880bbfb" containerID="14942141e5a5552042f478dbaa39c31f130fd384b72e8473549f92e7cb6187f5" exitCode=0
Oct 02 12:25:16 crc kubenswrapper[4766]: I1002 12:25:16.309343 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wc2r7" event={"ID":"699911dd-95da-451d-8ea1-731fe880bbfb","Type":"ContainerDied","Data":"14942141e5a5552042f478dbaa39c31f130fd384b72e8473549f92e7cb6187f5"}
Oct 02 12:25:16 crc kubenswrapper[4766]: I1002 12:25:16.322200 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-8hdnj" podStartSLOduration=2.322179088 podStartE2EDuration="2.322179088s" podCreationTimestamp="2025-10-02 12:25:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:25:16.319934777 +0000 UTC m=+5631.262805711" watchObservedRunningTime="2025-10-02 12:25:16.322179088 +0000 UTC m=+5631.265050032"
Oct 02 12:25:17 crc kubenswrapper[4766]: I1002 12:25:17.320984 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wc2r7" event={"ID":"699911dd-95da-451d-8ea1-731fe880bbfb","Type":"ContainerStarted","Data":"cad14eb66053c5243d9e93c00be6714022856df31234e76ae35209f974e98f0f"}
Oct 02 12:25:17 crc kubenswrapper[4766]: I1002 12:25:17.350343 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wc2r7" podStartSLOduration=2.764646312 podStartE2EDuration="8.350313586s" podCreationTimestamp="2025-10-02 12:25:09 +0000 UTC" firstStartedPulling="2025-10-02 12:25:11.248633201 +0000 UTC m=+5626.191504145" lastFinishedPulling="2025-10-02 12:25:16.834300475 +0000 UTC m=+5631.777171419" observedRunningTime="2025-10-02 12:25:17.339223011 +0000 UTC m=+5632.282093965" watchObservedRunningTime="2025-10-02 12:25:17.350313586 +0000 UTC m=+5632.293184530"
Oct 02 12:25:19 crc kubenswrapper[4766]: I1002 12:25:19.635444 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wc2r7"
Oct 02 12:25:19 crc kubenswrapper[4766]: I1002 12:25:19.635892 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wc2r7"
Oct 02 12:25:19 crc kubenswrapper[4766]: I1002 12:25:19.680628 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wc2r7"
Oct 02 12:25:20 crc kubenswrapper[4766]: I1002 12:25:20.361162 4766 generic.go:334] "Generic (PLEG): container finished" podID="afda36fa-07fe-43b6-82a3-5ec9788fec1e" containerID="cb7fda821febdc3b8dcd6397e43cbb18f177d4be100c11f44700f5869791dac3" exitCode=0
Oct 02 12:25:20 crc kubenswrapper[4766]: I1002 12:25:20.361263 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8hdnj" event={"ID":"afda36fa-07fe-43b6-82a3-5ec9788fec1e","Type":"ContainerDied","Data":"cb7fda821febdc3b8dcd6397e43cbb18f177d4be100c11f44700f5869791dac3"}
Oct 02 12:25:20 crc kubenswrapper[4766]: I1002 12:25:20.472831 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vnlrw"]
Oct 02 12:25:20 crc kubenswrapper[4766]: I1002 12:25:20.474795 4766 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:20 crc kubenswrapper[4766]: I1002 12:25:20.484550 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnlrw"] Oct 02 12:25:20 crc kubenswrapper[4766]: I1002 12:25:20.499892 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b42d0006-4241-457f-9f8d-c0875475d86f-catalog-content\") pod \"redhat-marketplace-vnlrw\" (UID: \"b42d0006-4241-457f-9f8d-c0875475d86f\") " pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:20 crc kubenswrapper[4766]: I1002 12:25:20.499984 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9m86\" (UniqueName: \"kubernetes.io/projected/b42d0006-4241-457f-9f8d-c0875475d86f-kube-api-access-k9m86\") pod \"redhat-marketplace-vnlrw\" (UID: \"b42d0006-4241-457f-9f8d-c0875475d86f\") " pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:20 crc kubenswrapper[4766]: I1002 12:25:20.500098 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b42d0006-4241-457f-9f8d-c0875475d86f-utilities\") pod \"redhat-marketplace-vnlrw\" (UID: \"b42d0006-4241-457f-9f8d-c0875475d86f\") " pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:20 crc kubenswrapper[4766]: I1002 12:25:20.601617 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b42d0006-4241-457f-9f8d-c0875475d86f-catalog-content\") pod \"redhat-marketplace-vnlrw\" (UID: \"b42d0006-4241-457f-9f8d-c0875475d86f\") " pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:20 crc kubenswrapper[4766]: I1002 12:25:20.601692 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9m86\" (UniqueName: \"kubernetes.io/projected/b42d0006-4241-457f-9f8d-c0875475d86f-kube-api-access-k9m86\") pod \"redhat-marketplace-vnlrw\" (UID: \"b42d0006-4241-457f-9f8d-c0875475d86f\") " pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:20 crc kubenswrapper[4766]: I1002 12:25:20.601746 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b42d0006-4241-457f-9f8d-c0875475d86f-utilities\") pod \"redhat-marketplace-vnlrw\" (UID: \"b42d0006-4241-457f-9f8d-c0875475d86f\") " pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:20 crc kubenswrapper[4766]: I1002 12:25:20.602124 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b42d0006-4241-457f-9f8d-c0875475d86f-catalog-content\") pod \"redhat-marketplace-vnlrw\" (UID: \"b42d0006-4241-457f-9f8d-c0875475d86f\") " pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:20 crc kubenswrapper[4766]: I1002 12:25:20.602152 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b42d0006-4241-457f-9f8d-c0875475d86f-utilities\") pod \"redhat-marketplace-vnlrw\" (UID: \"b42d0006-4241-457f-9f8d-c0875475d86f\") " pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:20 crc kubenswrapper[4766]: I1002 12:25:20.630108 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-k9m86\" (UniqueName: \"kubernetes.io/projected/b42d0006-4241-457f-9f8d-c0875475d86f-kube-api-access-k9m86\") pod \"redhat-marketplace-vnlrw\" (UID: \"b42d0006-4241-457f-9f8d-c0875475d86f\") " pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:20 crc kubenswrapper[4766]: I1002 12:25:20.807596 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:21 crc kubenswrapper[4766]: I1002 12:25:21.362974 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnlrw"] Oct 02 12:25:21 crc kubenswrapper[4766]: I1002 12:25:21.751374 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8hdnj" Oct 02 12:25:21 crc kubenswrapper[4766]: I1002 12:25:21.926944 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqjdj\" (UniqueName: \"kubernetes.io/projected/afda36fa-07fe-43b6-82a3-5ec9788fec1e-kube-api-access-sqjdj\") pod \"afda36fa-07fe-43b6-82a3-5ec9788fec1e\" (UID: \"afda36fa-07fe-43b6-82a3-5ec9788fec1e\") " Oct 02 12:25:21 crc kubenswrapper[4766]: I1002 12:25:21.927085 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afda36fa-07fe-43b6-82a3-5ec9788fec1e-combined-ca-bundle\") pod \"afda36fa-07fe-43b6-82a3-5ec9788fec1e\" (UID: \"afda36fa-07fe-43b6-82a3-5ec9788fec1e\") " Oct 02 12:25:21 crc kubenswrapper[4766]: I1002 12:25:21.927175 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/afda36fa-07fe-43b6-82a3-5ec9788fec1e-config\") pod \"afda36fa-07fe-43b6-82a3-5ec9788fec1e\" (UID: \"afda36fa-07fe-43b6-82a3-5ec9788fec1e\") " Oct 02 12:25:21 crc kubenswrapper[4766]: I1002 12:25:21.932635 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afda36fa-07fe-43b6-82a3-5ec9788fec1e-kube-api-access-sqjdj" (OuterVolumeSpecName: "kube-api-access-sqjdj") pod "afda36fa-07fe-43b6-82a3-5ec9788fec1e" (UID: "afda36fa-07fe-43b6-82a3-5ec9788fec1e"). InnerVolumeSpecName "kube-api-access-sqjdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:25:21 crc kubenswrapper[4766]: I1002 12:25:21.961846 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afda36fa-07fe-43b6-82a3-5ec9788fec1e-config" (OuterVolumeSpecName: "config") pod "afda36fa-07fe-43b6-82a3-5ec9788fec1e" (UID: "afda36fa-07fe-43b6-82a3-5ec9788fec1e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:25:21 crc kubenswrapper[4766]: I1002 12:25:21.976683 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afda36fa-07fe-43b6-82a3-5ec9788fec1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afda36fa-07fe-43b6-82a3-5ec9788fec1e" (UID: "afda36fa-07fe-43b6-82a3-5ec9788fec1e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.031156 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afda36fa-07fe-43b6-82a3-5ec9788fec1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.031195 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/afda36fa-07fe-43b6-82a3-5ec9788fec1e-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.031205 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqjdj\" (UniqueName: \"kubernetes.io/projected/afda36fa-07fe-43b6-82a3-5ec9788fec1e-kube-api-access-sqjdj\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.393404 4766 generic.go:334] "Generic (PLEG): container finished" podID="b42d0006-4241-457f-9f8d-c0875475d86f" containerID="4572179a07b595b7ebff9496ffb66de8aedd5d498ab6805222f04cdd1d6db4ee" exitCode=0 Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.393469 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnlrw" event={"ID":"b42d0006-4241-457f-9f8d-c0875475d86f","Type":"ContainerDied","Data":"4572179a07b595b7ebff9496ffb66de8aedd5d498ab6805222f04cdd1d6db4ee"} Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.393497 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnlrw" event={"ID":"b42d0006-4241-457f-9f8d-c0875475d86f","Type":"ContainerStarted","Data":"8d8f2286f0286120f11bde8b4027c4958a51c423c7785da8852872bf1487d368"} Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.402520 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8hdnj" event={"ID":"afda36fa-07fe-43b6-82a3-5ec9788fec1e","Type":"ContainerDied","Data":"f79b41934735a44aa0ff831fd38d05b08a15f252fb8eb9c4ecaf201c758a5b34"} Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.402557 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-8hdnj" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.402570 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f79b41934735a44aa0ff831fd38d05b08a15f252fb8eb9c4ecaf201c758a5b34" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.524683 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8488b8bd7c-vxgf6"] Oct 02 12:25:22 crc kubenswrapper[4766]: E1002 12:25:22.525373 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afda36fa-07fe-43b6-82a3-5ec9788fec1e" containerName="neutron-db-sync" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.525393 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="afda36fa-07fe-43b6-82a3-5ec9788fec1e" containerName="neutron-db-sync" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.525596 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="afda36fa-07fe-43b6-82a3-5ec9788fec1e" containerName="neutron-db-sync" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.528694 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.545426 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8488b8bd7c-vxgf6"] Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.856118 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-ovsdbserver-nb\") pod \"dnsmasq-dns-8488b8bd7c-vxgf6\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.856230 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-config\") pod \"dnsmasq-dns-8488b8bd7c-vxgf6\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.856316 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-ovsdbserver-sb\") pod \"dnsmasq-dns-8488b8bd7c-vxgf6\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.856345 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-dns-svc\") pod \"dnsmasq-dns-8488b8bd7c-vxgf6\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.856385 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzrts\" (UniqueName: \"kubernetes.io/projected/88226e6f-8efa-4505-a5e5-d4cf947b2d86-kube-api-access-hzrts\") pod \"dnsmasq-dns-8488b8bd7c-vxgf6\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.950896 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cddb5dc7-2k4p9"] Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.952817 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cddb5dc7-2k4p9" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.955795 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.955948 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.956198 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-68kd9" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.957868 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-config\") pod \"dnsmasq-dns-8488b8bd7c-vxgf6\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.957960 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-ovsdbserver-sb\") pod \"dnsmasq-dns-8488b8bd7c-vxgf6\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.958594 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-dns-svc\") pod \"dnsmasq-dns-8488b8bd7c-vxgf6\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.958666 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzrts\" (UniqueName: \"kubernetes.io/projected/88226e6f-8efa-4505-a5e5-d4cf947b2d86-kube-api-access-hzrts\") pod \"dnsmasq-dns-8488b8bd7c-vxgf6\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.958768 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-ovsdbserver-nb\") pod \"dnsmasq-dns-8488b8bd7c-vxgf6\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.959021 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-ovsdbserver-sb\") pod \"dnsmasq-dns-8488b8bd7c-vxgf6\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.959021 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-config\") pod \"dnsmasq-dns-8488b8bd7c-vxgf6\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.959606 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-dns-svc\") pod \"dnsmasq-dns-8488b8bd7c-vxgf6\" (UID: 
\"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.959633 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-ovsdbserver-nb\") pod \"dnsmasq-dns-8488b8bd7c-vxgf6\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.965545 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cddb5dc7-2k4p9"] Oct 02 12:25:22 crc kubenswrapper[4766]: I1002 12:25:22.986141 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzrts\" (UniqueName: \"kubernetes.io/projected/88226e6f-8efa-4505-a5e5-d4cf947b2d86-kube-api-access-hzrts\") pod \"dnsmasq-dns-8488b8bd7c-vxgf6\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:23 crc kubenswrapper[4766]: I1002 12:25:23.060383 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4e5e1566-6708-4f6e-857e-ca1d6fe153ec-httpd-config\") pod \"neutron-cddb5dc7-2k4p9\" (UID: \"4e5e1566-6708-4f6e-857e-ca1d6fe153ec\") " pod="openstack/neutron-cddb5dc7-2k4p9" Oct 02 12:25:23 crc kubenswrapper[4766]: I1002 12:25:23.060538 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e5e1566-6708-4f6e-857e-ca1d6fe153ec-config\") pod \"neutron-cddb5dc7-2k4p9\" (UID: \"4e5e1566-6708-4f6e-857e-ca1d6fe153ec\") " pod="openstack/neutron-cddb5dc7-2k4p9" Oct 02 12:25:23 crc kubenswrapper[4766]: I1002 12:25:23.060875 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lvx4\" (UniqueName: \"kubernetes.io/projected/4e5e1566-6708-4f6e-857e-ca1d6fe153ec-kube-api-access-2lvx4\") pod \"neutron-cddb5dc7-2k4p9\" (UID: \"4e5e1566-6708-4f6e-857e-ca1d6fe153ec\") " pod="openstack/neutron-cddb5dc7-2k4p9" Oct 02 12:25:23 crc kubenswrapper[4766]: I1002 12:25:23.061018 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5e1566-6708-4f6e-857e-ca1d6fe153ec-combined-ca-bundle\") pod \"neutron-cddb5dc7-2k4p9\" (UID: \"4e5e1566-6708-4f6e-857e-ca1d6fe153ec\") " pod="openstack/neutron-cddb5dc7-2k4p9" Oct 02 12:25:23 crc kubenswrapper[4766]: I1002 12:25:23.072727 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:23 crc kubenswrapper[4766]: I1002 12:25:23.165131 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4e5e1566-6708-4f6e-857e-ca1d6fe153ec-httpd-config\") pod \"neutron-cddb5dc7-2k4p9\" (UID: \"4e5e1566-6708-4f6e-857e-ca1d6fe153ec\") " pod="openstack/neutron-cddb5dc7-2k4p9" Oct 02 12:25:23 crc kubenswrapper[4766]: I1002 12:25:23.166719 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e5e1566-6708-4f6e-857e-ca1d6fe153ec-config\") pod \"neutron-cddb5dc7-2k4p9\" (UID: \"4e5e1566-6708-4f6e-857e-ca1d6fe153ec\") " pod="openstack/neutron-cddb5dc7-2k4p9" Oct 02 12:25:23 crc kubenswrapper[4766]: I1002 12:25:23.166938 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lvx4\" (UniqueName: \"kubernetes.io/projected/4e5e1566-6708-4f6e-857e-ca1d6fe153ec-kube-api-access-2lvx4\") pod \"neutron-cddb5dc7-2k4p9\" (UID: \"4e5e1566-6708-4f6e-857e-ca1d6fe153ec\") " pod="openstack/neutron-cddb5dc7-2k4p9" Oct 02 12:25:23 crc kubenswrapper[4766]: I1002 12:25:23.166983 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5e1566-6708-4f6e-857e-ca1d6fe153ec-combined-ca-bundle\") pod \"neutron-cddb5dc7-2k4p9\" (UID: \"4e5e1566-6708-4f6e-857e-ca1d6fe153ec\") " pod="openstack/neutron-cddb5dc7-2k4p9" Oct 02 12:25:23 crc kubenswrapper[4766]: I1002 12:25:23.169405 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4e5e1566-6708-4f6e-857e-ca1d6fe153ec-httpd-config\") pod \"neutron-cddb5dc7-2k4p9\" (UID: \"4e5e1566-6708-4f6e-857e-ca1d6fe153ec\") " pod="openstack/neutron-cddb5dc7-2k4p9" Oct 02 12:25:23 crc kubenswrapper[4766]: I1002 12:25:23.170529 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e5e1566-6708-4f6e-857e-ca1d6fe153ec-config\") pod \"neutron-cddb5dc7-2k4p9\" (UID: \"4e5e1566-6708-4f6e-857e-ca1d6fe153ec\") " pod="openstack/neutron-cddb5dc7-2k4p9" Oct 02 12:25:23 crc kubenswrapper[4766]: I1002 12:25:23.172007 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5e1566-6708-4f6e-857e-ca1d6fe153ec-combined-ca-bundle\") pod \"neutron-cddb5dc7-2k4p9\" (UID: \"4e5e1566-6708-4f6e-857e-ca1d6fe153ec\") " pod="openstack/neutron-cddb5dc7-2k4p9" Oct 02 12:25:23 crc kubenswrapper[4766]: I1002 12:25:23.186225 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lvx4\" (UniqueName: \"kubernetes.io/projected/4e5e1566-6708-4f6e-857e-ca1d6fe153ec-kube-api-access-2lvx4\") pod \"neutron-cddb5dc7-2k4p9\" (UID: \"4e5e1566-6708-4f6e-857e-ca1d6fe153ec\") " pod="openstack/neutron-cddb5dc7-2k4p9" Oct 02 12:25:23 crc kubenswrapper[4766]: I1002 12:25:23.275251 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cddb5dc7-2k4p9" Oct 02 12:25:23 crc kubenswrapper[4766]: I1002 12:25:23.568773 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8488b8bd7c-vxgf6"] Oct 02 12:25:23 crc kubenswrapper[4766]: I1002 12:25:23.897529 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cddb5dc7-2k4p9"] Oct 02 12:25:24 crc kubenswrapper[4766]: I1002 12:25:24.424382 4766 generic.go:334] "Generic (PLEG): container finished" podID="88226e6f-8efa-4505-a5e5-d4cf947b2d86" containerID="0626ce2f9b774c40c74588a2fb53d5cabeed95b64e8f63d6933910a1c809b6d4" exitCode=0 Oct 02 12:25:24 crc kubenswrapper[4766]: I1002 12:25:24.424452 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" event={"ID":"88226e6f-8efa-4505-a5e5-d4cf947b2d86","Type":"ContainerDied","Data":"0626ce2f9b774c40c74588a2fb53d5cabeed95b64e8f63d6933910a1c809b6d4"} Oct 02 12:25:24 crc kubenswrapper[4766]: I1002 12:25:24.424540 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" event={"ID":"88226e6f-8efa-4505-a5e5-d4cf947b2d86","Type":"ContainerStarted","Data":"c46f81997ed5ee514a2056bd9176f0aaeda47bae2d9e2a869f72ef4279ceb71b"} Oct 02 12:25:24 crc kubenswrapper[4766]: I1002 12:25:24.426931 4766 generic.go:334] "Generic (PLEG): container finished" podID="b42d0006-4241-457f-9f8d-c0875475d86f" containerID="4b44411297cd6bc18f60b49f20668bf7daacc7ed419df6afc139c040f454f266" exitCode=0 Oct 02 12:25:24 crc kubenswrapper[4766]: I1002 12:25:24.427053 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnlrw" event={"ID":"b42d0006-4241-457f-9f8d-c0875475d86f","Type":"ContainerDied","Data":"4b44411297cd6bc18f60b49f20668bf7daacc7ed419df6afc139c040f454f266"} Oct 02 12:25:24 crc kubenswrapper[4766]: I1002 12:25:24.430290 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cddb5dc7-2k4p9" event={"ID":"4e5e1566-6708-4f6e-857e-ca1d6fe153ec","Type":"ContainerStarted","Data":"d86d143327407d4a88319cdd4180d7c90355170712f9a4f05ddafa90f4f149ce"} Oct 02 12:25:24 crc kubenswrapper[4766]: I1002 12:25:24.430313 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cddb5dc7-2k4p9" event={"ID":"4e5e1566-6708-4f6e-857e-ca1d6fe153ec","Type":"ContainerStarted","Data":"49652dc275e18efb66046ad581e4307137352ab902514a58861b268ca4746121"} Oct 02 12:25:24 crc kubenswrapper[4766]: I1002 12:25:24.430325 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cddb5dc7-2k4p9" event={"ID":"4e5e1566-6708-4f6e-857e-ca1d6fe153ec","Type":"ContainerStarted","Data":"2fbabbefe374606fb2e8edcfb7aef84972bb9118f38f715b2da5a2cd44925f2c"} Oct 02 12:25:24 crc kubenswrapper[4766]: I1002 12:25:24.430557 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-cddb5dc7-2k4p9" Oct 02 12:25:24 crc kubenswrapper[4766]: I1002 12:25:24.432011 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:25:24 crc kubenswrapper[4766]: I1002 12:25:24.432052 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:25:24 crc kubenswrapper[4766]: I1002 12:25:24.477413 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-cddb5dc7-2k4p9" podStartSLOduration=2.4773918520000002 podStartE2EDuration="2.477391852s" podCreationTimestamp="2025-10-02 12:25:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:25:24.471251405 +0000 UTC m=+5639.414122349" watchObservedRunningTime="2025-10-02 12:25:24.477391852 +0000 UTC m=+5639.420262796" Oct 02 12:25:25 crc kubenswrapper[4766]: I1002 12:25:25.441366 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" event={"ID":"88226e6f-8efa-4505-a5e5-d4cf947b2d86","Type":"ContainerStarted","Data":"1f0799f089758641ac1fac053e212d9f4c577b98ec8729f678d606a22cc63912"} Oct 02 12:25:25 crc kubenswrapper[4766]: I1002 12:25:25.441959 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:25 crc kubenswrapper[4766]: I1002 12:25:25.443991 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnlrw" event={"ID":"b42d0006-4241-457f-9f8d-c0875475d86f","Type":"ContainerStarted","Data":"a0b30e2e0a87afea00d26292d52522186dc972ebd28750b9e389eaf397076379"} Oct 02 12:25:25 crc kubenswrapper[4766]: I1002 12:25:25.470874 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" podStartSLOduration=3.470848999 podStartE2EDuration="3.470848999s" podCreationTimestamp="2025-10-02 12:25:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:25:25.462242554 +0000 UTC m=+5640.405113498" watchObservedRunningTime="2025-10-02 12:25:25.470848999 +0000 UTC m=+5640.413719943" Oct 02 12:25:25 crc kubenswrapper[4766]: I1002 12:25:25.487781 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vnlrw" podStartSLOduration=2.929470343 podStartE2EDuration="5.487759981s" podCreationTimestamp="2025-10-02 12:25:20 +0000 UTC" firstStartedPulling="2025-10-02 12:25:22.395845998 +0000 UTC m=+5637.338716942" lastFinishedPulling="2025-10-02 12:25:24.954135636 +0000 UTC m=+5639.897006580" observedRunningTime="2025-10-02 12:25:25.479202077 +0000 UTC m=+5640.422073041" watchObservedRunningTime="2025-10-02 12:25:25.487759981 +0000 UTC m=+5640.430630935" Oct 02 12:25:29 crc kubenswrapper[4766]: I1002 12:25:29.699003 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wc2r7" Oct 02 12:25:29 crc kubenswrapper[4766]: I1002 12:25:29.807678 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wc2r7"] Oct 02 12:25:29 crc kubenswrapper[4766]: I1002 12:25:29.845626 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rlmp7"] Oct 02 12:25:29 crc kubenswrapper[4766]: I1002 12:25:29.845936 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rlmp7" podUID="9aebef18-f17d-487d-a23e-472000a73d87" 
containerName="registry-server" containerID="cri-o://2cccc499a7ffb4a4140490cb2d830bef4f3329fc6097dbe6b03cb7e4feaf5fab" gracePeriod=2 Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.357250 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.418261 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72qxr\" (UniqueName: \"kubernetes.io/projected/9aebef18-f17d-487d-a23e-472000a73d87-kube-api-access-72qxr\") pod \"9aebef18-f17d-487d-a23e-472000a73d87\" (UID: \"9aebef18-f17d-487d-a23e-472000a73d87\") " Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.418493 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aebef18-f17d-487d-a23e-472000a73d87-catalog-content\") pod \"9aebef18-f17d-487d-a23e-472000a73d87\" (UID: \"9aebef18-f17d-487d-a23e-472000a73d87\") " Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.418658 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aebef18-f17d-487d-a23e-472000a73d87-utilities\") pod \"9aebef18-f17d-487d-a23e-472000a73d87\" (UID: \"9aebef18-f17d-487d-a23e-472000a73d87\") " Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.419453 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aebef18-f17d-487d-a23e-472000a73d87-utilities" (OuterVolumeSpecName: "utilities") pod "9aebef18-f17d-487d-a23e-472000a73d87" (UID: "9aebef18-f17d-487d-a23e-472000a73d87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.426858 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aebef18-f17d-487d-a23e-472000a73d87-kube-api-access-72qxr" (OuterVolumeSpecName: "kube-api-access-72qxr") pod "9aebef18-f17d-487d-a23e-472000a73d87" (UID: "9aebef18-f17d-487d-a23e-472000a73d87"). InnerVolumeSpecName "kube-api-access-72qxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.474217 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aebef18-f17d-487d-a23e-472000a73d87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9aebef18-f17d-487d-a23e-472000a73d87" (UID: "9aebef18-f17d-487d-a23e-472000a73d87"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.522915 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9aebef18-f17d-487d-a23e-472000a73d87-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.522951 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9aebef18-f17d-487d-a23e-472000a73d87-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.522962 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72qxr\" (UniqueName: \"kubernetes.io/projected/9aebef18-f17d-487d-a23e-472000a73d87-kube-api-access-72qxr\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.527456 4766 generic.go:334] "Generic (PLEG): container finished" podID="9aebef18-f17d-487d-a23e-472000a73d87" containerID="2cccc499a7ffb4a4140490cb2d830bef4f3329fc6097dbe6b03cb7e4feaf5fab" exitCode=0 Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.527515 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlmp7" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.527545 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlmp7" event={"ID":"9aebef18-f17d-487d-a23e-472000a73d87","Type":"ContainerDied","Data":"2cccc499a7ffb4a4140490cb2d830bef4f3329fc6097dbe6b03cb7e4feaf5fab"} Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.527592 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlmp7" event={"ID":"9aebef18-f17d-487d-a23e-472000a73d87","Type":"ContainerDied","Data":"a22322b8ddea43b123f6226e24a3243c27789e8d5728ecfe8133a90ac1e8ff56"} Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.527613 4766 scope.go:117] "RemoveContainer" containerID="2cccc499a7ffb4a4140490cb2d830bef4f3329fc6097dbe6b03cb7e4feaf5fab" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.559553 4766 scope.go:117] "RemoveContainer" containerID="4f2d9d5fc23de7af5795210d175f2c1577a2ad5fe12bdb33df34a7bae5ed2e87" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.571315 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rlmp7"] Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.593164 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rlmp7"] Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.597667 4766 scope.go:117] "RemoveContainer" containerID="ded4a0eaa40a7b949183881f080562b8b4673f8749cb8263cf6514b9bdbfc96a" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.644414 4766 scope.go:117] "RemoveContainer" containerID="2cccc499a7ffb4a4140490cb2d830bef4f3329fc6097dbe6b03cb7e4feaf5fab" Oct 02 12:25:30 crc kubenswrapper[4766]: E1002 12:25:30.644945 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cccc499a7ffb4a4140490cb2d830bef4f3329fc6097dbe6b03cb7e4feaf5fab\": container with ID starting with 2cccc499a7ffb4a4140490cb2d830bef4f3329fc6097dbe6b03cb7e4feaf5fab not found: ID does not exist" containerID="2cccc499a7ffb4a4140490cb2d830bef4f3329fc6097dbe6b03cb7e4feaf5fab" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.644995 
4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cccc499a7ffb4a4140490cb2d830bef4f3329fc6097dbe6b03cb7e4feaf5fab"} err="failed to get container status \"2cccc499a7ffb4a4140490cb2d830bef4f3329fc6097dbe6b03cb7e4feaf5fab\": rpc error: code = NotFound desc = could not find container \"2cccc499a7ffb4a4140490cb2d830bef4f3329fc6097dbe6b03cb7e4feaf5fab\": container with ID starting with 2cccc499a7ffb4a4140490cb2d830bef4f3329fc6097dbe6b03cb7e4feaf5fab not found: ID does not exist" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.645025 4766 scope.go:117] "RemoveContainer" containerID="4f2d9d5fc23de7af5795210d175f2c1577a2ad5fe12bdb33df34a7bae5ed2e87" Oct 02 12:25:30 crc kubenswrapper[4766]: E1002 12:25:30.645384 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f2d9d5fc23de7af5795210d175f2c1577a2ad5fe12bdb33df34a7bae5ed2e87\": container with ID starting with 4f2d9d5fc23de7af5795210d175f2c1577a2ad5fe12bdb33df34a7bae5ed2e87 not found: ID does not exist" containerID="4f2d9d5fc23de7af5795210d175f2c1577a2ad5fe12bdb33df34a7bae5ed2e87" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.645439 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f2d9d5fc23de7af5795210d175f2c1577a2ad5fe12bdb33df34a7bae5ed2e87"} err="failed to get container status \"4f2d9d5fc23de7af5795210d175f2c1577a2ad5fe12bdb33df34a7bae5ed2e87\": rpc error: code = NotFound desc = could not find container \"4f2d9d5fc23de7af5795210d175f2c1577a2ad5fe12bdb33df34a7bae5ed2e87\": container with ID starting with 4f2d9d5fc23de7af5795210d175f2c1577a2ad5fe12bdb33df34a7bae5ed2e87 not found: ID does not exist" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.645479 4766 scope.go:117] "RemoveContainer" containerID="ded4a0eaa40a7b949183881f080562b8b4673f8749cb8263cf6514b9bdbfc96a" Oct 02 12:25:30 crc kubenswrapper[4766]: E1002 12:25:30.645852 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ded4a0eaa40a7b949183881f080562b8b4673f8749cb8263cf6514b9bdbfc96a\": container with ID starting with ded4a0eaa40a7b949183881f080562b8b4673f8749cb8263cf6514b9bdbfc96a not found: ID does not exist" containerID="ded4a0eaa40a7b949183881f080562b8b4673f8749cb8263cf6514b9bdbfc96a" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.645878 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ded4a0eaa40a7b949183881f080562b8b4673f8749cb8263cf6514b9bdbfc96a"} err="failed to get container status \"ded4a0eaa40a7b949183881f080562b8b4673f8749cb8263cf6514b9bdbfc96a\": rpc error: code = NotFound desc = could not find container \"ded4a0eaa40a7b949183881f080562b8b4673f8749cb8263cf6514b9bdbfc96a\": container with ID starting with ded4a0eaa40a7b949183881f080562b8b4673f8749cb8263cf6514b9bdbfc96a not found: ID does not exist" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.808343 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.808400 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:30 crc kubenswrapper[4766]: I1002 12:25:30.854527 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:31 crc kubenswrapper[4766]: I1002 12:25:31.609826 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:31 crc kubenswrapper[4766]: I1002 12:25:31.894927 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aebef18-f17d-487d-a23e-472000a73d87" path="/var/lib/kubelet/pods/9aebef18-f17d-487d-a23e-472000a73d87/volumes" Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.074809 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.153493 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnlrw"] Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.172396 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54df6f9497-rfpjr"] Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.176001 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54df6f9497-rfpjr" podUID="5298592c-b60f-4916-9c04-ae15d5dd3236" containerName="dnsmasq-dns" containerID="cri-o://5fa36e67b39ce376323eb3772ee2fa1b2c34eb2037a906bdb806fdc25c7ed0d7" gracePeriod=10 Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.568241 4766 generic.go:334] "Generic (PLEG): container finished" podID="5298592c-b60f-4916-9c04-ae15d5dd3236" containerID="5fa36e67b39ce376323eb3772ee2fa1b2c34eb2037a906bdb806fdc25c7ed0d7" exitCode=0 Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.568733 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54df6f9497-rfpjr" event={"ID":"5298592c-b60f-4916-9c04-ae15d5dd3236","Type":"ContainerDied","Data":"5fa36e67b39ce376323eb3772ee2fa1b2c34eb2037a906bdb806fdc25c7ed0d7"} Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.570711 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vnlrw" podUID="b42d0006-4241-457f-9f8d-c0875475d86f" containerName="registry-server" containerID="cri-o://a0b30e2e0a87afea00d26292d52522186dc972ebd28750b9e389eaf397076379" gracePeriod=2 Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.756884 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54df6f9497-rfpjr" Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.904805 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-config\") pod \"5298592c-b60f-4916-9c04-ae15d5dd3236\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.905694 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7msf9\" (UniqueName: \"kubernetes.io/projected/5298592c-b60f-4916-9c04-ae15d5dd3236-kube-api-access-7msf9\") pod \"5298592c-b60f-4916-9c04-ae15d5dd3236\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.905755 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-ovsdbserver-sb\") pod \"5298592c-b60f-4916-9c04-ae15d5dd3236\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.905785 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-dns-svc\") pod \"5298592c-b60f-4916-9c04-ae15d5dd3236\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.905811 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-ovsdbserver-nb\") pod \"5298592c-b60f-4916-9c04-ae15d5dd3236\" (UID: \"5298592c-b60f-4916-9c04-ae15d5dd3236\") " Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.936054 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5298592c-b60f-4916-9c04-ae15d5dd3236-kube-api-access-7msf9" (OuterVolumeSpecName: "kube-api-access-7msf9") pod "5298592c-b60f-4916-9c04-ae15d5dd3236" (UID: "5298592c-b60f-4916-9c04-ae15d5dd3236"). InnerVolumeSpecName "kube-api-access-7msf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.976928 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5298592c-b60f-4916-9c04-ae15d5dd3236" (UID: "5298592c-b60f-4916-9c04-ae15d5dd3236"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.976944 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5298592c-b60f-4916-9c04-ae15d5dd3236" (UID: "5298592c-b60f-4916-9c04-ae15d5dd3236"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.996665 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-config" (OuterVolumeSpecName: "config") pod "5298592c-b60f-4916-9c04-ae15d5dd3236" (UID: "5298592c-b60f-4916-9c04-ae15d5dd3236"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:25:33 crc kubenswrapper[4766]: I1002 12:25:33.998684 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5298592c-b60f-4916-9c04-ae15d5dd3236" (UID: "5298592c-b60f-4916-9c04-ae15d5dd3236"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.007263 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.007297 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.007308 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.007320 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7msf9\" (UniqueName: \"kubernetes.io/projected/5298592c-b60f-4916-9c04-ae15d5dd3236-kube-api-access-7msf9\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.007329 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5298592c-b60f-4916-9c04-ae15d5dd3236-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.070717 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.210953 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b42d0006-4241-457f-9f8d-c0875475d86f-catalog-content\") pod \"b42d0006-4241-457f-9f8d-c0875475d86f\" (UID: \"b42d0006-4241-457f-9f8d-c0875475d86f\") " Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.211157 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9m86\" (UniqueName: \"kubernetes.io/projected/b42d0006-4241-457f-9f8d-c0875475d86f-kube-api-access-k9m86\") pod \"b42d0006-4241-457f-9f8d-c0875475d86f\" (UID: \"b42d0006-4241-457f-9f8d-c0875475d86f\") " Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.211418 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b42d0006-4241-457f-9f8d-c0875475d86f-utilities\") pod \"b42d0006-4241-457f-9f8d-c0875475d86f\" (UID: \"b42d0006-4241-457f-9f8d-c0875475d86f\") " Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.212812 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b42d0006-4241-457f-9f8d-c0875475d86f-utilities" (OuterVolumeSpecName: "utilities") pod "b42d0006-4241-457f-9f8d-c0875475d86f" (UID: "b42d0006-4241-457f-9f8d-c0875475d86f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.216466 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b42d0006-4241-457f-9f8d-c0875475d86f-kube-api-access-k9m86" (OuterVolumeSpecName: "kube-api-access-k9m86") pod "b42d0006-4241-457f-9f8d-c0875475d86f" (UID: "b42d0006-4241-457f-9f8d-c0875475d86f"). InnerVolumeSpecName "kube-api-access-k9m86". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.227934 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b42d0006-4241-457f-9f8d-c0875475d86f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b42d0006-4241-457f-9f8d-c0875475d86f" (UID: "b42d0006-4241-457f-9f8d-c0875475d86f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.314613 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b42d0006-4241-457f-9f8d-c0875475d86f-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.314685 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b42d0006-4241-457f-9f8d-c0875475d86f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.314706 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9m86\" (UniqueName: \"kubernetes.io/projected/b42d0006-4241-457f-9f8d-c0875475d86f-kube-api-access-k9m86\") on node \"crc\" DevicePath \"\"" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.580398 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54df6f9497-rfpjr" event={"ID":"5298592c-b60f-4916-9c04-ae15d5dd3236","Type":"ContainerDied","Data":"ef3dd720b5c2b69ee040be67772cb846592912a9ae6bcb956775820b6bb7a932"} Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.580457 4766 scope.go:117] "RemoveContainer" containerID="5fa36e67b39ce376323eb3772ee2fa1b2c34eb2037a906bdb806fdc25c7ed0d7" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.580606 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54df6f9497-rfpjr" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.586353 4766 generic.go:334] "Generic (PLEG): container finished" podID="b42d0006-4241-457f-9f8d-c0875475d86f" containerID="a0b30e2e0a87afea00d26292d52522186dc972ebd28750b9e389eaf397076379" exitCode=0 Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.586409 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnlrw" event={"ID":"b42d0006-4241-457f-9f8d-c0875475d86f","Type":"ContainerDied","Data":"a0b30e2e0a87afea00d26292d52522186dc972ebd28750b9e389eaf397076379"} Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.586419 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnlrw" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.586442 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnlrw" event={"ID":"b42d0006-4241-457f-9f8d-c0875475d86f","Type":"ContainerDied","Data":"8d8f2286f0286120f11bde8b4027c4958a51c423c7785da8852872bf1487d368"} Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.624978 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54df6f9497-rfpjr"] Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.626373 4766 scope.go:117] "RemoveContainer" containerID="b92559a1b95fb8c7a39e870f97e5dcaaa85f3185ed3d3b62be310f66fff4c9a9" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.635619 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54df6f9497-rfpjr"] Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.651540 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnlrw"] Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.652659 4766 scope.go:117] "RemoveContainer" containerID="a0b30e2e0a87afea00d26292d52522186dc972ebd28750b9e389eaf397076379" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.658871 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnlrw"] Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.703267 4766 scope.go:117] "RemoveContainer" containerID="4b44411297cd6bc18f60b49f20668bf7daacc7ed419df6afc139c040f454f266" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.722829 4766 scope.go:117] "RemoveContainer" containerID="4572179a07b595b7ebff9496ffb66de8aedd5d498ab6805222f04cdd1d6db4ee" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.759730 4766 scope.go:117] "RemoveContainer" containerID="a0b30e2e0a87afea00d26292d52522186dc972ebd28750b9e389eaf397076379" Oct 02 12:25:34 crc kubenswrapper[4766]: E1002 12:25:34.761759 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0b30e2e0a87afea00d26292d52522186dc972ebd28750b9e389eaf397076379\": container with ID starting with a0b30e2e0a87afea00d26292d52522186dc972ebd28750b9e389eaf397076379 not found: ID does not exist" containerID="a0b30e2e0a87afea00d26292d52522186dc972ebd28750b9e389eaf397076379" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.761828 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0b30e2e0a87afea00d26292d52522186dc972ebd28750b9e389eaf397076379"} err="failed to get container status \"a0b30e2e0a87afea00d26292d52522186dc972ebd28750b9e389eaf397076379\": rpc error: code = NotFound desc = could not find container \"a0b30e2e0a87afea00d26292d52522186dc972ebd28750b9e389eaf397076379\": container with ID starting with a0b30e2e0a87afea00d26292d52522186dc972ebd28750b9e389eaf397076379 not found: ID does not exist" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.761866 4766 scope.go:117] "RemoveContainer" containerID="4b44411297cd6bc18f60b49f20668bf7daacc7ed419df6afc139c040f454f266" Oct 02 12:25:34 crc kubenswrapper[4766]: E1002 12:25:34.762424 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b44411297cd6bc18f60b49f20668bf7daacc7ed419df6afc139c040f454f266\": container with ID starting with 
4b44411297cd6bc18f60b49f20668bf7daacc7ed419df6afc139c040f454f266 not found: ID does not exist" containerID="4b44411297cd6bc18f60b49f20668bf7daacc7ed419df6afc139c040f454f266" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.762481 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b44411297cd6bc18f60b49f20668bf7daacc7ed419df6afc139c040f454f266"} err="failed to get container status \"4b44411297cd6bc18f60b49f20668bf7daacc7ed419df6afc139c040f454f266\": rpc error: code = NotFound desc = could not find container \"4b44411297cd6bc18f60b49f20668bf7daacc7ed419df6afc139c040f454f266\": container with ID starting with 4b44411297cd6bc18f60b49f20668bf7daacc7ed419df6afc139c040f454f266 not found: ID does not exist" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.762558 4766 scope.go:117] "RemoveContainer" containerID="4572179a07b595b7ebff9496ffb66de8aedd5d498ab6805222f04cdd1d6db4ee" Oct 02 12:25:34 crc kubenswrapper[4766]: E1002 12:25:34.763184 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4572179a07b595b7ebff9496ffb66de8aedd5d498ab6805222f04cdd1d6db4ee\": container with ID starting with 4572179a07b595b7ebff9496ffb66de8aedd5d498ab6805222f04cdd1d6db4ee not found: ID does not exist" containerID="4572179a07b595b7ebff9496ffb66de8aedd5d498ab6805222f04cdd1d6db4ee" Oct 02 12:25:34 crc kubenswrapper[4766]: I1002 12:25:34.763216 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4572179a07b595b7ebff9496ffb66de8aedd5d498ab6805222f04cdd1d6db4ee"} err="failed to get container status \"4572179a07b595b7ebff9496ffb66de8aedd5d498ab6805222f04cdd1d6db4ee\": rpc error: code = NotFound desc = could not find container \"4572179a07b595b7ebff9496ffb66de8aedd5d498ab6805222f04cdd1d6db4ee\": container with ID starting with 4572179a07b595b7ebff9496ffb66de8aedd5d498ab6805222f04cdd1d6db4ee not found: ID does not exist" Oct 02 12:25:35 crc kubenswrapper[4766]: I1002 12:25:35.891548 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5298592c-b60f-4916-9c04-ae15d5dd3236" path="/var/lib/kubelet/pods/5298592c-b60f-4916-9c04-ae15d5dd3236/volumes" Oct 02 12:25:35 crc kubenswrapper[4766]: I1002 12:25:35.892749 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b42d0006-4241-457f-9f8d-c0875475d86f" path="/var/lib/kubelet/pods/b42d0006-4241-457f-9f8d-c0875475d86f/volumes" Oct 02 12:25:38 crc kubenswrapper[4766]: I1002 12:25:38.532778 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54df6f9497-rfpjr" podUID="5298592c-b60f-4916-9c04-ae15d5dd3236" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.27:5353: i/o timeout" Oct 02 12:25:53 crc kubenswrapper[4766]: I1002 12:25:53.294577 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-cddb5dc7-2k4p9" Oct 02 12:25:54 crc kubenswrapper[4766]: I1002 12:25:54.431972 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:25:54 crc kubenswrapper[4766]: I1002 12:25:54.432310 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" 
podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:25:54 crc kubenswrapper[4766]: I1002 12:25:54.432359 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 12:25:54 crc kubenswrapper[4766]: I1002 12:25:54.432991 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:25:54 crc kubenswrapper[4766]: I1002 12:25:54.433044 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" gracePeriod=600 Oct 02 12:25:54 crc kubenswrapper[4766]: E1002 12:25:54.568734 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:25:54 crc kubenswrapper[4766]: I1002 12:25:54.760977 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" exitCode=0 Oct 02 12:25:54 crc kubenswrapper[4766]: I1002 12:25:54.761023 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984"} Oct 02 12:25:54 crc kubenswrapper[4766]: I1002 12:25:54.761062 4766 scope.go:117] "RemoveContainer" containerID="7be1eb90dbe4a2beb104498de2e75466d69e548c7f34b0f0f5bbe74fe4681dd4" Oct 02 12:25:54 crc kubenswrapper[4766]: I1002 12:25:54.762037 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:25:54 crc kubenswrapper[4766]: E1002 12:25:54.762317 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.030944 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-96qjq"] Oct 02 12:26:03 crc kubenswrapper[4766]: E1002 12:26:03.031808 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5298592c-b60f-4916-9c04-ae15d5dd3236" containerName="init" Oct 02 12:26:03 crc 
kubenswrapper[4766]: I1002 12:26:03.031825 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5298592c-b60f-4916-9c04-ae15d5dd3236" containerName="init" Oct 02 12:26:03 crc kubenswrapper[4766]: E1002 12:26:03.031847 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42d0006-4241-457f-9f8d-c0875475d86f" containerName="extract-content" Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.031853 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42d0006-4241-457f-9f8d-c0875475d86f" containerName="extract-content" Oct 02 12:26:03 crc kubenswrapper[4766]: E1002 12:26:03.031861 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5298592c-b60f-4916-9c04-ae15d5dd3236" containerName="dnsmasq-dns" Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.031868 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5298592c-b60f-4916-9c04-ae15d5dd3236" containerName="dnsmasq-dns" Oct 02 12:26:03 crc kubenswrapper[4766]: E1002 12:26:03.031883 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42d0006-4241-457f-9f8d-c0875475d86f" containerName="extract-utilities" Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.031888 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42d0006-4241-457f-9f8d-c0875475d86f" containerName="extract-utilities" Oct 02 12:26:03 crc kubenswrapper[4766]: E1002 12:26:03.031903 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aebef18-f17d-487d-a23e-472000a73d87" containerName="registry-server" Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.031908 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aebef18-f17d-487d-a23e-472000a73d87" containerName="registry-server" Oct 02 12:26:03 crc kubenswrapper[4766]: E1002 12:26:03.031917 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aebef18-f17d-487d-a23e-472000a73d87" containerName="extract-content" Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.031923 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aebef18-f17d-487d-a23e-472000a73d87" containerName="extract-content" Oct 02 12:26:03 crc kubenswrapper[4766]: E1002 12:26:03.031936 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aebef18-f17d-487d-a23e-472000a73d87" containerName="extract-utilities" Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.031942 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aebef18-f17d-487d-a23e-472000a73d87" containerName="extract-utilities" Oct 02 12:26:03 crc kubenswrapper[4766]: E1002 12:26:03.031958 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b42d0006-4241-457f-9f8d-c0875475d86f" containerName="registry-server" Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.031964 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b42d0006-4241-457f-9f8d-c0875475d86f" containerName="registry-server" Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.032119 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5298592c-b60f-4916-9c04-ae15d5dd3236" containerName="dnsmasq-dns" Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.032129 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b42d0006-4241-457f-9f8d-c0875475d86f" containerName="registry-server" Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.032143 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aebef18-f17d-487d-a23e-472000a73d87" containerName="registry-server" Oct 02 
12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.032873 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-96qjq" Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.047338 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-96qjq"] Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.135879 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2mm7\" (UniqueName: \"kubernetes.io/projected/690f940e-9bfd-4423-8f76-3e3d5c55347d-kube-api-access-h2mm7\") pod \"glance-db-create-96qjq\" (UID: \"690f940e-9bfd-4423-8f76-3e3d5c55347d\") " pod="openstack/glance-db-create-96qjq" Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.238354 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2mm7\" (UniqueName: \"kubernetes.io/projected/690f940e-9bfd-4423-8f76-3e3d5c55347d-kube-api-access-h2mm7\") pod \"glance-db-create-96qjq\" (UID: \"690f940e-9bfd-4423-8f76-3e3d5c55347d\") " pod="openstack/glance-db-create-96qjq" Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.258026 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2mm7\" (UniqueName: \"kubernetes.io/projected/690f940e-9bfd-4423-8f76-3e3d5c55347d-kube-api-access-h2mm7\") pod \"glance-db-create-96qjq\" (UID: \"690f940e-9bfd-4423-8f76-3e3d5c55347d\") " pod="openstack/glance-db-create-96qjq" Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.357660 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-96qjq" Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.838965 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-96qjq"] Oct 02 12:26:03 crc kubenswrapper[4766]: I1002 12:26:03.858358 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-96qjq" event={"ID":"690f940e-9bfd-4423-8f76-3e3d5c55347d","Type":"ContainerStarted","Data":"b4076073ac36dfee2589800382096911d33bdfa1036f1cd32e6c1e0e16d93787"} Oct 02 12:26:04 crc kubenswrapper[4766]: I1002 12:26:04.867097 4766 generic.go:334] "Generic (PLEG): container finished" podID="690f940e-9bfd-4423-8f76-3e3d5c55347d" containerID="d5d5aff3b8057332d11063cbf80d0d045f4a8f2111b3026cf7cefdd510f528ac" exitCode=0 Oct 02 12:26:04 crc kubenswrapper[4766]: I1002 12:26:04.867201 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-96qjq" event={"ID":"690f940e-9bfd-4423-8f76-3e3d5c55347d","Type":"ContainerDied","Data":"d5d5aff3b8057332d11063cbf80d0d045f4a8f2111b3026cf7cefdd510f528ac"} Oct 02 12:26:06 crc kubenswrapper[4766]: I1002 12:26:06.260493 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-96qjq" Oct 02 12:26:06 crc kubenswrapper[4766]: I1002 12:26:06.404720 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2mm7\" (UniqueName: \"kubernetes.io/projected/690f940e-9bfd-4423-8f76-3e3d5c55347d-kube-api-access-h2mm7\") pod \"690f940e-9bfd-4423-8f76-3e3d5c55347d\" (UID: \"690f940e-9bfd-4423-8f76-3e3d5c55347d\") " Oct 02 12:26:06 crc kubenswrapper[4766]: I1002 12:26:06.413883 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/690f940e-9bfd-4423-8f76-3e3d5c55347d-kube-api-access-h2mm7" (OuterVolumeSpecName: "kube-api-access-h2mm7") pod "690f940e-9bfd-4423-8f76-3e3d5c55347d" (UID: "690f940e-9bfd-4423-8f76-3e3d5c55347d"). InnerVolumeSpecName "kube-api-access-h2mm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:26:06 crc kubenswrapper[4766]: I1002 12:26:06.507703 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2mm7\" (UniqueName: \"kubernetes.io/projected/690f940e-9bfd-4423-8f76-3e3d5c55347d-kube-api-access-h2mm7\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:06 crc kubenswrapper[4766]: I1002 12:26:06.894336 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-96qjq" event={"ID":"690f940e-9bfd-4423-8f76-3e3d5c55347d","Type":"ContainerDied","Data":"b4076073ac36dfee2589800382096911d33bdfa1036f1cd32e6c1e0e16d93787"} Oct 02 12:26:06 crc kubenswrapper[4766]: I1002 12:26:06.894376 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4076073ac36dfee2589800382096911d33bdfa1036f1cd32e6c1e0e16d93787" Oct 02 12:26:06 crc kubenswrapper[4766]: I1002 12:26:06.894384 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-96qjq" Oct 02 12:26:09 crc kubenswrapper[4766]: I1002 12:26:09.881201 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:26:09 crc kubenswrapper[4766]: E1002 12:26:09.881690 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:26:13 crc kubenswrapper[4766]: I1002 12:26:13.119156 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-e3d4-account-create-fknc7"] Oct 02 12:26:13 crc kubenswrapper[4766]: E1002 12:26:13.124049 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="690f940e-9bfd-4423-8f76-3e3d5c55347d" containerName="mariadb-database-create" Oct 02 12:26:13 crc kubenswrapper[4766]: I1002 12:26:13.124076 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="690f940e-9bfd-4423-8f76-3e3d5c55347d" containerName="mariadb-database-create" Oct 02 12:26:13 crc kubenswrapper[4766]: I1002 12:26:13.124908 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="690f940e-9bfd-4423-8f76-3e3d5c55347d" containerName="mariadb-database-create" Oct 02 12:26:13 crc kubenswrapper[4766]: I1002 12:26:13.125876 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e3d4-account-create-fknc7" Oct 02 12:26:13 crc kubenswrapper[4766]: I1002 12:26:13.128318 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 02 12:26:13 crc kubenswrapper[4766]: I1002 12:26:13.128564 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e3d4-account-create-fknc7"] Oct 02 12:26:13 crc kubenswrapper[4766]: I1002 12:26:13.239938 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk4l2\" (UniqueName: \"kubernetes.io/projected/15e20b4c-4af1-42f7-8391-8b187672bb16-kube-api-access-kk4l2\") pod \"glance-e3d4-account-create-fknc7\" (UID: \"15e20b4c-4af1-42f7-8391-8b187672bb16\") " pod="openstack/glance-e3d4-account-create-fknc7" Oct 02 12:26:13 crc kubenswrapper[4766]: I1002 12:26:13.342909 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk4l2\" (UniqueName: \"kubernetes.io/projected/15e20b4c-4af1-42f7-8391-8b187672bb16-kube-api-access-kk4l2\") pod \"glance-e3d4-account-create-fknc7\" (UID: \"15e20b4c-4af1-42f7-8391-8b187672bb16\") " pod="openstack/glance-e3d4-account-create-fknc7" Oct 02 12:26:13 crc kubenswrapper[4766]: I1002 12:26:13.361816 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk4l2\" (UniqueName: \"kubernetes.io/projected/15e20b4c-4af1-42f7-8391-8b187672bb16-kube-api-access-kk4l2\") pod \"glance-e3d4-account-create-fknc7\" (UID: \"15e20b4c-4af1-42f7-8391-8b187672bb16\") " pod="openstack/glance-e3d4-account-create-fknc7" Oct 02 12:26:13 crc kubenswrapper[4766]: I1002 12:26:13.449326 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e3d4-account-create-fknc7" Oct 02 12:26:13 crc kubenswrapper[4766]: I1002 12:26:13.893586 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e3d4-account-create-fknc7"] Oct 02 12:26:13 crc kubenswrapper[4766]: I1002 12:26:13.959314 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e3d4-account-create-fknc7" event={"ID":"15e20b4c-4af1-42f7-8391-8b187672bb16","Type":"ContainerStarted","Data":"2a94f4aa7afc2f9d5fc1cff1af101507f4d25fa866d94e96ead85ffa90bacabe"} Oct 02 12:26:14 crc kubenswrapper[4766]: I1002 12:26:14.972191 4766 generic.go:334] "Generic (PLEG): container finished" podID="15e20b4c-4af1-42f7-8391-8b187672bb16" containerID="042375723cc3113382d4f430b4bc84948ad966137380389ed9a04aecba9152f4" exitCode=0 Oct 02 12:26:14 crc kubenswrapper[4766]: I1002 12:26:14.972433 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e3d4-account-create-fknc7" event={"ID":"15e20b4c-4af1-42f7-8391-8b187672bb16","Type":"ContainerDied","Data":"042375723cc3113382d4f430b4bc84948ad966137380389ed9a04aecba9152f4"} Oct 02 12:26:16 crc kubenswrapper[4766]: I1002 12:26:16.298279 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e3d4-account-create-fknc7" Oct 02 12:26:16 crc kubenswrapper[4766]: I1002 12:26:16.408004 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk4l2\" (UniqueName: \"kubernetes.io/projected/15e20b4c-4af1-42f7-8391-8b187672bb16-kube-api-access-kk4l2\") pod \"15e20b4c-4af1-42f7-8391-8b187672bb16\" (UID: \"15e20b4c-4af1-42f7-8391-8b187672bb16\") " Oct 02 12:26:16 crc kubenswrapper[4766]: I1002 12:26:16.413701 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15e20b4c-4af1-42f7-8391-8b187672bb16-kube-api-access-kk4l2" (OuterVolumeSpecName: "kube-api-access-kk4l2") pod "15e20b4c-4af1-42f7-8391-8b187672bb16" (UID: "15e20b4c-4af1-42f7-8391-8b187672bb16"). InnerVolumeSpecName "kube-api-access-kk4l2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:26:16 crc kubenswrapper[4766]: I1002 12:26:16.509661 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk4l2\" (UniqueName: \"kubernetes.io/projected/15e20b4c-4af1-42f7-8391-8b187672bb16-kube-api-access-kk4l2\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:16 crc kubenswrapper[4766]: I1002 12:26:16.998216 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e3d4-account-create-fknc7" event={"ID":"15e20b4c-4af1-42f7-8391-8b187672bb16","Type":"ContainerDied","Data":"2a94f4aa7afc2f9d5fc1cff1af101507f4d25fa866d94e96ead85ffa90bacabe"} Oct 02 12:26:16 crc kubenswrapper[4766]: I1002 12:26:16.998271 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a94f4aa7afc2f9d5fc1cff1af101507f4d25fa866d94e96ead85ffa90bacabe" Oct 02 12:26:16 crc kubenswrapper[4766]: I1002 12:26:16.998269 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e3d4-account-create-fknc7" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.251963 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-zhtb7"] Oct 02 12:26:18 crc kubenswrapper[4766]: E1002 12:26:18.252402 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e20b4c-4af1-42f7-8391-8b187672bb16" containerName="mariadb-account-create" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.252419 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e20b4c-4af1-42f7-8391-8b187672bb16" containerName="mariadb-account-create" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.252628 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e20b4c-4af1-42f7-8391-8b187672bb16" containerName="mariadb-account-create" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.253258 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zhtb7" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.255440 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kk7bf" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.256570 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.266434 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zhtb7"] Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.341413 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-config-data\") pod \"glance-db-sync-zhtb7\" (UID: \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\") " pod="openstack/glance-db-sync-zhtb7" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.341558 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-combined-ca-bundle\") pod \"glance-db-sync-zhtb7\" (UID: \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\") " pod="openstack/glance-db-sync-zhtb7" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.341602 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jwsg\" (UniqueName: \"kubernetes.io/projected/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-kube-api-access-4jwsg\") pod \"glance-db-sync-zhtb7\" (UID: \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\") " pod="openstack/glance-db-sync-zhtb7" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.341638 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-db-sync-config-data\") pod \"glance-db-sync-zhtb7\" (UID: \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\") " pod="openstack/glance-db-sync-zhtb7" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.443336 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-combined-ca-bundle\") pod \"glance-db-sync-zhtb7\" (UID: \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\") " pod="openstack/glance-db-sync-zhtb7" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.443425 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jwsg\" (UniqueName: \"kubernetes.io/projected/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-kube-api-access-4jwsg\") pod \"glance-db-sync-zhtb7\" (UID: \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\") " pod="openstack/glance-db-sync-zhtb7" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.443446 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-db-sync-config-data\") pod \"glance-db-sync-zhtb7\" (UID: \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\") " pod="openstack/glance-db-sync-zhtb7" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.443560 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-config-data\") pod 
\"glance-db-sync-zhtb7\" (UID: \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\") " pod="openstack/glance-db-sync-zhtb7" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.448254 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-combined-ca-bundle\") pod \"glance-db-sync-zhtb7\" (UID: \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\") " pod="openstack/glance-db-sync-zhtb7" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.448261 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-config-data\") pod \"glance-db-sync-zhtb7\" (UID: \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\") " pod="openstack/glance-db-sync-zhtb7" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.460181 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-db-sync-config-data\") pod \"glance-db-sync-zhtb7\" (UID: \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\") " pod="openstack/glance-db-sync-zhtb7" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.465335 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jwsg\" (UniqueName: \"kubernetes.io/projected/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-kube-api-access-4jwsg\") pod \"glance-db-sync-zhtb7\" (UID: \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\") " pod="openstack/glance-db-sync-zhtb7" Oct 02 12:26:18 crc kubenswrapper[4766]: I1002 12:26:18.569578 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zhtb7" Oct 02 12:26:19 crc kubenswrapper[4766]: I1002 12:26:19.108879 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zhtb7"] Oct 02 12:26:20 crc kubenswrapper[4766]: I1002 12:26:20.037118 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zhtb7" event={"ID":"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7","Type":"ContainerStarted","Data":"bcce8611cd6371522e4b9a78afe114f325e07d3d293640d426de5d0afe64e439"} Oct 02 12:26:20 crc kubenswrapper[4766]: I1002 12:26:20.037711 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zhtb7" event={"ID":"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7","Type":"ContainerStarted","Data":"d19a00ac282c2e942ad9f6300c5de6240722dcec979f5af3e792b445b87a8aa4"} Oct 02 12:26:20 crc kubenswrapper[4766]: I1002 12:26:20.069079 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-zhtb7" podStartSLOduration=2.069051957 podStartE2EDuration="2.069051957s" podCreationTimestamp="2025-10-02 12:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:26:20.058994015 +0000 UTC m=+5695.001864969" watchObservedRunningTime="2025-10-02 12:26:20.069051957 +0000 UTC m=+5695.011922901" Oct 02 12:26:23 crc kubenswrapper[4766]: I1002 12:26:23.060912 4766 generic.go:334] "Generic (PLEG): container finished" podID="dd100b70-ea16-47b3-ac1b-6ec049ff4ee7" containerID="bcce8611cd6371522e4b9a78afe114f325e07d3d293640d426de5d0afe64e439" exitCode=0 Oct 02 12:26:23 crc kubenswrapper[4766]: I1002 12:26:23.061013 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zhtb7" 
event={"ID":"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7","Type":"ContainerDied","Data":"bcce8611cd6371522e4b9a78afe114f325e07d3d293640d426de5d0afe64e439"} Oct 02 12:26:24 crc kubenswrapper[4766]: I1002 12:26:24.464023 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zhtb7" Oct 02 12:26:24 crc kubenswrapper[4766]: I1002 12:26:24.563173 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-db-sync-config-data\") pod \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\" (UID: \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\") " Oct 02 12:26:24 crc kubenswrapper[4766]: I1002 12:26:24.563377 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jwsg\" (UniqueName: \"kubernetes.io/projected/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-kube-api-access-4jwsg\") pod \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\" (UID: \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\") " Oct 02 12:26:24 crc kubenswrapper[4766]: I1002 12:26:24.563436 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-config-data\") pod \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\" (UID: \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\") " Oct 02 12:26:24 crc kubenswrapper[4766]: I1002 12:26:24.563464 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-combined-ca-bundle\") pod \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\" (UID: \"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7\") " Oct 02 12:26:24 crc kubenswrapper[4766]: I1002 12:26:24.568816 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-kube-api-access-4jwsg" (OuterVolumeSpecName: "kube-api-access-4jwsg") pod "dd100b70-ea16-47b3-ac1b-6ec049ff4ee7" (UID: "dd100b70-ea16-47b3-ac1b-6ec049ff4ee7"). InnerVolumeSpecName "kube-api-access-4jwsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:26:24 crc kubenswrapper[4766]: I1002 12:26:24.570782 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dd100b70-ea16-47b3-ac1b-6ec049ff4ee7" (UID: "dd100b70-ea16-47b3-ac1b-6ec049ff4ee7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:26:24 crc kubenswrapper[4766]: I1002 12:26:24.586688 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd100b70-ea16-47b3-ac1b-6ec049ff4ee7" (UID: "dd100b70-ea16-47b3-ac1b-6ec049ff4ee7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:26:24 crc kubenswrapper[4766]: I1002 12:26:24.603708 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-config-data" (OuterVolumeSpecName: "config-data") pod "dd100b70-ea16-47b3-ac1b-6ec049ff4ee7" (UID: "dd100b70-ea16-47b3-ac1b-6ec049ff4ee7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:26:24 crc kubenswrapper[4766]: I1002 12:26:24.666204 4766 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:24 crc kubenswrapper[4766]: I1002 12:26:24.666257 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jwsg\" (UniqueName: \"kubernetes.io/projected/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-kube-api-access-4jwsg\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:24 crc kubenswrapper[4766]: I1002 12:26:24.666271 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:24 crc kubenswrapper[4766]: I1002 12:26:24.666279 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:24 crc kubenswrapper[4766]: I1002 12:26:24.881539 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:26:24 crc kubenswrapper[4766]: E1002 12:26:24.881913 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.081700 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zhtb7" event={"ID":"dd100b70-ea16-47b3-ac1b-6ec049ff4ee7","Type":"ContainerDied","Data":"d19a00ac282c2e942ad9f6300c5de6240722dcec979f5af3e792b445b87a8aa4"} Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.081987 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d19a00ac282c2e942ad9f6300c5de6240722dcec979f5af3e792b445b87a8aa4" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.081751 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zhtb7" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.510213 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cf567857-rsl2t"] Oct 02 12:26:25 crc kubenswrapper[4766]: E1002 12:26:25.510588 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd100b70-ea16-47b3-ac1b-6ec049ff4ee7" containerName="glance-db-sync" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.510601 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd100b70-ea16-47b3-ac1b-6ec049ff4ee7" containerName="glance-db-sync" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.510764 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd100b70-ea16-47b3-ac1b-6ec049ff4ee7" containerName="glance-db-sync" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.511814 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.527889 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.529662 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.536715 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.536856 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.536901 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kk7bf" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.537267 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.559562 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cf567857-rsl2t"] Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.582656 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-dns-svc\") pod \"dnsmasq-dns-6cf567857-rsl2t\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.582720 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc5kb\" (UniqueName: \"kubernetes.io/projected/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-kube-api-access-sc5kb\") pod \"dnsmasq-dns-6cf567857-rsl2t\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.582782 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-ovsdbserver-nb\") pod \"dnsmasq-dns-6cf567857-rsl2t\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.582799 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-config\") pod \"dnsmasq-dns-6cf567857-rsl2t\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.582826 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-ovsdbserver-sb\") pod \"dnsmasq-dns-6cf567857-rsl2t\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.612332 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.635268 4766 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-internal-api-0"] Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.646149 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.649973 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.684593 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.684375 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.685519 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-dns-svc\") pod \"dnsmasq-dns-6cf567857-rsl2t\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.685562 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-ceph\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.685588 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wmzz\" (UniqueName: \"kubernetes.io/projected/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-kube-api-access-6wmzz\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.685605 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-config-data\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.685638 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-scripts\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.685672 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc5kb\" (UniqueName: \"kubernetes.io/projected/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-kube-api-access-sc5kb\") pod \"dnsmasq-dns-6cf567857-rsl2t\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.685744 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-logs\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.685809 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-ovsdbserver-nb\") pod \"dnsmasq-dns-6cf567857-rsl2t\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.685831 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-config\") pod \"dnsmasq-dns-6cf567857-rsl2t\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.685884 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-ovsdbserver-sb\") pod \"dnsmasq-dns-6cf567857-rsl2t\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.685950 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.686860 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-dns-svc\") pod \"dnsmasq-dns-6cf567857-rsl2t\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.687385 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-ovsdbserver-nb\") pod \"dnsmasq-dns-6cf567857-rsl2t\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.687929 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-ovsdbserver-sb\") pod \"dnsmasq-dns-6cf567857-rsl2t\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.688236 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-config\") pod \"dnsmasq-dns-6cf567857-rsl2t\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.728415 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc5kb\" (UniqueName: \"kubernetes.io/projected/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-kube-api-access-sc5kb\") pod \"dnsmasq-dns-6cf567857-rsl2t\" (UID: 
\"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.787988 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-ceph\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.788331 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wmzz\" (UniqueName: \"kubernetes.io/projected/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-kube-api-access-6wmzz\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.788414 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-config-data\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.788528 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsknw\" (UniqueName: \"kubernetes.io/projected/effd14d9-eba5-420a-a5da-97fb53e5f78a-kube-api-access-rsknw\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.788638 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.788730 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-scripts\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.788892 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-logs\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.789018 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/effd14d9-eba5-420a-a5da-97fb53e5f78a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.789152 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.789264 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/effd14d9-eba5-420a-a5da-97fb53e5f78a-logs\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.789390 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/effd14d9-eba5-420a-a5da-97fb53e5f78a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.789520 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.789678 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.789785 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.789749 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-logs\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.790062 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.791560 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-ceph\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.801404 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-scripts\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc 
kubenswrapper[4766]: I1002 12:26:25.801761 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-config-data\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.802376 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.808228 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wmzz\" (UniqueName: \"kubernetes.io/projected/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-kube-api-access-6wmzz\") pod \"glance-default-external-api-0\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.845780 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.855156 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.891016 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsknw\" (UniqueName: \"kubernetes.io/projected/effd14d9-eba5-420a-a5da-97fb53e5f78a-kube-api-access-rsknw\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.891071 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.891154 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/effd14d9-eba5-420a-a5da-97fb53e5f78a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.891201 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.891236 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/effd14d9-eba5-420a-a5da-97fb53e5f78a-logs\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.891279 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/effd14d9-eba5-420a-a5da-97fb53e5f78a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.891339 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.892490 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/effd14d9-eba5-420a-a5da-97fb53e5f78a-logs\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.892651 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/effd14d9-eba5-420a-a5da-97fb53e5f78a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.897932 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.899210 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.905197 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/effd14d9-eba5-420a-a5da-97fb53e5f78a-ceph\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.911945 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsknw\" (UniqueName: \"kubernetes.io/projected/effd14d9-eba5-420a-a5da-97fb53e5f78a-kube-api-access-rsknw\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.912835 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.917855 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:25 crc kubenswrapper[4766]: I1002 12:26:25.968195 4766 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:26:26 crc kubenswrapper[4766]: I1002 12:26:26.400668 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cf567857-rsl2t"] Oct 02 12:26:26 crc kubenswrapper[4766]: I1002 12:26:26.560309 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:26:26 crc kubenswrapper[4766]: W1002 12:26:26.565280 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32bc8e87_bb2f_4be7_8ec5_61117163e3f7.slice/crio-de9de29daec547c5f6ec09ef4fae20ad7e73af38f932ab74a69d60a58846fc6c WatchSource:0}: Error finding container de9de29daec547c5f6ec09ef4fae20ad7e73af38f932ab74a69d60a58846fc6c: Status 404 returned error can't find the container with id de9de29daec547c5f6ec09ef4fae20ad7e73af38f932ab74a69d60a58846fc6c Oct 02 12:26:26 crc kubenswrapper[4766]: I1002 12:26:26.678379 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:26:26 crc kubenswrapper[4766]: W1002 12:26:26.713550 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeffd14d9_eba5_420a_a5da_97fb53e5f78a.slice/crio-56136b9d0f147a43c5d062ca2f4f1a0f62f62a2f0ed98ed5b8defd9953fdb993 WatchSource:0}: Error finding container 56136b9d0f147a43c5d062ca2f4f1a0f62f62a2f0ed98ed5b8defd9953fdb993: Status 404 returned error can't find the container with id 56136b9d0f147a43c5d062ca2f4f1a0f62f62a2f0ed98ed5b8defd9953fdb993 Oct 02 12:26:27 crc kubenswrapper[4766]: I1002 12:26:27.123469 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32bc8e87-bb2f-4be7-8ec5-61117163e3f7","Type":"ContainerStarted","Data":"de9de29daec547c5f6ec09ef4fae20ad7e73af38f932ab74a69d60a58846fc6c"} Oct 02 12:26:27 crc kubenswrapper[4766]: I1002 12:26:27.129257 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"effd14d9-eba5-420a-a5da-97fb53e5f78a","Type":"ContainerStarted","Data":"56136b9d0f147a43c5d062ca2f4f1a0f62f62a2f0ed98ed5b8defd9953fdb993"} Oct 02 12:26:27 crc kubenswrapper[4766]: I1002 12:26:27.134067 4766 generic.go:334] "Generic (PLEG): container finished" podID="2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887" containerID="3153246a907df9e4703210bf1d959614ef808e6a56b50f77965fa2b24e1b4dc9" exitCode=0 Oct 02 12:26:27 crc kubenswrapper[4766]: I1002 12:26:27.134129 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf567857-rsl2t" event={"ID":"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887","Type":"ContainerDied","Data":"3153246a907df9e4703210bf1d959614ef808e6a56b50f77965fa2b24e1b4dc9"} Oct 02 12:26:27 crc kubenswrapper[4766]: I1002 12:26:27.134160 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf567857-rsl2t" event={"ID":"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887","Type":"ContainerStarted","Data":"6f2bbc2d5952f25637adc6d4426dc24e1ff83fbd6f8a9b22cfc675d9504db66b"} Oct 02 12:26:27 crc kubenswrapper[4766]: I1002 12:26:27.182063 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:26:28 crc kubenswrapper[4766]: I1002 12:26:28.148324 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"32bc8e87-bb2f-4be7-8ec5-61117163e3f7","Type":"ContainerStarted","Data":"322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0"} Oct 02 12:26:28 crc kubenswrapper[4766]: I1002 12:26:28.148989 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32bc8e87-bb2f-4be7-8ec5-61117163e3f7","Type":"ContainerStarted","Data":"7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1"} Oct 02 12:26:28 crc kubenswrapper[4766]: I1002 12:26:28.148490 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="32bc8e87-bb2f-4be7-8ec5-61117163e3f7" containerName="glance-httpd" containerID="cri-o://322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0" gracePeriod=30 Oct 02 12:26:28 crc kubenswrapper[4766]: I1002 12:26:28.148427 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="32bc8e87-bb2f-4be7-8ec5-61117163e3f7" containerName="glance-log" containerID="cri-o://7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1" gracePeriod=30 Oct 02 12:26:28 crc kubenswrapper[4766]: I1002 12:26:28.157928 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"effd14d9-eba5-420a-a5da-97fb53e5f78a","Type":"ContainerStarted","Data":"459a4ccb91c089449067908175650f9513330b3c6c26139d096e120e22de3a34"} Oct 02 12:26:28 crc kubenswrapper[4766]: I1002 12:26:28.158011 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"effd14d9-eba5-420a-a5da-97fb53e5f78a","Type":"ContainerStarted","Data":"5ebea97cc994cee1df6417d576a82a2eeaba50ae9b30c0b3f253285fdfbb66a4"} Oct 02 12:26:28 crc kubenswrapper[4766]: I1002 12:26:28.163787 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf567857-rsl2t" event={"ID":"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887","Type":"ContainerStarted","Data":"4d753e242a3dfa893faed26064f96fdeab9da52620b16251be5e229e1fafeef0"} Oct 02 12:26:28 crc kubenswrapper[4766]: I1002 12:26:28.164008 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:28 crc kubenswrapper[4766]: I1002 12:26:28.180601 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.180578252 podStartE2EDuration="3.180578252s" podCreationTimestamp="2025-10-02 12:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:26:28.175221991 +0000 UTC m=+5703.118092935" watchObservedRunningTime="2025-10-02 12:26:28.180578252 +0000 UTC m=+5703.123449196" Oct 02 12:26:28 crc kubenswrapper[4766]: I1002 12:26:28.199455 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cf567857-rsl2t" podStartSLOduration=3.1994337059999998 podStartE2EDuration="3.199433706s" podCreationTimestamp="2025-10-02 12:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:26:28.196899054 +0000 UTC m=+5703.139769998" watchObservedRunningTime="2025-10-02 12:26:28.199433706 +0000 UTC m=+5703.142304650" Oct 02 12:26:28 crc kubenswrapper[4766]: I1002 12:26:28.223041 4766 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.223009061 podStartE2EDuration="3.223009061s" podCreationTimestamp="2025-10-02 12:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:26:28.217774162 +0000 UTC m=+5703.160645106" watchObservedRunningTime="2025-10-02 12:26:28.223009061 +0000 UTC m=+5703.165880005" Oct 02 12:26:28 crc kubenswrapper[4766]: I1002 12:26:28.932755 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.073042 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-combined-ca-bundle\") pod \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.073193 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-httpd-run\") pod \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.073638 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "32bc8e87-bb2f-4be7-8ec5-61117163e3f7" (UID: "32bc8e87-bb2f-4be7-8ec5-61117163e3f7"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.073228 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wmzz\" (UniqueName: \"kubernetes.io/projected/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-kube-api-access-6wmzz\") pod \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.073714 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-ceph\") pod \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.073744 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-logs\") pod \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.074099 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-logs" (OuterVolumeSpecName: "logs") pod "32bc8e87-bb2f-4be7-8ec5-61117163e3f7" (UID: "32bc8e87-bb2f-4be7-8ec5-61117163e3f7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.074430 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-scripts\") pod \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.074755 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-config-data\") pod \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\" (UID: \"32bc8e87-bb2f-4be7-8ec5-61117163e3f7\") " Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.075176 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.075196 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.090441 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-scripts" (OuterVolumeSpecName: "scripts") pod "32bc8e87-bb2f-4be7-8ec5-61117163e3f7" (UID: "32bc8e87-bb2f-4be7-8ec5-61117163e3f7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.090610 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-ceph" (OuterVolumeSpecName: "ceph") pod "32bc8e87-bb2f-4be7-8ec5-61117163e3f7" (UID: "32bc8e87-bb2f-4be7-8ec5-61117163e3f7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.090606 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-kube-api-access-6wmzz" (OuterVolumeSpecName: "kube-api-access-6wmzz") pod "32bc8e87-bb2f-4be7-8ec5-61117163e3f7" (UID: "32bc8e87-bb2f-4be7-8ec5-61117163e3f7"). InnerVolumeSpecName "kube-api-access-6wmzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.120726 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32bc8e87-bb2f-4be7-8ec5-61117163e3f7" (UID: "32bc8e87-bb2f-4be7-8ec5-61117163e3f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.156111 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-config-data" (OuterVolumeSpecName: "config-data") pod "32bc8e87-bb2f-4be7-8ec5-61117163e3f7" (UID: "32bc8e87-bb2f-4be7-8ec5-61117163e3f7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.178154 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wmzz\" (UniqueName: \"kubernetes.io/projected/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-kube-api-access-6wmzz\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.178192 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.178209 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.178225 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.178239 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bc8e87-bb2f-4be7-8ec5-61117163e3f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.180263 4766 generic.go:334] "Generic (PLEG): container finished" podID="32bc8e87-bb2f-4be7-8ec5-61117163e3f7" containerID="322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0" exitCode=143 Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.180321 4766 generic.go:334] "Generic (PLEG): container finished" podID="32bc8e87-bb2f-4be7-8ec5-61117163e3f7" containerID="7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1" exitCode=143 Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.181309 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.188679 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32bc8e87-bb2f-4be7-8ec5-61117163e3f7","Type":"ContainerDied","Data":"322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0"} Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.188760 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32bc8e87-bb2f-4be7-8ec5-61117163e3f7","Type":"ContainerDied","Data":"7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1"} Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.188914 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"32bc8e87-bb2f-4be7-8ec5-61117163e3f7","Type":"ContainerDied","Data":"de9de29daec547c5f6ec09ef4fae20ad7e73af38f932ab74a69d60a58846fc6c"} Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.188937 4766 scope.go:117] "RemoveContainer" containerID="322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.228257 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.230164 4766 scope.go:117] "RemoveContainer" containerID="7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.239071 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.248774 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:26:29 crc kubenswrapper[4766]: E1002 12:26:29.249298 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32bc8e87-bb2f-4be7-8ec5-61117163e3f7" containerName="glance-httpd" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.249328 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="32bc8e87-bb2f-4be7-8ec5-61117163e3f7" containerName="glance-httpd" Oct 02 12:26:29 crc kubenswrapper[4766]: E1002 12:26:29.249358 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32bc8e87-bb2f-4be7-8ec5-61117163e3f7" containerName="glance-log" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.249369 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="32bc8e87-bb2f-4be7-8ec5-61117163e3f7" containerName="glance-log" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.249651 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="32bc8e87-bb2f-4be7-8ec5-61117163e3f7" containerName="glance-httpd" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.249698 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="32bc8e87-bb2f-4be7-8ec5-61117163e3f7" containerName="glance-log" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.269976 4766 scope.go:117] "RemoveContainer" containerID="322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.272062 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.273319 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:26:29 crc kubenswrapper[4766]: E1002 12:26:29.277404 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0\": container with ID starting with 322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0 not found: ID does not exist" containerID="322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.277488 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0"} err="failed to get container status \"322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0\": rpc error: code = NotFound desc = could not find container \"322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0\": container with ID starting with 322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0 not found: ID does not exist" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.277575 4766 scope.go:117] "RemoveContainer" containerID="7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.277549 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 12:26:29 crc kubenswrapper[4766]: E1002 12:26:29.278192 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1\": container with ID starting with 7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1 not found: ID does not exist" containerID="7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.278226 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1"} err="failed to get container status \"7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1\": rpc error: code = NotFound desc = could not find container \"7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1\": container with ID starting with 7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1 not found: ID does not exist" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.278258 4766 scope.go:117] "RemoveContainer" containerID="322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.278638 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0"} err="failed to get container status \"322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0\": rpc error: code = NotFound desc = could not find container \"322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0\": container with ID starting with 322178dcaa36a1663ade798c644df7155138c2c574be4a649739b1265648bce0 not found: ID does not exist" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.278658 4766 
scope.go:117] "RemoveContainer" containerID="7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.279985 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1"} err="failed to get container status \"7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1\": rpc error: code = NotFound desc = could not find container \"7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1\": container with ID starting with 7f861971064308deaba5b2889e5e12802d5a1957fad982b2451b10b2cda122f1 not found: ID does not exist" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.386559 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-ceph\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.386648 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfr4q\" (UniqueName: \"kubernetes.io/projected/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-kube-api-access-bfr4q\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.386684 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.386699 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.386766 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-scripts\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.386789 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-logs\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.386826 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-config-data\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc 
kubenswrapper[4766]: I1002 12:26:29.488839 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-scripts\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.489256 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-logs\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.489316 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-config-data\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.489375 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-ceph\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.489411 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfr4q\" (UniqueName: \"kubernetes.io/projected/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-kube-api-access-bfr4q\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.489451 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.489474 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.489837 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-logs\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.490544 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.493913 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.494032 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-config-data\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.494079 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-scripts\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.495018 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-ceph\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.509843 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfr4q\" (UniqueName: \"kubernetes.io/projected/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-kube-api-access-bfr4q\") pod \"glance-default-external-api-0\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") " pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.605194 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:26:29 crc kubenswrapper[4766]: I1002 12:26:29.899135 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32bc8e87-bb2f-4be7-8ec5-61117163e3f7" path="/var/lib/kubelet/pods/32bc8e87-bb2f-4be7-8ec5-61117163e3f7/volumes" Oct 02 12:26:30 crc kubenswrapper[4766]: I1002 12:26:30.215398 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:26:30 crc kubenswrapper[4766]: W1002 12:26:30.216247 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9daa02cf_4179_422d_bbf1_eb56fecdaa2e.slice/crio-5f598fac833fc516efacda0e6d0d850f9c25ca970b154c245df39196893e2a15 WatchSource:0}: Error finding container 5f598fac833fc516efacda0e6d0d850f9c25ca970b154c245df39196893e2a15: Status 404 returned error can't find the container with id 5f598fac833fc516efacda0e6d0d850f9c25ca970b154c245df39196893e2a15 Oct 02 12:26:30 crc kubenswrapper[4766]: I1002 12:26:30.458177 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:26:30 crc kubenswrapper[4766]: I1002 12:26:30.458472 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="effd14d9-eba5-420a-a5da-97fb53e5f78a" containerName="glance-log" containerID="cri-o://5ebea97cc994cee1df6417d576a82a2eeaba50ae9b30c0b3f253285fdfbb66a4" gracePeriod=30 Oct 02 12:26:30 crc kubenswrapper[4766]: I1002 12:26:30.458677 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="effd14d9-eba5-420a-a5da-97fb53e5f78a" containerName="glance-httpd" containerID="cri-o://459a4ccb91c089449067908175650f9513330b3c6c26139d096e120e22de3a34" gracePeriod=30 Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.217882 4766 generic.go:334] "Generic (PLEG): container finished" podID="effd14d9-eba5-420a-a5da-97fb53e5f78a" containerID="459a4ccb91c089449067908175650f9513330b3c6c26139d096e120e22de3a34" exitCode=0 Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.218225 4766 generic.go:334] "Generic (PLEG): container finished" podID="effd14d9-eba5-420a-a5da-97fb53e5f78a" containerID="5ebea97cc994cee1df6417d576a82a2eeaba50ae9b30c0b3f253285fdfbb66a4" exitCode=143 Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.217946 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"effd14d9-eba5-420a-a5da-97fb53e5f78a","Type":"ContainerDied","Data":"459a4ccb91c089449067908175650f9513330b3c6c26139d096e120e22de3a34"} Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.218297 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"effd14d9-eba5-420a-a5da-97fb53e5f78a","Type":"ContainerDied","Data":"5ebea97cc994cee1df6417d576a82a2eeaba50ae9b30c0b3f253285fdfbb66a4"} Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.218310 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"effd14d9-eba5-420a-a5da-97fb53e5f78a","Type":"ContainerDied","Data":"56136b9d0f147a43c5d062ca2f4f1a0f62f62a2f0ed98ed5b8defd9953fdb993"} Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.218324 4766 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="56136b9d0f147a43c5d062ca2f4f1a0f62f62a2f0ed98ed5b8defd9953fdb993" Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.221021 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9daa02cf-4179-422d-bbf1-eb56fecdaa2e","Type":"ContainerStarted","Data":"1929510534d2b26bef2c41e543ec440a13c4ba67b16dcaf7168c5dc77e27373f"} Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.221082 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9daa02cf-4179-422d-bbf1-eb56fecdaa2e","Type":"ContainerStarted","Data":"5f598fac833fc516efacda0e6d0d850f9c25ca970b154c245df39196893e2a15"} Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.230279 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.332191 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsknw\" (UniqueName: \"kubernetes.io/projected/effd14d9-eba5-420a-a5da-97fb53e5f78a-kube-api-access-rsknw\") pod \"effd14d9-eba5-420a-a5da-97fb53e5f78a\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.332681 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/effd14d9-eba5-420a-a5da-97fb53e5f78a-ceph\") pod \"effd14d9-eba5-420a-a5da-97fb53e5f78a\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.332807 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-config-data\") pod \"effd14d9-eba5-420a-a5da-97fb53e5f78a\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.332986 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/effd14d9-eba5-420a-a5da-97fb53e5f78a-httpd-run\") pod \"effd14d9-eba5-420a-a5da-97fb53e5f78a\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.333013 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-scripts\") pod \"effd14d9-eba5-420a-a5da-97fb53e5f78a\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.333047 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-combined-ca-bundle\") pod \"effd14d9-eba5-420a-a5da-97fb53e5f78a\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.333097 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/effd14d9-eba5-420a-a5da-97fb53e5f78a-logs\") pod \"effd14d9-eba5-420a-a5da-97fb53e5f78a\" (UID: \"effd14d9-eba5-420a-a5da-97fb53e5f78a\") " Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.333264 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/effd14d9-eba5-420a-a5da-97fb53e5f78a-httpd-run" 
(OuterVolumeSpecName: "httpd-run") pod "effd14d9-eba5-420a-a5da-97fb53e5f78a" (UID: "effd14d9-eba5-420a-a5da-97fb53e5f78a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.333538 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/effd14d9-eba5-420a-a5da-97fb53e5f78a-logs" (OuterVolumeSpecName: "logs") pod "effd14d9-eba5-420a-a5da-97fb53e5f78a" (UID: "effd14d9-eba5-420a-a5da-97fb53e5f78a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.333804 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/effd14d9-eba5-420a-a5da-97fb53e5f78a-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.333832 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/effd14d9-eba5-420a-a5da-97fb53e5f78a-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.338693 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/effd14d9-eba5-420a-a5da-97fb53e5f78a-kube-api-access-rsknw" (OuterVolumeSpecName: "kube-api-access-rsknw") pod "effd14d9-eba5-420a-a5da-97fb53e5f78a" (UID: "effd14d9-eba5-420a-a5da-97fb53e5f78a"). InnerVolumeSpecName "kube-api-access-rsknw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.338753 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-scripts" (OuterVolumeSpecName: "scripts") pod "effd14d9-eba5-420a-a5da-97fb53e5f78a" (UID: "effd14d9-eba5-420a-a5da-97fb53e5f78a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.339260 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/effd14d9-eba5-420a-a5da-97fb53e5f78a-ceph" (OuterVolumeSpecName: "ceph") pod "effd14d9-eba5-420a-a5da-97fb53e5f78a" (UID: "effd14d9-eba5-420a-a5da-97fb53e5f78a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.360947 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "effd14d9-eba5-420a-a5da-97fb53e5f78a" (UID: "effd14d9-eba5-420a-a5da-97fb53e5f78a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.380012 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-config-data" (OuterVolumeSpecName: "config-data") pod "effd14d9-eba5-420a-a5da-97fb53e5f78a" (UID: "effd14d9-eba5-420a-a5da-97fb53e5f78a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.435895 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.435951 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.435974 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsknw\" (UniqueName: \"kubernetes.io/projected/effd14d9-eba5-420a-a5da-97fb53e5f78a-kube-api-access-rsknw\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.435990 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/effd14d9-eba5-420a-a5da-97fb53e5f78a-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:31 crc kubenswrapper[4766]: I1002 12:26:31.436001 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/effd14d9-eba5-420a-a5da-97fb53e5f78a-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.237791 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9daa02cf-4179-422d-bbf1-eb56fecdaa2e","Type":"ContainerStarted","Data":"5510a5ae64fb788bc5c47e75ccb77eb054581bf32a617929d26866146d7f5cec"} Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.237907 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.278157 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.2781267720000002 podStartE2EDuration="3.278126772s" podCreationTimestamp="2025-10-02 12:26:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:26:32.262061978 +0000 UTC m=+5707.204932962" watchObservedRunningTime="2025-10-02 12:26:32.278126772 +0000 UTC m=+5707.220997726" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.291925 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.305490 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.314881 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:26:32 crc kubenswrapper[4766]: E1002 12:26:32.315359 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="effd14d9-eba5-420a-a5da-97fb53e5f78a" containerName="glance-log" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.315385 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="effd14d9-eba5-420a-a5da-97fb53e5f78a" containerName="glance-log" Oct 02 12:26:32 crc kubenswrapper[4766]: E1002 12:26:32.315421 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="effd14d9-eba5-420a-a5da-97fb53e5f78a" containerName="glance-httpd" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.315429 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="effd14d9-eba5-420a-a5da-97fb53e5f78a" containerName="glance-httpd" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.315601 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="effd14d9-eba5-420a-a5da-97fb53e5f78a" containerName="glance-log" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.315645 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="effd14d9-eba5-420a-a5da-97fb53e5f78a" containerName="glance-httpd" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.316803 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.320534 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.323350 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.454859 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.454953 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.455083 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-logs\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.455135 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.455318 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.455661 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk9bp\" (UniqueName: \"kubernetes.io/projected/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-kube-api-access-kk9bp\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.455716 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.557843 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk9bp\" (UniqueName: \"kubernetes.io/projected/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-kube-api-access-kk9bp\") pod \"glance-default-internal-api-0\" (UID: 
\"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.557927 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.557959 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.558000 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.558023 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-logs\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.558037 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.558084 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.558645 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.558673 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-logs\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.568046 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.568938 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.568943 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.569452 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.577330 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk9bp\" (UniqueName: \"kubernetes.io/projected/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-kube-api-access-kk9bp\") pod \"glance-default-internal-api-0\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:26:32 crc kubenswrapper[4766]: I1002 12:26:32.646042 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:26:33 crc kubenswrapper[4766]: I1002 12:26:33.193979 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:26:33 crc kubenswrapper[4766]: W1002 12:26:33.193993 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf138eee_b0bd_40cf_880a_d8a82ac1cd2f.slice/crio-2165bf4633bb084c6fb827aa9d02a8904be487566c0d436b738ac61a919b8caf WatchSource:0}: Error finding container 2165bf4633bb084c6fb827aa9d02a8904be487566c0d436b738ac61a919b8caf: Status 404 returned error can't find the container with id 2165bf4633bb084c6fb827aa9d02a8904be487566c0d436b738ac61a919b8caf Oct 02 12:26:33 crc kubenswrapper[4766]: I1002 12:26:33.249642 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f","Type":"ContainerStarted","Data":"2165bf4633bb084c6fb827aa9d02a8904be487566c0d436b738ac61a919b8caf"} Oct 02 12:26:33 crc kubenswrapper[4766]: I1002 12:26:33.895083 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="effd14d9-eba5-420a-a5da-97fb53e5f78a" path="/var/lib/kubelet/pods/effd14d9-eba5-420a-a5da-97fb53e5f78a/volumes" Oct 02 12:26:34 crc kubenswrapper[4766]: I1002 12:26:34.264573 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f","Type":"ContainerStarted","Data":"6625b9274663fec0be311dbcf60adc35c30262d032c6554770c42479e46b47ed"} Oct 02 12:26:34 crc kubenswrapper[4766]: I1002 12:26:34.265708 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f","Type":"ContainerStarted","Data":"fbc8d81cae7d71fa8a46628e8d150a14dfb67b400bf9f6724d0672397f9f3127"} Oct 02 
12:26:34 crc kubenswrapper[4766]: I1002 12:26:34.285137 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.285102849 podStartE2EDuration="2.285102849s" podCreationTimestamp="2025-10-02 12:26:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:26:34.284441777 +0000 UTC m=+5709.227312721" watchObservedRunningTime="2025-10-02 12:26:34.285102849 +0000 UTC m=+5709.227973793" Oct 02 12:26:35 crc kubenswrapper[4766]: I1002 12:26:35.634477 4766 scope.go:117] "RemoveContainer" containerID="da2ff533e579ba8db4cac2b657386a4458b63aef13724aa11b3b45b1e8648c96" Oct 02 12:26:35 crc kubenswrapper[4766]: I1002 12:26:35.848779 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:26:35 crc kubenswrapper[4766]: I1002 12:26:35.936421 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8488b8bd7c-vxgf6"] Oct 02 12:26:35 crc kubenswrapper[4766]: I1002 12:26:35.936772 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" podUID="88226e6f-8efa-4505-a5e5-d4cf947b2d86" containerName="dnsmasq-dns" containerID="cri-o://1f0799f089758641ac1fac053e212d9f4c577b98ec8729f678d606a22cc63912" gracePeriod=10 Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.291475 4766 generic.go:334] "Generic (PLEG): container finished" podID="88226e6f-8efa-4505-a5e5-d4cf947b2d86" containerID="1f0799f089758641ac1fac053e212d9f4c577b98ec8729f678d606a22cc63912" exitCode=0 Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.292042 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" event={"ID":"88226e6f-8efa-4505-a5e5-d4cf947b2d86","Type":"ContainerDied","Data":"1f0799f089758641ac1fac053e212d9f4c577b98ec8729f678d606a22cc63912"} Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.428632 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.540255 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-ovsdbserver-nb\") pod \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.540365 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-ovsdbserver-sb\") pod \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.540445 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-config\") pod \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.540493 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzrts\" (UniqueName: \"kubernetes.io/projected/88226e6f-8efa-4505-a5e5-d4cf947b2d86-kube-api-access-hzrts\") pod \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.540571 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-dns-svc\") pod \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\" (UID: \"88226e6f-8efa-4505-a5e5-d4cf947b2d86\") " Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.548826 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88226e6f-8efa-4505-a5e5-d4cf947b2d86-kube-api-access-hzrts" (OuterVolumeSpecName: "kube-api-access-hzrts") pod "88226e6f-8efa-4505-a5e5-d4cf947b2d86" (UID: "88226e6f-8efa-4505-a5e5-d4cf947b2d86"). InnerVolumeSpecName "kube-api-access-hzrts". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.589264 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-config" (OuterVolumeSpecName: "config") pod "88226e6f-8efa-4505-a5e5-d4cf947b2d86" (UID: "88226e6f-8efa-4505-a5e5-d4cf947b2d86"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.589570 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "88226e6f-8efa-4505-a5e5-d4cf947b2d86" (UID: "88226e6f-8efa-4505-a5e5-d4cf947b2d86"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.595479 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "88226e6f-8efa-4505-a5e5-d4cf947b2d86" (UID: "88226e6f-8efa-4505-a5e5-d4cf947b2d86"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.597043 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "88226e6f-8efa-4505-a5e5-d4cf947b2d86" (UID: "88226e6f-8efa-4505-a5e5-d4cf947b2d86"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.643708 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzrts\" (UniqueName: \"kubernetes.io/projected/88226e6f-8efa-4505-a5e5-d4cf947b2d86-kube-api-access-hzrts\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.644082 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.644098 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.644108 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:36 crc kubenswrapper[4766]: I1002 12:26:36.644118 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88226e6f-8efa-4505-a5e5-d4cf947b2d86-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:37 crc kubenswrapper[4766]: I1002 12:26:37.301973 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" event={"ID":"88226e6f-8efa-4505-a5e5-d4cf947b2d86","Type":"ContainerDied","Data":"c46f81997ed5ee514a2056bd9176f0aaeda47bae2d9e2a869f72ef4279ceb71b"} Oct 02 12:26:37 crc kubenswrapper[4766]: I1002 12:26:37.302028 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8488b8bd7c-vxgf6" Oct 02 12:26:37 crc kubenswrapper[4766]: I1002 12:26:37.302048 4766 scope.go:117] "RemoveContainer" containerID="1f0799f089758641ac1fac053e212d9f4c577b98ec8729f678d606a22cc63912" Oct 02 12:26:37 crc kubenswrapper[4766]: I1002 12:26:37.325583 4766 scope.go:117] "RemoveContainer" containerID="0626ce2f9b774c40c74588a2fb53d5cabeed95b64e8f63d6933910a1c809b6d4" Oct 02 12:26:37 crc kubenswrapper[4766]: I1002 12:26:37.330955 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8488b8bd7c-vxgf6"] Oct 02 12:26:37 crc kubenswrapper[4766]: I1002 12:26:37.338274 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8488b8bd7c-vxgf6"] Oct 02 12:26:37 crc kubenswrapper[4766]: I1002 12:26:37.900998 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88226e6f-8efa-4505-a5e5-d4cf947b2d86" path="/var/lib/kubelet/pods/88226e6f-8efa-4505-a5e5-d4cf947b2d86/volumes" Oct 02 12:26:39 crc kubenswrapper[4766]: I1002 12:26:39.606686 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 12:26:39 crc kubenswrapper[4766]: I1002 12:26:39.608429 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 12:26:39 crc kubenswrapper[4766]: I1002 12:26:39.636749 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 12:26:39 crc kubenswrapper[4766]: I1002 12:26:39.652174 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 12:26:39 crc kubenswrapper[4766]: I1002 12:26:39.881879 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:26:39 crc kubenswrapper[4766]: E1002 12:26:39.882106 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:26:40 crc kubenswrapper[4766]: I1002 12:26:40.331059 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 12:26:40 crc kubenswrapper[4766]: I1002 12:26:40.331746 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 12:26:42 crc kubenswrapper[4766]: I1002 12:26:42.362434 4766 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 12:26:42 crc kubenswrapper[4766]: I1002 12:26:42.362820 4766 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 12:26:42 crc kubenswrapper[4766]: I1002 12:26:42.526142 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 12:26:42 crc kubenswrapper[4766]: I1002 12:26:42.606414 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 12:26:42 crc kubenswrapper[4766]: I1002 12:26:42.646521 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 12:26:42 crc kubenswrapper[4766]: I1002 12:26:42.646577 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 12:26:42 crc kubenswrapper[4766]: I1002 12:26:42.685335 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 12:26:42 crc kubenswrapper[4766]: I1002 12:26:42.700006 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 12:26:43 crc kubenswrapper[4766]: I1002 12:26:43.370838 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 12:26:43 crc kubenswrapper[4766]: I1002 12:26:43.370902 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 12:26:45 crc kubenswrapper[4766]: I1002 12:26:45.419749 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 12:26:45 crc kubenswrapper[4766]: I1002 12:26:45.420164 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 12:26:52 crc kubenswrapper[4766]: I1002 12:26:52.882338 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:26:52 crc kubenswrapper[4766]: E1002 12:26:52.883209 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:26:53 crc kubenswrapper[4766]: I1002 12:26:53.709452 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-7tgcb"] Oct 02 12:26:53 crc kubenswrapper[4766]: E1002 12:26:53.709993 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88226e6f-8efa-4505-a5e5-d4cf947b2d86" containerName="dnsmasq-dns" Oct 02 12:26:53 crc kubenswrapper[4766]: I1002 12:26:53.710010 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="88226e6f-8efa-4505-a5e5-d4cf947b2d86" containerName="dnsmasq-dns" Oct 02 12:26:53 crc kubenswrapper[4766]: E1002 12:26:53.710061 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88226e6f-8efa-4505-a5e5-d4cf947b2d86" containerName="init" Oct 02 12:26:53 crc kubenswrapper[4766]: I1002 12:26:53.710067 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="88226e6f-8efa-4505-a5e5-d4cf947b2d86" containerName="init" Oct 02 12:26:53 crc kubenswrapper[4766]: I1002 12:26:53.710228 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="88226e6f-8efa-4505-a5e5-d4cf947b2d86" containerName="dnsmasq-dns" Oct 02 12:26:53 crc kubenswrapper[4766]: I1002 12:26:53.711061 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7tgcb" Oct 02 12:26:53 crc kubenswrapper[4766]: I1002 12:26:53.718950 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7tgcb"] Oct 02 12:26:53 crc kubenswrapper[4766]: I1002 12:26:53.765377 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4tbc\" (UniqueName: \"kubernetes.io/projected/067d7606-7cf3-433d-8908-8d4e5fcae88c-kube-api-access-g4tbc\") pod \"placement-db-create-7tgcb\" (UID: \"067d7606-7cf3-433d-8908-8d4e5fcae88c\") " pod="openstack/placement-db-create-7tgcb" Oct 02 12:26:53 crc kubenswrapper[4766]: I1002 12:26:53.866978 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4tbc\" (UniqueName: \"kubernetes.io/projected/067d7606-7cf3-433d-8908-8d4e5fcae88c-kube-api-access-g4tbc\") pod \"placement-db-create-7tgcb\" (UID: \"067d7606-7cf3-433d-8908-8d4e5fcae88c\") " pod="openstack/placement-db-create-7tgcb" Oct 02 12:26:53 crc kubenswrapper[4766]: I1002 12:26:53.889921 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4tbc\" (UniqueName: \"kubernetes.io/projected/067d7606-7cf3-433d-8908-8d4e5fcae88c-kube-api-access-g4tbc\") pod \"placement-db-create-7tgcb\" (UID: \"067d7606-7cf3-433d-8908-8d4e5fcae88c\") " pod="openstack/placement-db-create-7tgcb" Oct 02 12:26:54 crc kubenswrapper[4766]: I1002 12:26:54.028429 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7tgcb" Oct 02 12:26:54 crc kubenswrapper[4766]: I1002 12:26:54.469374 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7tgcb"] Oct 02 12:26:55 crc kubenswrapper[4766]: I1002 12:26:55.461357 4766 generic.go:334] "Generic (PLEG): container finished" podID="067d7606-7cf3-433d-8908-8d4e5fcae88c" containerID="7c17d62cda7bf4b1b7ddfd1834208a92c7c0261c32935b07bb68922cbbebe55d" exitCode=0 Oct 02 12:26:55 crc kubenswrapper[4766]: I1002 12:26:55.461627 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7tgcb" event={"ID":"067d7606-7cf3-433d-8908-8d4e5fcae88c","Type":"ContainerDied","Data":"7c17d62cda7bf4b1b7ddfd1834208a92c7c0261c32935b07bb68922cbbebe55d"} Oct 02 12:26:55 crc kubenswrapper[4766]: I1002 12:26:55.461659 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7tgcb" event={"ID":"067d7606-7cf3-433d-8908-8d4e5fcae88c","Type":"ContainerStarted","Data":"42908c3310e96f86d1155420c095dad7b43aaed2ab05b400a7086270b99386ba"} Oct 02 12:26:56 crc kubenswrapper[4766]: I1002 12:26:56.808453 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7tgcb" Oct 02 12:26:56 crc kubenswrapper[4766]: I1002 12:26:56.920995 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4tbc\" (UniqueName: \"kubernetes.io/projected/067d7606-7cf3-433d-8908-8d4e5fcae88c-kube-api-access-g4tbc\") pod \"067d7606-7cf3-433d-8908-8d4e5fcae88c\" (UID: \"067d7606-7cf3-433d-8908-8d4e5fcae88c\") " Oct 02 12:26:56 crc kubenswrapper[4766]: I1002 12:26:56.927203 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/067d7606-7cf3-433d-8908-8d4e5fcae88c-kube-api-access-g4tbc" (OuterVolumeSpecName: "kube-api-access-g4tbc") pod "067d7606-7cf3-433d-8908-8d4e5fcae88c" (UID: "067d7606-7cf3-433d-8908-8d4e5fcae88c"). InnerVolumeSpecName "kube-api-access-g4tbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:26:57 crc kubenswrapper[4766]: I1002 12:26:57.023473 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4tbc\" (UniqueName: \"kubernetes.io/projected/067d7606-7cf3-433d-8908-8d4e5fcae88c-kube-api-access-g4tbc\") on node \"crc\" DevicePath \"\"" Oct 02 12:26:57 crc kubenswrapper[4766]: I1002 12:26:57.479394 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7tgcb" event={"ID":"067d7606-7cf3-433d-8908-8d4e5fcae88c","Type":"ContainerDied","Data":"42908c3310e96f86d1155420c095dad7b43aaed2ab05b400a7086270b99386ba"} Oct 02 12:26:57 crc kubenswrapper[4766]: I1002 12:26:57.479458 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42908c3310e96f86d1155420c095dad7b43aaed2ab05b400a7086270b99386ba" Oct 02 12:26:57 crc kubenswrapper[4766]: I1002 12:26:57.479685 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7tgcb" Oct 02 12:27:00 crc kubenswrapper[4766]: I1002 12:27:00.104065 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-q6b7f"] Oct 02 12:27:00 crc kubenswrapper[4766]: E1002 12:27:00.106771 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="067d7606-7cf3-433d-8908-8d4e5fcae88c" containerName="mariadb-database-create" Oct 02 12:27:00 crc kubenswrapper[4766]: I1002 12:27:00.106799 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="067d7606-7cf3-433d-8908-8d4e5fcae88c" containerName="mariadb-database-create" Oct 02 12:27:00 crc kubenswrapper[4766]: I1002 12:27:00.106995 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="067d7606-7cf3-433d-8908-8d4e5fcae88c" containerName="mariadb-database-create" Oct 02 12:27:00 crc kubenswrapper[4766]: I1002 12:27:00.116657 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:00 crc kubenswrapper[4766]: I1002 12:27:00.129453 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q6b7f"] Oct 02 12:27:00 crc kubenswrapper[4766]: I1002 12:27:00.194958 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8e3f0c2-3bda-46d9-9560-443036524d12-catalog-content\") pod \"redhat-operators-q6b7f\" (UID: \"f8e3f0c2-3bda-46d9-9560-443036524d12\") " pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:00 crc kubenswrapper[4766]: I1002 12:27:00.195184 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbmkr\" (UniqueName: \"kubernetes.io/projected/f8e3f0c2-3bda-46d9-9560-443036524d12-kube-api-access-mbmkr\") pod \"redhat-operators-q6b7f\" (UID: \"f8e3f0c2-3bda-46d9-9560-443036524d12\") " pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:00 crc kubenswrapper[4766]: I1002 12:27:00.195280 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8e3f0c2-3bda-46d9-9560-443036524d12-utilities\") pod \"redhat-operators-q6b7f\" (UID: \"f8e3f0c2-3bda-46d9-9560-443036524d12\") " pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:00 crc kubenswrapper[4766]: I1002 12:27:00.297096 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8e3f0c2-3bda-46d9-9560-443036524d12-utilities\") pod \"redhat-operators-q6b7f\" (UID: \"f8e3f0c2-3bda-46d9-9560-443036524d12\") " pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:00 crc kubenswrapper[4766]: I1002 12:27:00.297159 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8e3f0c2-3bda-46d9-9560-443036524d12-catalog-content\") pod \"redhat-operators-q6b7f\" (UID: \"f8e3f0c2-3bda-46d9-9560-443036524d12\") " pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:00 crc kubenswrapper[4766]: I1002 12:27:00.297247 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbmkr\" (UniqueName: \"kubernetes.io/projected/f8e3f0c2-3bda-46d9-9560-443036524d12-kube-api-access-mbmkr\") pod \"redhat-operators-q6b7f\" (UID: \"f8e3f0c2-3bda-46d9-9560-443036524d12\") " pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:00 crc kubenswrapper[4766]: I1002 12:27:00.297864 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8e3f0c2-3bda-46d9-9560-443036524d12-catalog-content\") pod \"redhat-operators-q6b7f\" (UID: \"f8e3f0c2-3bda-46d9-9560-443036524d12\") " pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:00 crc kubenswrapper[4766]: I1002 12:27:00.297878 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8e3f0c2-3bda-46d9-9560-443036524d12-utilities\") pod \"redhat-operators-q6b7f\" (UID: \"f8e3f0c2-3bda-46d9-9560-443036524d12\") " pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:00 crc kubenswrapper[4766]: I1002 12:27:00.319202 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mbmkr\" (UniqueName: \"kubernetes.io/projected/f8e3f0c2-3bda-46d9-9560-443036524d12-kube-api-access-mbmkr\") pod \"redhat-operators-q6b7f\" (UID: \"f8e3f0c2-3bda-46d9-9560-443036524d12\") " pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:00 crc kubenswrapper[4766]: I1002 12:27:00.447869 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:00 crc kubenswrapper[4766]: I1002 12:27:00.910668 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-q6b7f"] Oct 02 12:27:01 crc kubenswrapper[4766]: I1002 12:27:01.519835 4766 generic.go:334] "Generic (PLEG): container finished" podID="f8e3f0c2-3bda-46d9-9560-443036524d12" containerID="c7192fcc291ed976968f4da53cefcf9a8b2fda7d113e8bd1d8560da378926522" exitCode=0 Oct 02 12:27:01 crc kubenswrapper[4766]: I1002 12:27:01.519933 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6b7f" event={"ID":"f8e3f0c2-3bda-46d9-9560-443036524d12","Type":"ContainerDied","Data":"c7192fcc291ed976968f4da53cefcf9a8b2fda7d113e8bd1d8560da378926522"} Oct 02 12:27:01 crc kubenswrapper[4766]: I1002 12:27:01.520208 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6b7f" event={"ID":"f8e3f0c2-3bda-46d9-9560-443036524d12","Type":"ContainerStarted","Data":"148b499f8ab38e2c4ef8c55ede47ee7e8b0bc40ba28b3e55854894d6bb4b0d88"} Oct 02 12:27:01 crc kubenswrapper[4766]: I1002 12:27:01.521648 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:27:03 crc kubenswrapper[4766]: I1002 12:27:03.537398 4766 generic.go:334] "Generic (PLEG): container finished" podID="f8e3f0c2-3bda-46d9-9560-443036524d12" containerID="3693b12b89ffc220e682520f3c0f946f7d316b0ea61ae33720faa50dd096c503" exitCode=0 Oct 02 12:27:03 crc kubenswrapper[4766]: I1002 12:27:03.537484 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6b7f" event={"ID":"f8e3f0c2-3bda-46d9-9560-443036524d12","Type":"ContainerDied","Data":"3693b12b89ffc220e682520f3c0f946f7d316b0ea61ae33720faa50dd096c503"} Oct 02 12:27:03 crc kubenswrapper[4766]: I1002 12:27:03.817306 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-1150-account-create-cct6z"] Oct 02 12:27:03 crc kubenswrapper[4766]: I1002 12:27:03.818958 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1150-account-create-cct6z" Oct 02 12:27:03 crc kubenswrapper[4766]: I1002 12:27:03.821826 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 02 12:27:03 crc kubenswrapper[4766]: I1002 12:27:03.826728 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1150-account-create-cct6z"] Oct 02 12:27:03 crc kubenswrapper[4766]: I1002 12:27:03.960719 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqm4b\" (UniqueName: \"kubernetes.io/projected/13e3fa1c-1096-4c60-9191-b11de2440178-kube-api-access-sqm4b\") pod \"placement-1150-account-create-cct6z\" (UID: \"13e3fa1c-1096-4c60-9191-b11de2440178\") " pod="openstack/placement-1150-account-create-cct6z" Oct 02 12:27:04 crc kubenswrapper[4766]: I1002 12:27:04.062826 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqm4b\" (UniqueName: \"kubernetes.io/projected/13e3fa1c-1096-4c60-9191-b11de2440178-kube-api-access-sqm4b\") pod \"placement-1150-account-create-cct6z\" (UID: \"13e3fa1c-1096-4c60-9191-b11de2440178\") " pod="openstack/placement-1150-account-create-cct6z" Oct 02 12:27:04 crc kubenswrapper[4766]: I1002 12:27:04.082409 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqm4b\" (UniqueName: \"kubernetes.io/projected/13e3fa1c-1096-4c60-9191-b11de2440178-kube-api-access-sqm4b\") pod \"placement-1150-account-create-cct6z\" (UID: \"13e3fa1c-1096-4c60-9191-b11de2440178\") " pod="openstack/placement-1150-account-create-cct6z" Oct 02 12:27:04 crc kubenswrapper[4766]: I1002 12:27:04.137417 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1150-account-create-cct6z" Oct 02 12:27:04 crc kubenswrapper[4766]: I1002 12:27:04.551489 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6b7f" event={"ID":"f8e3f0c2-3bda-46d9-9560-443036524d12","Type":"ContainerStarted","Data":"0ebdfedef77b7acf79fec6000d44ed5b69029a779c8955983aa6856b48bb9089"} Oct 02 12:27:04 crc kubenswrapper[4766]: I1002 12:27:04.585875 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-q6b7f" podStartSLOduration=1.9099754180000001 podStartE2EDuration="4.585849512s" podCreationTimestamp="2025-10-02 12:27:00 +0000 UTC" firstStartedPulling="2025-10-02 12:27:01.521329135 +0000 UTC m=+5736.464200079" lastFinishedPulling="2025-10-02 12:27:04.197203229 +0000 UTC m=+5739.140074173" observedRunningTime="2025-10-02 12:27:04.577598378 +0000 UTC m=+5739.520469322" watchObservedRunningTime="2025-10-02 12:27:04.585849512 +0000 UTC m=+5739.528720456" Oct 02 12:27:04 crc kubenswrapper[4766]: W1002 12:27:04.614763 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13e3fa1c_1096_4c60_9191_b11de2440178.slice/crio-520d9133b1f0e8e22d2979f2a7932d0d4acda973b6d55da9b06b5b23a13b724f WatchSource:0}: Error finding container 520d9133b1f0e8e22d2979f2a7932d0d4acda973b6d55da9b06b5b23a13b724f: Status 404 returned error can't find the container with id 520d9133b1f0e8e22d2979f2a7932d0d4acda973b6d55da9b06b5b23a13b724f Oct 02 12:27:04 crc kubenswrapper[4766]: I1002 12:27:04.617899 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1150-account-create-cct6z"] Oct 02 12:27:05 crc kubenswrapper[4766]: I1002 12:27:05.565387 4766 generic.go:334] "Generic (PLEG): container finished" podID="13e3fa1c-1096-4c60-9191-b11de2440178" containerID="6231307ebda306c68c9cb6a164ed7bd47bc48a6c406d1a0669db5a8e63bc22b4" exitCode=0 Oct 02 12:27:05 crc kubenswrapper[4766]: I1002 12:27:05.565571 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1150-account-create-cct6z" event={"ID":"13e3fa1c-1096-4c60-9191-b11de2440178","Type":"ContainerDied","Data":"6231307ebda306c68c9cb6a164ed7bd47bc48a6c406d1a0669db5a8e63bc22b4"} Oct 02 12:27:05 crc kubenswrapper[4766]: I1002 12:27:05.566045 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1150-account-create-cct6z" event={"ID":"13e3fa1c-1096-4c60-9191-b11de2440178","Type":"ContainerStarted","Data":"520d9133b1f0e8e22d2979f2a7932d0d4acda973b6d55da9b06b5b23a13b724f"} Oct 02 12:27:05 crc kubenswrapper[4766]: I1002 12:27:05.887799 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:27:05 crc kubenswrapper[4766]: E1002 12:27:05.888193 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:27:06 crc kubenswrapper[4766]: I1002 12:27:06.895989 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1150-account-create-cct6z" Oct 02 12:27:07 crc kubenswrapper[4766]: I1002 12:27:07.047361 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqm4b\" (UniqueName: \"kubernetes.io/projected/13e3fa1c-1096-4c60-9191-b11de2440178-kube-api-access-sqm4b\") pod \"13e3fa1c-1096-4c60-9191-b11de2440178\" (UID: \"13e3fa1c-1096-4c60-9191-b11de2440178\") " Oct 02 12:27:07 crc kubenswrapper[4766]: I1002 12:27:07.054291 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13e3fa1c-1096-4c60-9191-b11de2440178-kube-api-access-sqm4b" (OuterVolumeSpecName: "kube-api-access-sqm4b") pod "13e3fa1c-1096-4c60-9191-b11de2440178" (UID: "13e3fa1c-1096-4c60-9191-b11de2440178"). InnerVolumeSpecName "kube-api-access-sqm4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:27:07 crc kubenswrapper[4766]: I1002 12:27:07.150018 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqm4b\" (UniqueName: \"kubernetes.io/projected/13e3fa1c-1096-4c60-9191-b11de2440178-kube-api-access-sqm4b\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:07 crc kubenswrapper[4766]: I1002 12:27:07.585042 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1150-account-create-cct6z" event={"ID":"13e3fa1c-1096-4c60-9191-b11de2440178","Type":"ContainerDied","Data":"520d9133b1f0e8e22d2979f2a7932d0d4acda973b6d55da9b06b5b23a13b724f"} Oct 02 12:27:07 crc kubenswrapper[4766]: I1002 12:27:07.585388 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="520d9133b1f0e8e22d2979f2a7932d0d4acda973b6d55da9b06b5b23a13b724f" Oct 02 12:27:07 crc kubenswrapper[4766]: I1002 12:27:07.585131 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1150-account-create-cct6z" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.061426 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74ccf9c867-gkpf8"] Oct 02 12:27:09 crc kubenswrapper[4766]: E1002 12:27:09.061999 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13e3fa1c-1096-4c60-9191-b11de2440178" containerName="mariadb-account-create" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.062020 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="13e3fa1c-1096-4c60-9191-b11de2440178" containerName="mariadb-account-create" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.062356 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="13e3fa1c-1096-4c60-9191-b11de2440178" containerName="mariadb-account-create" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.063487 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.075195 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-fgnx6"] Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.076918 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.079408 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.079634 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.093243 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-sp4q5" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.101286 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74ccf9c867-gkpf8"] Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.133376 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fgnx6"] Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.202592 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-dns-svc\") pod \"dnsmasq-dns-74ccf9c867-gkpf8\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") " pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.202946 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgzhn\" (UniqueName: \"kubernetes.io/projected/4d650726-0f54-4320-9764-c984cefe3c0b-kube-api-access-wgzhn\") pod \"dnsmasq-dns-74ccf9c867-gkpf8\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") " pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.202982 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-config\") pod \"dnsmasq-dns-74ccf9c867-gkpf8\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") " pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.203015 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-ovsdbserver-nb\") pod \"dnsmasq-dns-74ccf9c867-gkpf8\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") " pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.203090 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-config-data\") pod \"placement-db-sync-fgnx6\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.203174 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-ovsdbserver-sb\") pod \"dnsmasq-dns-74ccf9c867-gkpf8\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") " pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.203240 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-combined-ca-bundle\") pod \"placement-db-sync-fgnx6\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.203365 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bkx5\" (UniqueName: \"kubernetes.io/projected/15fe436b-0401-43b0-910f-529ae2ed73d1-kube-api-access-8bkx5\") pod \"placement-db-sync-fgnx6\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.203394 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fe436b-0401-43b0-910f-529ae2ed73d1-logs\") pod \"placement-db-sync-fgnx6\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.203443 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-scripts\") pod \"placement-db-sync-fgnx6\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.305078 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-dns-svc\") pod \"dnsmasq-dns-74ccf9c867-gkpf8\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") " pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.305137 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgzhn\" (UniqueName: \"kubernetes.io/projected/4d650726-0f54-4320-9764-c984cefe3c0b-kube-api-access-wgzhn\") pod \"dnsmasq-dns-74ccf9c867-gkpf8\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") " pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.305157 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-config\") pod \"dnsmasq-dns-74ccf9c867-gkpf8\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") " pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.305181 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-ovsdbserver-nb\") pod \"dnsmasq-dns-74ccf9c867-gkpf8\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") " pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.305209 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-config-data\") pod \"placement-db-sync-fgnx6\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.305252 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-ovsdbserver-sb\") pod 
\"dnsmasq-dns-74ccf9c867-gkpf8\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") " pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.305277 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-combined-ca-bundle\") pod \"placement-db-sync-fgnx6\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.305336 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bkx5\" (UniqueName: \"kubernetes.io/projected/15fe436b-0401-43b0-910f-529ae2ed73d1-kube-api-access-8bkx5\") pod \"placement-db-sync-fgnx6\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.305356 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fe436b-0401-43b0-910f-529ae2ed73d1-logs\") pod \"placement-db-sync-fgnx6\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.305380 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-scripts\") pod \"placement-db-sync-fgnx6\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.306397 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fe436b-0401-43b0-910f-529ae2ed73d1-logs\") pod \"placement-db-sync-fgnx6\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.307014 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-dns-svc\") pod \"dnsmasq-dns-74ccf9c867-gkpf8\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") " pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.307017 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-ovsdbserver-sb\") pod \"dnsmasq-dns-74ccf9c867-gkpf8\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") " pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.307374 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-ovsdbserver-nb\") pod \"dnsmasq-dns-74ccf9c867-gkpf8\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") " pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.308304 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-config\") pod \"dnsmasq-dns-74ccf9c867-gkpf8\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") " pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.312701 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-config-data\") pod \"placement-db-sync-fgnx6\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.322599 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-scripts\") pod \"placement-db-sync-fgnx6\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.329154 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgzhn\" (UniqueName: \"kubernetes.io/projected/4d650726-0f54-4320-9764-c984cefe3c0b-kube-api-access-wgzhn\") pod \"dnsmasq-dns-74ccf9c867-gkpf8\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") " pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.329823 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-combined-ca-bundle\") pod \"placement-db-sync-fgnx6\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.331089 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bkx5\" (UniqueName: \"kubernetes.io/projected/15fe436b-0401-43b0-910f-529ae2ed73d1-kube-api-access-8bkx5\") pod \"placement-db-sync-fgnx6\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.402283 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:09 crc kubenswrapper[4766]: I1002 12:27:09.408814 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:10 crc kubenswrapper[4766]: I1002 12:27:10.033353 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fgnx6"] Oct 02 12:27:10 crc kubenswrapper[4766]: I1002 12:27:10.172418 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74ccf9c867-gkpf8"] Oct 02 12:27:10 crc kubenswrapper[4766]: W1002 12:27:10.175156 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d650726_0f54_4320_9764_c984cefe3c0b.slice/crio-b62bea09b5d559540a03aa24d7a2762acf71e53491e28e5b06b9f3adb130d39b WatchSource:0}: Error finding container b62bea09b5d559540a03aa24d7a2762acf71e53491e28e5b06b9f3adb130d39b: Status 404 returned error can't find the container with id b62bea09b5d559540a03aa24d7a2762acf71e53491e28e5b06b9f3adb130d39b Oct 02 12:27:10 crc kubenswrapper[4766]: I1002 12:27:10.449049 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:10 crc kubenswrapper[4766]: I1002 12:27:10.449121 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:10 crc kubenswrapper[4766]: I1002 12:27:10.492679 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:10 crc kubenswrapper[4766]: I1002 12:27:10.620111 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fgnx6" event={"ID":"15fe436b-0401-43b0-910f-529ae2ed73d1","Type":"ContainerStarted","Data":"e2dfcc63279915f3be5966486516deb094039a020bcfab8070301cfb54de06fa"} Oct 02 12:27:10 crc kubenswrapper[4766]: I1002 12:27:10.620187 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fgnx6" event={"ID":"15fe436b-0401-43b0-910f-529ae2ed73d1","Type":"ContainerStarted","Data":"f9a8196287a5feb925d5c68b234b460f5c1f385f912979411425e1f12422e28a"} Oct 02 12:27:10 crc kubenswrapper[4766]: I1002 12:27:10.623268 4766 generic.go:334] "Generic (PLEG): container finished" podID="4d650726-0f54-4320-9764-c984cefe3c0b" containerID="6d4e51408c276cffcfaf3adbac84c562aa256c47f785133dc063318fa50dff73" exitCode=0 Oct 02 12:27:10 crc kubenswrapper[4766]: I1002 12:27:10.624559 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" event={"ID":"4d650726-0f54-4320-9764-c984cefe3c0b","Type":"ContainerDied","Data":"6d4e51408c276cffcfaf3adbac84c562aa256c47f785133dc063318fa50dff73"} Oct 02 12:27:10 crc kubenswrapper[4766]: I1002 12:27:10.624587 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" event={"ID":"4d650726-0f54-4320-9764-c984cefe3c0b","Type":"ContainerStarted","Data":"b62bea09b5d559540a03aa24d7a2762acf71e53491e28e5b06b9f3adb130d39b"} Oct 02 12:27:10 crc kubenswrapper[4766]: I1002 12:27:10.637679 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-fgnx6" podStartSLOduration=1.637651521 podStartE2EDuration="1.637651521s" podCreationTimestamp="2025-10-02 12:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:27:10.635284645 +0000 UTC m=+5745.578155599" watchObservedRunningTime="2025-10-02 12:27:10.637651521 +0000 UTC 
m=+5745.580522475" Oct 02 12:27:10 crc kubenswrapper[4766]: I1002 12:27:10.680971 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:10 crc kubenswrapper[4766]: I1002 12:27:10.739373 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q6b7f"] Oct 02 12:27:11 crc kubenswrapper[4766]: I1002 12:27:11.634682 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" event={"ID":"4d650726-0f54-4320-9764-c984cefe3c0b","Type":"ContainerStarted","Data":"955d07bccfacf32cfba7467d0352069df2017de5a075aea66bb626c66b7e69d4"} Oct 02 12:27:11 crc kubenswrapper[4766]: I1002 12:27:11.635214 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:11 crc kubenswrapper[4766]: I1002 12:27:11.639142 4766 generic.go:334] "Generic (PLEG): container finished" podID="15fe436b-0401-43b0-910f-529ae2ed73d1" containerID="e2dfcc63279915f3be5966486516deb094039a020bcfab8070301cfb54de06fa" exitCode=0 Oct 02 12:27:11 crc kubenswrapper[4766]: I1002 12:27:11.639285 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fgnx6" event={"ID":"15fe436b-0401-43b0-910f-529ae2ed73d1","Type":"ContainerDied","Data":"e2dfcc63279915f3be5966486516deb094039a020bcfab8070301cfb54de06fa"} Oct 02 12:27:11 crc kubenswrapper[4766]: I1002 12:27:11.659504 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" podStartSLOduration=2.659477927 podStartE2EDuration="2.659477927s" podCreationTimestamp="2025-10-02 12:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:27:11.652042119 +0000 UTC m=+5746.594913073" watchObservedRunningTime="2025-10-02 12:27:11.659477927 +0000 UTC m=+5746.602348871" Oct 02 12:27:12 crc kubenswrapper[4766]: I1002 12:27:12.667144 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-q6b7f" podUID="f8e3f0c2-3bda-46d9-9560-443036524d12" containerName="registry-server" containerID="cri-o://0ebdfedef77b7acf79fec6000d44ed5b69029a779c8955983aa6856b48bb9089" gracePeriod=2 Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.033842 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.169125 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.198186 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fe436b-0401-43b0-910f-529ae2ed73d1-logs\") pod \"15fe436b-0401-43b0-910f-529ae2ed73d1\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.198308 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-combined-ca-bundle\") pod \"15fe436b-0401-43b0-910f-529ae2ed73d1\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.198346 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-config-data\") pod \"15fe436b-0401-43b0-910f-529ae2ed73d1\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.198661 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bkx5\" (UniqueName: \"kubernetes.io/projected/15fe436b-0401-43b0-910f-529ae2ed73d1-kube-api-access-8bkx5\") pod \"15fe436b-0401-43b0-910f-529ae2ed73d1\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.198723 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-scripts\") pod \"15fe436b-0401-43b0-910f-529ae2ed73d1\" (UID: \"15fe436b-0401-43b0-910f-529ae2ed73d1\") " Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.240281 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-scripts" (OuterVolumeSpecName: "scripts") pod "15fe436b-0401-43b0-910f-529ae2ed73d1" (UID: "15fe436b-0401-43b0-910f-529ae2ed73d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.240098 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15fe436b-0401-43b0-910f-529ae2ed73d1-logs" (OuterVolumeSpecName: "logs") pod "15fe436b-0401-43b0-910f-529ae2ed73d1" (UID: "15fe436b-0401-43b0-910f-529ae2ed73d1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.245150 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15fe436b-0401-43b0-910f-529ae2ed73d1" (UID: "15fe436b-0401-43b0-910f-529ae2ed73d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.246244 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15fe436b-0401-43b0-910f-529ae2ed73d1-kube-api-access-8bkx5" (OuterVolumeSpecName: "kube-api-access-8bkx5") pod "15fe436b-0401-43b0-910f-529ae2ed73d1" (UID: "15fe436b-0401-43b0-910f-529ae2ed73d1"). InnerVolumeSpecName "kube-api-access-8bkx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.268013 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-config-data" (OuterVolumeSpecName: "config-data") pod "15fe436b-0401-43b0-910f-529ae2ed73d1" (UID: "15fe436b-0401-43b0-910f-529ae2ed73d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.301431 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbmkr\" (UniqueName: \"kubernetes.io/projected/f8e3f0c2-3bda-46d9-9560-443036524d12-kube-api-access-mbmkr\") pod \"f8e3f0c2-3bda-46d9-9560-443036524d12\" (UID: \"f8e3f0c2-3bda-46d9-9560-443036524d12\") " Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.301627 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8e3f0c2-3bda-46d9-9560-443036524d12-utilities\") pod \"f8e3f0c2-3bda-46d9-9560-443036524d12\" (UID: \"f8e3f0c2-3bda-46d9-9560-443036524d12\") " Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.301690 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8e3f0c2-3bda-46d9-9560-443036524d12-catalog-content\") pod \"f8e3f0c2-3bda-46d9-9560-443036524d12\" (UID: \"f8e3f0c2-3bda-46d9-9560-443036524d12\") " Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.302597 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bkx5\" (UniqueName: \"kubernetes.io/projected/15fe436b-0401-43b0-910f-529ae2ed73d1-kube-api-access-8bkx5\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.302619 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.302629 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15fe436b-0401-43b0-910f-529ae2ed73d1-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.302638 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.302647 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15fe436b-0401-43b0-910f-529ae2ed73d1-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.316668 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e3f0c2-3bda-46d9-9560-443036524d12-utilities" (OuterVolumeSpecName: "utilities") pod "f8e3f0c2-3bda-46d9-9560-443036524d12" (UID: "f8e3f0c2-3bda-46d9-9560-443036524d12"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.319678 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e3f0c2-3bda-46d9-9560-443036524d12-kube-api-access-mbmkr" (OuterVolumeSpecName: "kube-api-access-mbmkr") pod "f8e3f0c2-3bda-46d9-9560-443036524d12" (UID: "f8e3f0c2-3bda-46d9-9560-443036524d12"). InnerVolumeSpecName "kube-api-access-mbmkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.405170 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8e3f0c2-3bda-46d9-9560-443036524d12-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.405211 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbmkr\" (UniqueName: \"kubernetes.io/projected/f8e3f0c2-3bda-46d9-9560-443036524d12-kube-api-access-mbmkr\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.695977 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fgnx6" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.695987 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fgnx6" event={"ID":"15fe436b-0401-43b0-910f-529ae2ed73d1","Type":"ContainerDied","Data":"f9a8196287a5feb925d5c68b234b460f5c1f385f912979411425e1f12422e28a"} Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.696119 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9a8196287a5feb925d5c68b234b460f5c1f385f912979411425e1f12422e28a" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.700424 4766 generic.go:334] "Generic (PLEG): container finished" podID="f8e3f0c2-3bda-46d9-9560-443036524d12" containerID="0ebdfedef77b7acf79fec6000d44ed5b69029a779c8955983aa6856b48bb9089" exitCode=0 Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.700477 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6b7f" event={"ID":"f8e3f0c2-3bda-46d9-9560-443036524d12","Type":"ContainerDied","Data":"0ebdfedef77b7acf79fec6000d44ed5b69029a779c8955983aa6856b48bb9089"} Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.700550 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-q6b7f" event={"ID":"f8e3f0c2-3bda-46d9-9560-443036524d12","Type":"ContainerDied","Data":"148b499f8ab38e2c4ef8c55ede47ee7e8b0bc40ba28b3e55854894d6bb4b0d88"} Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.700578 4766 scope.go:117] "RemoveContainer" containerID="0ebdfedef77b7acf79fec6000d44ed5b69029a779c8955983aa6856b48bb9089" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.701009 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-q6b7f" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.759821 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f4c5978dd-trc79"] Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.759909 4766 scope.go:117] "RemoveContainer" containerID="3693b12b89ffc220e682520f3c0f946f7d316b0ea61ae33720faa50dd096c503" Oct 02 12:27:13 crc kubenswrapper[4766]: E1002 12:27:13.760288 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15fe436b-0401-43b0-910f-529ae2ed73d1" containerName="placement-db-sync" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.760312 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="15fe436b-0401-43b0-910f-529ae2ed73d1" containerName="placement-db-sync" Oct 02 12:27:13 crc kubenswrapper[4766]: E1002 12:27:13.760346 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e3f0c2-3bda-46d9-9560-443036524d12" containerName="registry-server" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.760353 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e3f0c2-3bda-46d9-9560-443036524d12" containerName="registry-server" Oct 02 12:27:13 crc kubenswrapper[4766]: E1002 12:27:13.760363 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e3f0c2-3bda-46d9-9560-443036524d12" containerName="extract-content" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.760371 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e3f0c2-3bda-46d9-9560-443036524d12" containerName="extract-content" Oct 02 12:27:13 crc kubenswrapper[4766]: E1002 12:27:13.760387 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8e3f0c2-3bda-46d9-9560-443036524d12" containerName="extract-utilities" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.760393 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e3f0c2-3bda-46d9-9560-443036524d12" containerName="extract-utilities" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.760583 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8e3f0c2-3bda-46d9-9560-443036524d12" containerName="registry-server" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.760596 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="15fe436b-0401-43b0-910f-529ae2ed73d1" containerName="placement-db-sync" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.761668 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f4c5978dd-trc79" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.764114 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.764806 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-sp4q5" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.768215 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.768342 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f4c5978dd-trc79"] Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.799130 4766 scope.go:117] "RemoveContainer" containerID="c7192fcc291ed976968f4da53cefcf9a8b2fda7d113e8bd1d8560da378926522" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.857775 4766 scope.go:117] "RemoveContainer" containerID="0ebdfedef77b7acf79fec6000d44ed5b69029a779c8955983aa6856b48bb9089" Oct 02 12:27:13 crc kubenswrapper[4766]: E1002 12:27:13.858681 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ebdfedef77b7acf79fec6000d44ed5b69029a779c8955983aa6856b48bb9089\": container with ID starting with 0ebdfedef77b7acf79fec6000d44ed5b69029a779c8955983aa6856b48bb9089 not found: ID does not exist" containerID="0ebdfedef77b7acf79fec6000d44ed5b69029a779c8955983aa6856b48bb9089" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.858748 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ebdfedef77b7acf79fec6000d44ed5b69029a779c8955983aa6856b48bb9089"} err="failed to get container status \"0ebdfedef77b7acf79fec6000d44ed5b69029a779c8955983aa6856b48bb9089\": rpc error: code = NotFound desc = could not find container \"0ebdfedef77b7acf79fec6000d44ed5b69029a779c8955983aa6856b48bb9089\": container with ID starting with 0ebdfedef77b7acf79fec6000d44ed5b69029a779c8955983aa6856b48bb9089 not found: ID does not exist" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.858796 4766 scope.go:117] "RemoveContainer" containerID="3693b12b89ffc220e682520f3c0f946f7d316b0ea61ae33720faa50dd096c503" Oct 02 12:27:13 crc kubenswrapper[4766]: E1002 12:27:13.859393 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3693b12b89ffc220e682520f3c0f946f7d316b0ea61ae33720faa50dd096c503\": container with ID starting with 3693b12b89ffc220e682520f3c0f946f7d316b0ea61ae33720faa50dd096c503 not found: ID does not exist" containerID="3693b12b89ffc220e682520f3c0f946f7d316b0ea61ae33720faa50dd096c503" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.859435 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3693b12b89ffc220e682520f3c0f946f7d316b0ea61ae33720faa50dd096c503"} err="failed to get container status \"3693b12b89ffc220e682520f3c0f946f7d316b0ea61ae33720faa50dd096c503\": rpc error: code = NotFound desc = could not find container \"3693b12b89ffc220e682520f3c0f946f7d316b0ea61ae33720faa50dd096c503\": container with ID starting with 3693b12b89ffc220e682520f3c0f946f7d316b0ea61ae33720faa50dd096c503 not found: ID does not exist" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.859469 4766 scope.go:117] "RemoveContainer" 
containerID="c7192fcc291ed976968f4da53cefcf9a8b2fda7d113e8bd1d8560da378926522" Oct 02 12:27:13 crc kubenswrapper[4766]: E1002 12:27:13.860042 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7192fcc291ed976968f4da53cefcf9a8b2fda7d113e8bd1d8560da378926522\": container with ID starting with c7192fcc291ed976968f4da53cefcf9a8b2fda7d113e8bd1d8560da378926522 not found: ID does not exist" containerID="c7192fcc291ed976968f4da53cefcf9a8b2fda7d113e8bd1d8560da378926522" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.860315 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7192fcc291ed976968f4da53cefcf9a8b2fda7d113e8bd1d8560da378926522"} err="failed to get container status \"c7192fcc291ed976968f4da53cefcf9a8b2fda7d113e8bd1d8560da378926522\": rpc error: code = NotFound desc = could not find container \"c7192fcc291ed976968f4da53cefcf9a8b2fda7d113e8bd1d8560da378926522\": container with ID starting with c7192fcc291ed976968f4da53cefcf9a8b2fda7d113e8bd1d8560da378926522 not found: ID does not exist" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.921119 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2ad094-910d-40a1-b172-b1ad77166e18-config-data\") pod \"placement-f4c5978dd-trc79\" (UID: \"7d2ad094-910d-40a1-b172-b1ad77166e18\") " pod="openstack/placement-f4c5978dd-trc79" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.921249 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzhct\" (UniqueName: \"kubernetes.io/projected/7d2ad094-910d-40a1-b172-b1ad77166e18-kube-api-access-nzhct\") pod \"placement-f4c5978dd-trc79\" (UID: \"7d2ad094-910d-40a1-b172-b1ad77166e18\") " pod="openstack/placement-f4c5978dd-trc79" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.921296 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2ad094-910d-40a1-b172-b1ad77166e18-scripts\") pod \"placement-f4c5978dd-trc79\" (UID: \"7d2ad094-910d-40a1-b172-b1ad77166e18\") " pod="openstack/placement-f4c5978dd-trc79" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.921315 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d2ad094-910d-40a1-b172-b1ad77166e18-logs\") pod \"placement-f4c5978dd-trc79\" (UID: \"7d2ad094-910d-40a1-b172-b1ad77166e18\") " pod="openstack/placement-f4c5978dd-trc79" Oct 02 12:27:13 crc kubenswrapper[4766]: I1002 12:27:13.921351 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2ad094-910d-40a1-b172-b1ad77166e18-combined-ca-bundle\") pod \"placement-f4c5978dd-trc79\" (UID: \"7d2ad094-910d-40a1-b172-b1ad77166e18\") " pod="openstack/placement-f4c5978dd-trc79" Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 12:27:14.024034 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2ad094-910d-40a1-b172-b1ad77166e18-config-data\") pod \"placement-f4c5978dd-trc79\" (UID: \"7d2ad094-910d-40a1-b172-b1ad77166e18\") " pod="openstack/placement-f4c5978dd-trc79" Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 
Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 12:27:14.024315 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2ad094-910d-40a1-b172-b1ad77166e18-scripts\") pod \"placement-f4c5978dd-trc79\" (UID: \"7d2ad094-910d-40a1-b172-b1ad77166e18\") " pod="openstack/placement-f4c5978dd-trc79"
Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 12:27:14.024337 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d2ad094-910d-40a1-b172-b1ad77166e18-logs\") pod \"placement-f4c5978dd-trc79\" (UID: \"7d2ad094-910d-40a1-b172-b1ad77166e18\") " pod="openstack/placement-f4c5978dd-trc79"
Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 12:27:14.024392 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2ad094-910d-40a1-b172-b1ad77166e18-combined-ca-bundle\") pod \"placement-f4c5978dd-trc79\" (UID: \"7d2ad094-910d-40a1-b172-b1ad77166e18\") " pod="openstack/placement-f4c5978dd-trc79"
Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 12:27:14.025453 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7d2ad094-910d-40a1-b172-b1ad77166e18-logs\") pod \"placement-f4c5978dd-trc79\" (UID: \"7d2ad094-910d-40a1-b172-b1ad77166e18\") " pod="openstack/placement-f4c5978dd-trc79"
Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 12:27:14.030444 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d2ad094-910d-40a1-b172-b1ad77166e18-scripts\") pod \"placement-f4c5978dd-trc79\" (UID: \"7d2ad094-910d-40a1-b172-b1ad77166e18\") " pod="openstack/placement-f4c5978dd-trc79"
Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 12:27:14.030530 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d2ad094-910d-40a1-b172-b1ad77166e18-combined-ca-bundle\") pod \"placement-f4c5978dd-trc79\" (UID: \"7d2ad094-910d-40a1-b172-b1ad77166e18\") " pod="openstack/placement-f4c5978dd-trc79"
Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 12:27:14.031373 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d2ad094-910d-40a1-b172-b1ad77166e18-config-data\") pod \"placement-f4c5978dd-trc79\" (UID: \"7d2ad094-910d-40a1-b172-b1ad77166e18\") " pod="openstack/placement-f4c5978dd-trc79"
Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 12:27:14.045814 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzhct\" (UniqueName: \"kubernetes.io/projected/7d2ad094-910d-40a1-b172-b1ad77166e18-kube-api-access-nzhct\") pod \"placement-f4c5978dd-trc79\" (UID: \"7d2ad094-910d-40a1-b172-b1ad77166e18\") " pod="openstack/placement-f4c5978dd-trc79"
Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 12:27:14.196980 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f4c5978dd-trc79"
Need to start a new one" pod="openstack/placement-f4c5978dd-trc79" Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 12:27:14.432752 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e3f0c2-3bda-46d9-9560-443036524d12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8e3f0c2-3bda-46d9-9560-443036524d12" (UID: "f8e3f0c2-3bda-46d9-9560-443036524d12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 12:27:14.435148 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8e3f0c2-3bda-46d9-9560-443036524d12-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 12:27:14.646619 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-q6b7f"] Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 12:27:14.658117 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-q6b7f"] Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 12:27:14.678356 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f4c5978dd-trc79"] Oct 02 12:27:14 crc kubenswrapper[4766]: I1002 12:27:14.714335 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f4c5978dd-trc79" event={"ID":"7d2ad094-910d-40a1-b172-b1ad77166e18","Type":"ContainerStarted","Data":"c77543cf6050cfcfb5deaf219b227925c333684be337d11eb0ed80c6d6cc418d"} Oct 02 12:27:15 crc kubenswrapper[4766]: I1002 12:27:15.729934 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f4c5978dd-trc79" event={"ID":"7d2ad094-910d-40a1-b172-b1ad77166e18","Type":"ContainerStarted","Data":"c741b8bccf6f3b50ae686d76d4befbc27f779c1ff83e5c3a8bfd8a960b20cd15"} Oct 02 12:27:15 crc kubenswrapper[4766]: I1002 12:27:15.730008 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f4c5978dd-trc79" event={"ID":"7d2ad094-910d-40a1-b172-b1ad77166e18","Type":"ContainerStarted","Data":"b156c75ff605430d0eda0402f45d0f2830362ddf9b8cc4dd25b614e59184e9cb"} Oct 02 12:27:15 crc kubenswrapper[4766]: I1002 12:27:15.730104 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-f4c5978dd-trc79" Oct 02 12:27:15 crc kubenswrapper[4766]: I1002 12:27:15.754475 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-f4c5978dd-trc79" podStartSLOduration=2.754452204 podStartE2EDuration="2.754452204s" podCreationTimestamp="2025-10-02 12:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:27:15.751229182 +0000 UTC m=+5750.694100126" watchObservedRunningTime="2025-10-02 12:27:15.754452204 +0000 UTC m=+5750.697323148" Oct 02 12:27:15 crc kubenswrapper[4766]: I1002 12:27:15.898106 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e3f0c2-3bda-46d9-9560-443036524d12" path="/var/lib/kubelet/pods/f8e3f0c2-3bda-46d9-9560-443036524d12/volumes" Oct 02 12:27:16 crc kubenswrapper[4766]: I1002 12:27:16.738570 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-f4c5978dd-trc79" Oct 02 12:27:19 crc kubenswrapper[4766]: I1002 12:27:19.404761 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:27:19 crc kubenswrapper[4766]: I1002 12:27:19.474943 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cf567857-rsl2t"] Oct 02 12:27:19 crc kubenswrapper[4766]: I1002 12:27:19.475638 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cf567857-rsl2t" podUID="2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887" containerName="dnsmasq-dns" containerID="cri-o://4d753e242a3dfa893faed26064f96fdeab9da52620b16251be5e229e1fafeef0" gracePeriod=10 Oct 02 12:27:19 crc kubenswrapper[4766]: I1002 12:27:19.786607 4766 generic.go:334] "Generic (PLEG): container finished" podID="2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887" containerID="4d753e242a3dfa893faed26064f96fdeab9da52620b16251be5e229e1fafeef0" exitCode=0 Oct 02 12:27:19 crc kubenswrapper[4766]: I1002 12:27:19.786685 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf567857-rsl2t" event={"ID":"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887","Type":"ContainerDied","Data":"4d753e242a3dfa893faed26064f96fdeab9da52620b16251be5e229e1fafeef0"} Oct 02 12:27:19 crc kubenswrapper[4766]: I1002 12:27:19.966387 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.058353 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc5kb\" (UniqueName: \"kubernetes.io/projected/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-kube-api-access-sc5kb\") pod \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.058449 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-ovsdbserver-sb\") pod \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.058478 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-dns-svc\") pod \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.058593 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-config\") pod \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.058674 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-ovsdbserver-nb\") pod \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\" (UID: \"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887\") " Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.064886 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-kube-api-access-sc5kb" (OuterVolumeSpecName: "kube-api-access-sc5kb") pod "2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887" (UID: "2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887"). InnerVolumeSpecName "kube-api-access-sc5kb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.106322 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887" (UID: "2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.106907 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887" (UID: "2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.112715 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-config" (OuterVolumeSpecName: "config") pod "2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887" (UID: "2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.124976 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887" (UID: "2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.162042 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.162261 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc5kb\" (UniqueName: \"kubernetes.io/projected/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-kube-api-access-sc5kb\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.162336 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.162438 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.162531 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.813617 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cf567857-rsl2t" event={"ID":"2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887","Type":"ContainerDied","Data":"6f2bbc2d5952f25637adc6d4426dc24e1ff83fbd6f8a9b22cfc675d9504db66b"} Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.813743 4766 scope.go:117] "RemoveContainer" 
containerID="4d753e242a3dfa893faed26064f96fdeab9da52620b16251be5e229e1fafeef0" Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.814086 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cf567857-rsl2t" Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.848881 4766 scope.go:117] "RemoveContainer" containerID="3153246a907df9e4703210bf1d959614ef808e6a56b50f77965fa2b24e1b4dc9" Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.865579 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cf567857-rsl2t"] Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.878797 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cf567857-rsl2t"] Oct 02 12:27:20 crc kubenswrapper[4766]: I1002 12:27:20.881586 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:27:20 crc kubenswrapper[4766]: E1002 12:27:20.881886 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:27:21 crc kubenswrapper[4766]: I1002 12:27:21.891706 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887" path="/var/lib/kubelet/pods/2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887/volumes" Oct 02 12:27:33 crc kubenswrapper[4766]: I1002 12:27:33.881437 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:27:33 crc kubenswrapper[4766]: E1002 12:27:33.882482 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:27:35 crc kubenswrapper[4766]: I1002 12:27:35.712362 4766 scope.go:117] "RemoveContainer" containerID="7f9d6aa64912a1f0490b908af1cb430114db9094fd6c749b47906b82ea636561" Oct 02 12:27:35 crc kubenswrapper[4766]: I1002 12:27:35.739592 4766 scope.go:117] "RemoveContainer" containerID="99fa510c7eea63d8fb340958cb18f6ca7fcbb65cf925fc8b16c3f5214a18c806" Oct 02 12:27:35 crc kubenswrapper[4766]: I1002 12:27:35.806164 4766 scope.go:117] "RemoveContainer" containerID="963829ec9ae72105303eb74de2dea6160242d4f5c6d2babc8c50942fa2834978" Oct 02 12:27:44 crc kubenswrapper[4766]: I1002 12:27:44.882368 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:27:44 crc kubenswrapper[4766]: E1002 12:27:44.883455 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" 
podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:27:45 crc kubenswrapper[4766]: I1002 12:27:45.279114 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-f4c5978dd-trc79" Oct 02 12:27:45 crc kubenswrapper[4766]: I1002 12:27:45.279958 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-f4c5978dd-trc79" Oct 02 12:27:58 crc kubenswrapper[4766]: I1002 12:27:58.881956 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:27:58 crc kubenswrapper[4766]: E1002 12:27:58.883476 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:28:07 crc kubenswrapper[4766]: I1002 12:28:07.895859 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-2mjk4"] Oct 02 12:28:07 crc kubenswrapper[4766]: E1002 12:28:07.896941 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887" containerName="init" Oct 02 12:28:07 crc kubenswrapper[4766]: I1002 12:28:07.896957 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887" containerName="init" Oct 02 12:28:07 crc kubenswrapper[4766]: E1002 12:28:07.896975 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887" containerName="dnsmasq-dns" Oct 02 12:28:07 crc kubenswrapper[4766]: I1002 12:28:07.896981 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887" containerName="dnsmasq-dns" Oct 02 12:28:07 crc kubenswrapper[4766]: I1002 12:28:07.897156 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2387fe2c-1066-4ec0-b3fb-8e0b8d1c6887" containerName="dnsmasq-dns" Oct 02 12:28:07 crc kubenswrapper[4766]: I1002 12:28:07.897824 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2mjk4" Oct 02 12:28:07 crc kubenswrapper[4766]: I1002 12:28:07.912519 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2mjk4"] Oct 02 12:28:07 crc kubenswrapper[4766]: I1002 12:28:07.990080 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-v5nvm"] Oct 02 12:28:07 crc kubenswrapper[4766]: I1002 12:28:07.991592 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-v5nvm" Oct 02 12:28:07 crc kubenswrapper[4766]: I1002 12:28:07.994766 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lbws\" (UniqueName: \"kubernetes.io/projected/c7b8b672-6952-4faf-9a27-61e46932d297-kube-api-access-6lbws\") pod \"nova-api-db-create-2mjk4\" (UID: \"c7b8b672-6952-4faf-9a27-61e46932d297\") " pod="openstack/nova-api-db-create-2mjk4" Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.002224 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-v5nvm"] Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.091822 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-6fx7r"] Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.097682 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6fx7r" Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.110238 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lbws\" (UniqueName: \"kubernetes.io/projected/c7b8b672-6952-4faf-9a27-61e46932d297-kube-api-access-6lbws\") pod \"nova-api-db-create-2mjk4\" (UID: \"c7b8b672-6952-4faf-9a27-61e46932d297\") " pod="openstack/nova-api-db-create-2mjk4" Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.110357 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss6t6\" (UniqueName: \"kubernetes.io/projected/49a751b6-3e73-4b99-8407-386e287fedf3-kube-api-access-ss6t6\") pod \"nova-cell0-db-create-v5nvm\" (UID: \"49a751b6-3e73-4b99-8407-386e287fedf3\") " pod="openstack/nova-cell0-db-create-v5nvm" Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.126218 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-6fx7r"] Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.141233 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lbws\" (UniqueName: \"kubernetes.io/projected/c7b8b672-6952-4faf-9a27-61e46932d297-kube-api-access-6lbws\") pod \"nova-api-db-create-2mjk4\" (UID: \"c7b8b672-6952-4faf-9a27-61e46932d297\") " pod="openstack/nova-api-db-create-2mjk4" Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.213141 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss6t6\" (UniqueName: \"kubernetes.io/projected/49a751b6-3e73-4b99-8407-386e287fedf3-kube-api-access-ss6t6\") pod \"nova-cell0-db-create-v5nvm\" (UID: \"49a751b6-3e73-4b99-8407-386e287fedf3\") " pod="openstack/nova-cell0-db-create-v5nvm" Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.213662 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54rbf\" (UniqueName: \"kubernetes.io/projected/9af48511-50fd-409c-a582-1473fc1776cf-kube-api-access-54rbf\") pod \"nova-cell1-db-create-6fx7r\" (UID: \"9af48511-50fd-409c-a582-1473fc1776cf\") " pod="openstack/nova-cell1-db-create-6fx7r" Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.228316 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2mjk4" Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.235834 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss6t6\" (UniqueName: \"kubernetes.io/projected/49a751b6-3e73-4b99-8407-386e287fedf3-kube-api-access-ss6t6\") pod \"nova-cell0-db-create-v5nvm\" (UID: \"49a751b6-3e73-4b99-8407-386e287fedf3\") " pod="openstack/nova-cell0-db-create-v5nvm" Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.309374 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-v5nvm" Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.315718 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54rbf\" (UniqueName: \"kubernetes.io/projected/9af48511-50fd-409c-a582-1473fc1776cf-kube-api-access-54rbf\") pod \"nova-cell1-db-create-6fx7r\" (UID: \"9af48511-50fd-409c-a582-1473fc1776cf\") " pod="openstack/nova-cell1-db-create-6fx7r" Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.339691 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54rbf\" (UniqueName: \"kubernetes.io/projected/9af48511-50fd-409c-a582-1473fc1776cf-kube-api-access-54rbf\") pod \"nova-cell1-db-create-6fx7r\" (UID: \"9af48511-50fd-409c-a582-1473fc1776cf\") " pod="openstack/nova-cell1-db-create-6fx7r" Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.427082 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6fx7r" Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.759775 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-2mjk4"] Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.849962 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-v5nvm"] Oct 02 12:28:08 crc kubenswrapper[4766]: W1002 12:28:08.869528 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49a751b6_3e73_4b99_8407_386e287fedf3.slice/crio-477baa5b866900c77d7155feb1fdd4c8d49782c5070f61f31e4d877693975d80 WatchSource:0}: Error finding container 477baa5b866900c77d7155feb1fdd4c8d49782c5070f61f31e4d877693975d80: Status 404 returned error can't find the container with id 477baa5b866900c77d7155feb1fdd4c8d49782c5070f61f31e4d877693975d80 Oct 02 12:28:08 crc kubenswrapper[4766]: I1002 12:28:08.932467 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-6fx7r"] Oct 02 12:28:08 crc kubenswrapper[4766]: W1002 12:28:08.941294 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9af48511_50fd_409c_a582_1473fc1776cf.slice/crio-600801312a23283956d8e7815fc6253ac1332701a851238cd82b92991c4dd69c WatchSource:0}: Error finding container 600801312a23283956d8e7815fc6253ac1332701a851238cd82b92991c4dd69c: Status 404 returned error can't find the container with id 600801312a23283956d8e7815fc6253ac1332701a851238cd82b92991c4dd69c Oct 02 12:28:09 crc kubenswrapper[4766]: I1002 12:28:09.348744 4766 generic.go:334] "Generic (PLEG): container finished" podID="49a751b6-3e73-4b99-8407-386e287fedf3" containerID="f46e2edf4c77ade8fba78bc0c90166d0c016f8c08e45ecd4e15018bbcc0f4a54" exitCode=0 Oct 02 12:28:09 crc kubenswrapper[4766]: I1002 12:28:09.348857 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-db-create-v5nvm" event={"ID":"49a751b6-3e73-4b99-8407-386e287fedf3","Type":"ContainerDied","Data":"f46e2edf4c77ade8fba78bc0c90166d0c016f8c08e45ecd4e15018bbcc0f4a54"} Oct 02 12:28:09 crc kubenswrapper[4766]: I1002 12:28:09.348895 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-v5nvm" event={"ID":"49a751b6-3e73-4b99-8407-386e287fedf3","Type":"ContainerStarted","Data":"477baa5b866900c77d7155feb1fdd4c8d49782c5070f61f31e4d877693975d80"} Oct 02 12:28:09 crc kubenswrapper[4766]: I1002 12:28:09.353402 4766 generic.go:334] "Generic (PLEG): container finished" podID="c7b8b672-6952-4faf-9a27-61e46932d297" containerID="e4db6605cb0b202a8d0c73618d6578faebc86fb1850d5eb2de0a0fe56fe93bc3" exitCode=0 Oct 02 12:28:09 crc kubenswrapper[4766]: I1002 12:28:09.353605 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2mjk4" event={"ID":"c7b8b672-6952-4faf-9a27-61e46932d297","Type":"ContainerDied","Data":"e4db6605cb0b202a8d0c73618d6578faebc86fb1850d5eb2de0a0fe56fe93bc3"} Oct 02 12:28:09 crc kubenswrapper[4766]: I1002 12:28:09.353689 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2mjk4" event={"ID":"c7b8b672-6952-4faf-9a27-61e46932d297","Type":"ContainerStarted","Data":"38054cd4b17b95f1eabba5a385920b8322f5605b04435778a0c60b5c4ef9b97b"} Oct 02 12:28:09 crc kubenswrapper[4766]: I1002 12:28:09.355818 4766 generic.go:334] "Generic (PLEG): container finished" podID="9af48511-50fd-409c-a582-1473fc1776cf" containerID="06e3ff123212eadc9f53fbf6cb66440a39104196b857c43e66473f5c01dc9c49" exitCode=0 Oct 02 12:28:09 crc kubenswrapper[4766]: I1002 12:28:09.355915 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6fx7r" event={"ID":"9af48511-50fd-409c-a582-1473fc1776cf","Type":"ContainerDied","Data":"06e3ff123212eadc9f53fbf6cb66440a39104196b857c43e66473f5c01dc9c49"} Oct 02 12:28:09 crc kubenswrapper[4766]: I1002 12:28:09.356081 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6fx7r" event={"ID":"9af48511-50fd-409c-a582-1473fc1776cf","Type":"ContainerStarted","Data":"600801312a23283956d8e7815fc6253ac1332701a851238cd82b92991c4dd69c"} Oct 02 12:28:09 crc kubenswrapper[4766]: I1002 12:28:09.881805 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:28:09 crc kubenswrapper[4766]: E1002 12:28:09.882093 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:28:10 crc kubenswrapper[4766]: I1002 12:28:10.826647 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-2mjk4" Oct 02 12:28:10 crc kubenswrapper[4766]: I1002 12:28:10.833258 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-v5nvm" Oct 02 12:28:10 crc kubenswrapper[4766]: I1002 12:28:10.851245 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-6fx7r" Oct 02 12:28:10 crc kubenswrapper[4766]: I1002 12:28:10.974334 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54rbf\" (UniqueName: \"kubernetes.io/projected/9af48511-50fd-409c-a582-1473fc1776cf-kube-api-access-54rbf\") pod \"9af48511-50fd-409c-a582-1473fc1776cf\" (UID: \"9af48511-50fd-409c-a582-1473fc1776cf\") " Oct 02 12:28:10 crc kubenswrapper[4766]: I1002 12:28:10.974400 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss6t6\" (UniqueName: \"kubernetes.io/projected/49a751b6-3e73-4b99-8407-386e287fedf3-kube-api-access-ss6t6\") pod \"49a751b6-3e73-4b99-8407-386e287fedf3\" (UID: \"49a751b6-3e73-4b99-8407-386e287fedf3\") " Oct 02 12:28:10 crc kubenswrapper[4766]: I1002 12:28:10.974542 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lbws\" (UniqueName: \"kubernetes.io/projected/c7b8b672-6952-4faf-9a27-61e46932d297-kube-api-access-6lbws\") pod \"c7b8b672-6952-4faf-9a27-61e46932d297\" (UID: \"c7b8b672-6952-4faf-9a27-61e46932d297\") " Oct 02 12:28:10 crc kubenswrapper[4766]: I1002 12:28:10.983324 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49a751b6-3e73-4b99-8407-386e287fedf3-kube-api-access-ss6t6" (OuterVolumeSpecName: "kube-api-access-ss6t6") pod "49a751b6-3e73-4b99-8407-386e287fedf3" (UID: "49a751b6-3e73-4b99-8407-386e287fedf3"). InnerVolumeSpecName "kube-api-access-ss6t6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:28:10 crc kubenswrapper[4766]: I1002 12:28:10.983498 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9af48511-50fd-409c-a582-1473fc1776cf-kube-api-access-54rbf" (OuterVolumeSpecName: "kube-api-access-54rbf") pod "9af48511-50fd-409c-a582-1473fc1776cf" (UID: "9af48511-50fd-409c-a582-1473fc1776cf"). InnerVolumeSpecName "kube-api-access-54rbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:28:10 crc kubenswrapper[4766]: I1002 12:28:10.983744 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7b8b672-6952-4faf-9a27-61e46932d297-kube-api-access-6lbws" (OuterVolumeSpecName: "kube-api-access-6lbws") pod "c7b8b672-6952-4faf-9a27-61e46932d297" (UID: "c7b8b672-6952-4faf-9a27-61e46932d297"). InnerVolumeSpecName "kube-api-access-6lbws". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:28:11 crc kubenswrapper[4766]: I1002 12:28:11.077838 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lbws\" (UniqueName: \"kubernetes.io/projected/c7b8b672-6952-4faf-9a27-61e46932d297-kube-api-access-6lbws\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:11 crc kubenswrapper[4766]: I1002 12:28:11.077899 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54rbf\" (UniqueName: \"kubernetes.io/projected/9af48511-50fd-409c-a582-1473fc1776cf-kube-api-access-54rbf\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:11 crc kubenswrapper[4766]: I1002 12:28:11.077928 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss6t6\" (UniqueName: \"kubernetes.io/projected/49a751b6-3e73-4b99-8407-386e287fedf3-kube-api-access-ss6t6\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:11 crc kubenswrapper[4766]: I1002 12:28:11.388779 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-6fx7r" event={"ID":"9af48511-50fd-409c-a582-1473fc1776cf","Type":"ContainerDied","Data":"600801312a23283956d8e7815fc6253ac1332701a851238cd82b92991c4dd69c"} Oct 02 12:28:11 crc kubenswrapper[4766]: I1002 12:28:11.388838 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="600801312a23283956d8e7815fc6253ac1332701a851238cd82b92991c4dd69c" Oct 02 12:28:11 crc kubenswrapper[4766]: I1002 12:28:11.388913 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-6fx7r" Oct 02 12:28:11 crc kubenswrapper[4766]: I1002 12:28:11.394939 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-v5nvm" Oct 02 12:28:11 crc kubenswrapper[4766]: I1002 12:28:11.394968 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-v5nvm" event={"ID":"49a751b6-3e73-4b99-8407-386e287fedf3","Type":"ContainerDied","Data":"477baa5b866900c77d7155feb1fdd4c8d49782c5070f61f31e4d877693975d80"} Oct 02 12:28:11 crc kubenswrapper[4766]: I1002 12:28:11.395011 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="477baa5b866900c77d7155feb1fdd4c8d49782c5070f61f31e4d877693975d80" Oct 02 12:28:11 crc kubenswrapper[4766]: I1002 12:28:11.400164 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-2mjk4" event={"ID":"c7b8b672-6952-4faf-9a27-61e46932d297","Type":"ContainerDied","Data":"38054cd4b17b95f1eabba5a385920b8322f5605b04435778a0c60b5c4ef9b97b"} Oct 02 12:28:11 crc kubenswrapper[4766]: I1002 12:28:11.400209 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-2mjk4" Oct 02 12:28:11 crc kubenswrapper[4766]: I1002 12:28:11.400234 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38054cd4b17b95f1eabba5a385920b8322f5605b04435778a0c60b5c4ef9b97b" Oct 02 12:28:11 crc kubenswrapper[4766]: E1002 12:28:11.597773 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7b8b672_6952_4faf_9a27_61e46932d297.slice/crio-38054cd4b17b95f1eabba5a385920b8322f5605b04435778a0c60b5c4ef9b97b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7b8b672_6952_4faf_9a27_61e46932d297.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49a751b6_3e73_4b99_8407_386e287fedf3.slice\": RecentStats: unable to find data in memory cache]" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.139620 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-af55-account-create-cqr4r"] Oct 02 12:28:18 crc kubenswrapper[4766]: E1002 12:28:18.143747 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9af48511-50fd-409c-a582-1473fc1776cf" containerName="mariadb-database-create" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.143773 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9af48511-50fd-409c-a582-1473fc1776cf" containerName="mariadb-database-create" Oct 02 12:28:18 crc kubenswrapper[4766]: E1002 12:28:18.143802 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49a751b6-3e73-4b99-8407-386e287fedf3" containerName="mariadb-database-create" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.143811 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="49a751b6-3e73-4b99-8407-386e287fedf3" containerName="mariadb-database-create" Oct 02 12:28:18 crc kubenswrapper[4766]: E1002 12:28:18.143833 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7b8b672-6952-4faf-9a27-61e46932d297" containerName="mariadb-database-create" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.143842 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7b8b672-6952-4faf-9a27-61e46932d297" containerName="mariadb-database-create" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.144131 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="49a751b6-3e73-4b99-8407-386e287fedf3" containerName="mariadb-database-create" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.144154 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7b8b672-6952-4faf-9a27-61e46932d297" containerName="mariadb-database-create" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.144173 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9af48511-50fd-409c-a582-1473fc1776cf" containerName="mariadb-database-create" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.145162 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-af55-account-create-cqr4r" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.148667 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.157152 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-af55-account-create-cqr4r"] Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.250611 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chkwh\" (UniqueName: \"kubernetes.io/projected/678b54d8-3db1-45ac-b2bc-85bbf874b697-kube-api-access-chkwh\") pod \"nova-api-af55-account-create-cqr4r\" (UID: \"678b54d8-3db1-45ac-b2bc-85bbf874b697\") " pod="openstack/nova-api-af55-account-create-cqr4r" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.341974 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9040-account-create-gvdld"] Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.347180 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9040-account-create-gvdld" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.350769 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.352563 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chkwh\" (UniqueName: \"kubernetes.io/projected/678b54d8-3db1-45ac-b2bc-85bbf874b697-kube-api-access-chkwh\") pod \"nova-api-af55-account-create-cqr4r\" (UID: \"678b54d8-3db1-45ac-b2bc-85bbf874b697\") " pod="openstack/nova-api-af55-account-create-cqr4r" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.360376 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9040-account-create-gvdld"] Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.400002 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chkwh\" (UniqueName: \"kubernetes.io/projected/678b54d8-3db1-45ac-b2bc-85bbf874b697-kube-api-access-chkwh\") pod \"nova-api-af55-account-create-cqr4r\" (UID: \"678b54d8-3db1-45ac-b2bc-85bbf874b697\") " pod="openstack/nova-api-af55-account-create-cqr4r" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.455067 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sbr8\" (UniqueName: \"kubernetes.io/projected/0951018c-a2da-4b4e-9855-5f8c01e996d3-kube-api-access-2sbr8\") pod \"nova-cell0-9040-account-create-gvdld\" (UID: \"0951018c-a2da-4b4e-9855-5f8c01e996d3\") " pod="openstack/nova-cell0-9040-account-create-gvdld" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.475549 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-af55-account-create-cqr4r" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.534107 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-f7c6-account-create-ksxjs"] Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.535812 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f7c6-account-create-ksxjs" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.539690 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.546415 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f7c6-account-create-ksxjs"] Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.557735 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sbr8\" (UniqueName: \"kubernetes.io/projected/0951018c-a2da-4b4e-9855-5f8c01e996d3-kube-api-access-2sbr8\") pod \"nova-cell0-9040-account-create-gvdld\" (UID: \"0951018c-a2da-4b4e-9855-5f8c01e996d3\") " pod="openstack/nova-cell0-9040-account-create-gvdld" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.580254 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sbr8\" (UniqueName: \"kubernetes.io/projected/0951018c-a2da-4b4e-9855-5f8c01e996d3-kube-api-access-2sbr8\") pod \"nova-cell0-9040-account-create-gvdld\" (UID: \"0951018c-a2da-4b4e-9855-5f8c01e996d3\") " pod="openstack/nova-cell0-9040-account-create-gvdld" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.660942 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7mxz\" (UniqueName: \"kubernetes.io/projected/c777bfde-3b16-4cd1-865a-910c60753ba3-kube-api-access-v7mxz\") pod \"nova-cell1-f7c6-account-create-ksxjs\" (UID: \"c777bfde-3b16-4cd1-865a-910c60753ba3\") " pod="openstack/nova-cell1-f7c6-account-create-ksxjs" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.676194 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9040-account-create-gvdld" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.763415 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7mxz\" (UniqueName: \"kubernetes.io/projected/c777bfde-3b16-4cd1-865a-910c60753ba3-kube-api-access-v7mxz\") pod \"nova-cell1-f7c6-account-create-ksxjs\" (UID: \"c777bfde-3b16-4cd1-865a-910c60753ba3\") " pod="openstack/nova-cell1-f7c6-account-create-ksxjs" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.792472 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7mxz\" (UniqueName: \"kubernetes.io/projected/c777bfde-3b16-4cd1-865a-910c60753ba3-kube-api-access-v7mxz\") pod \"nova-cell1-f7c6-account-create-ksxjs\" (UID: \"c777bfde-3b16-4cd1-865a-910c60753ba3\") " pod="openstack/nova-cell1-f7c6-account-create-ksxjs" Oct 02 12:28:18 crc kubenswrapper[4766]: I1002 12:28:18.957223 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f7c6-account-create-ksxjs" Oct 02 12:28:19 crc kubenswrapper[4766]: I1002 12:28:19.012487 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-af55-account-create-cqr4r"] Oct 02 12:28:19 crc kubenswrapper[4766]: I1002 12:28:19.173334 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9040-account-create-gvdld"] Oct 02 12:28:19 crc kubenswrapper[4766]: I1002 12:28:19.491953 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9040-account-create-gvdld" event={"ID":"0951018c-a2da-4b4e-9855-5f8c01e996d3","Type":"ContainerStarted","Data":"c05146ce2eae86aef129757af364beaf1fefdc130f2c1d0519bd2d957200167f"} Oct 02 12:28:19 crc kubenswrapper[4766]: I1002 12:28:19.492017 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9040-account-create-gvdld" event={"ID":"0951018c-a2da-4b4e-9855-5f8c01e996d3","Type":"ContainerStarted","Data":"2d6d159784cd88f6b3eb5bb0d3be8c28668d5ff8073c48b009848269e328c2db"} Oct 02 12:28:19 crc kubenswrapper[4766]: I1002 12:28:19.495814 4766 generic.go:334] "Generic (PLEG): container finished" podID="678b54d8-3db1-45ac-b2bc-85bbf874b697" containerID="8268d2fecded465798967c832ac0112aa25fffc8c1535460c33d506ba344a174" exitCode=0 Oct 02 12:28:19 crc kubenswrapper[4766]: I1002 12:28:19.495940 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-af55-account-create-cqr4r" event={"ID":"678b54d8-3db1-45ac-b2bc-85bbf874b697","Type":"ContainerDied","Data":"8268d2fecded465798967c832ac0112aa25fffc8c1535460c33d506ba344a174"} Oct 02 12:28:19 crc kubenswrapper[4766]: I1002 12:28:19.496021 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-af55-account-create-cqr4r" event={"ID":"678b54d8-3db1-45ac-b2bc-85bbf874b697","Type":"ContainerStarted","Data":"9bc56646bcab3d8d35eecc8addc2d2449cfcf10b94795464c59f67de91669507"} Oct 02 12:28:19 crc kubenswrapper[4766]: I1002 12:28:19.499975 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-f7c6-account-create-ksxjs"] Oct 02 12:28:19 crc kubenswrapper[4766]: W1002 12:28:19.504040 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc777bfde_3b16_4cd1_865a_910c60753ba3.slice/crio-edb68bac996c05b7ce5d5a74f2b905fe17d571aa054454cc2902a64eb2655a89 WatchSource:0}: Error finding container edb68bac996c05b7ce5d5a74f2b905fe17d571aa054454cc2902a64eb2655a89: Status 404 returned error can't find the container with id edb68bac996c05b7ce5d5a74f2b905fe17d571aa054454cc2902a64eb2655a89 Oct 02 12:28:19 crc kubenswrapper[4766]: I1002 12:28:19.550011 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-9040-account-create-gvdld" podStartSLOduration=1.549972203 podStartE2EDuration="1.549972203s" podCreationTimestamp="2025-10-02 12:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:19.510816119 +0000 UTC m=+5814.453687073" watchObservedRunningTime="2025-10-02 12:28:19.549972203 +0000 UTC m=+5814.492843157" Oct 02 12:28:20 crc kubenswrapper[4766]: I1002 12:28:20.507056 4766 generic.go:334] "Generic (PLEG): container finished" podID="0951018c-a2da-4b4e-9855-5f8c01e996d3" containerID="c05146ce2eae86aef129757af364beaf1fefdc130f2c1d0519bd2d957200167f" exitCode=0 Oct 02 12:28:20 crc kubenswrapper[4766]: I1002 
12:28:20.507118 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9040-account-create-gvdld" event={"ID":"0951018c-a2da-4b4e-9855-5f8c01e996d3","Type":"ContainerDied","Data":"c05146ce2eae86aef129757af364beaf1fefdc130f2c1d0519bd2d957200167f"} Oct 02 12:28:20 crc kubenswrapper[4766]: I1002 12:28:20.512021 4766 generic.go:334] "Generic (PLEG): container finished" podID="c777bfde-3b16-4cd1-865a-910c60753ba3" containerID="2ba49f0fd3fa684c911d44383a7d98531f824028e5391c320ae4f9bd49a1150c" exitCode=0 Oct 02 12:28:20 crc kubenswrapper[4766]: I1002 12:28:20.512101 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f7c6-account-create-ksxjs" event={"ID":"c777bfde-3b16-4cd1-865a-910c60753ba3","Type":"ContainerDied","Data":"2ba49f0fd3fa684c911d44383a7d98531f824028e5391c320ae4f9bd49a1150c"} Oct 02 12:28:20 crc kubenswrapper[4766]: I1002 12:28:20.512157 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f7c6-account-create-ksxjs" event={"ID":"c777bfde-3b16-4cd1-865a-910c60753ba3","Type":"ContainerStarted","Data":"edb68bac996c05b7ce5d5a74f2b905fe17d571aa054454cc2902a64eb2655a89"} Oct 02 12:28:20 crc kubenswrapper[4766]: I1002 12:28:20.875392 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-af55-account-create-cqr4r" Oct 02 12:28:21 crc kubenswrapper[4766]: I1002 12:28:21.016412 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chkwh\" (UniqueName: \"kubernetes.io/projected/678b54d8-3db1-45ac-b2bc-85bbf874b697-kube-api-access-chkwh\") pod \"678b54d8-3db1-45ac-b2bc-85bbf874b697\" (UID: \"678b54d8-3db1-45ac-b2bc-85bbf874b697\") " Oct 02 12:28:21 crc kubenswrapper[4766]: I1002 12:28:21.024382 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/678b54d8-3db1-45ac-b2bc-85bbf874b697-kube-api-access-chkwh" (OuterVolumeSpecName: "kube-api-access-chkwh") pod "678b54d8-3db1-45ac-b2bc-85bbf874b697" (UID: "678b54d8-3db1-45ac-b2bc-85bbf874b697"). InnerVolumeSpecName "kube-api-access-chkwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:28:21 crc kubenswrapper[4766]: I1002 12:28:21.119897 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chkwh\" (UniqueName: \"kubernetes.io/projected/678b54d8-3db1-45ac-b2bc-85bbf874b697-kube-api-access-chkwh\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:21 crc kubenswrapper[4766]: I1002 12:28:21.525052 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-af55-account-create-cqr4r" Oct 02 12:28:21 crc kubenswrapper[4766]: I1002 12:28:21.529685 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-af55-account-create-cqr4r" event={"ID":"678b54d8-3db1-45ac-b2bc-85bbf874b697","Type":"ContainerDied","Data":"9bc56646bcab3d8d35eecc8addc2d2449cfcf10b94795464c59f67de91669507"} Oct 02 12:28:21 crc kubenswrapper[4766]: I1002 12:28:21.529769 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bc56646bcab3d8d35eecc8addc2d2449cfcf10b94795464c59f67de91669507" Oct 02 12:28:22 crc kubenswrapper[4766]: I1002 12:28:21.997796 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9040-account-create-gvdld" Oct 02 12:28:22 crc kubenswrapper[4766]: I1002 12:28:22.005417 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f7c6-account-create-ksxjs" Oct 02 12:28:22 crc kubenswrapper[4766]: I1002 12:28:22.145462 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7mxz\" (UniqueName: \"kubernetes.io/projected/c777bfde-3b16-4cd1-865a-910c60753ba3-kube-api-access-v7mxz\") pod \"c777bfde-3b16-4cd1-865a-910c60753ba3\" (UID: \"c777bfde-3b16-4cd1-865a-910c60753ba3\") " Oct 02 12:28:22 crc kubenswrapper[4766]: I1002 12:28:22.145590 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sbr8\" (UniqueName: \"kubernetes.io/projected/0951018c-a2da-4b4e-9855-5f8c01e996d3-kube-api-access-2sbr8\") pod \"0951018c-a2da-4b4e-9855-5f8c01e996d3\" (UID: \"0951018c-a2da-4b4e-9855-5f8c01e996d3\") " Oct 02 12:28:22 crc kubenswrapper[4766]: I1002 12:28:22.153611 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0951018c-a2da-4b4e-9855-5f8c01e996d3-kube-api-access-2sbr8" (OuterVolumeSpecName: "kube-api-access-2sbr8") pod "0951018c-a2da-4b4e-9855-5f8c01e996d3" (UID: "0951018c-a2da-4b4e-9855-5f8c01e996d3"). InnerVolumeSpecName "kube-api-access-2sbr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:28:22 crc kubenswrapper[4766]: I1002 12:28:22.157556 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c777bfde-3b16-4cd1-865a-910c60753ba3-kube-api-access-v7mxz" (OuterVolumeSpecName: "kube-api-access-v7mxz") pod "c777bfde-3b16-4cd1-865a-910c60753ba3" (UID: "c777bfde-3b16-4cd1-865a-910c60753ba3"). InnerVolumeSpecName "kube-api-access-v7mxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:28:22 crc kubenswrapper[4766]: I1002 12:28:22.247985 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7mxz\" (UniqueName: \"kubernetes.io/projected/c777bfde-3b16-4cd1-865a-910c60753ba3-kube-api-access-v7mxz\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:22 crc kubenswrapper[4766]: I1002 12:28:22.248030 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sbr8\" (UniqueName: \"kubernetes.io/projected/0951018c-a2da-4b4e-9855-5f8c01e996d3-kube-api-access-2sbr8\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:22 crc kubenswrapper[4766]: I1002 12:28:22.540424 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-f7c6-account-create-ksxjs" Oct 02 12:28:22 crc kubenswrapper[4766]: I1002 12:28:22.540411 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-f7c6-account-create-ksxjs" event={"ID":"c777bfde-3b16-4cd1-865a-910c60753ba3","Type":"ContainerDied","Data":"edb68bac996c05b7ce5d5a74f2b905fe17d571aa054454cc2902a64eb2655a89"} Oct 02 12:28:22 crc kubenswrapper[4766]: I1002 12:28:22.540536 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edb68bac996c05b7ce5d5a74f2b905fe17d571aa054454cc2902a64eb2655a89" Oct 02 12:28:22 crc kubenswrapper[4766]: I1002 12:28:22.543113 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9040-account-create-gvdld" event={"ID":"0951018c-a2da-4b4e-9855-5f8c01e996d3","Type":"ContainerDied","Data":"2d6d159784cd88f6b3eb5bb0d3be8c28668d5ff8073c48b009848269e328c2db"} Oct 02 12:28:22 crc kubenswrapper[4766]: I1002 12:28:22.543170 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d6d159784cd88f6b3eb5bb0d3be8c28668d5ff8073c48b009848269e328c2db" Oct 02 12:28:22 crc kubenswrapper[4766]: I1002 12:28:22.543268 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9040-account-create-gvdld" Oct 02 12:28:22 crc kubenswrapper[4766]: I1002 12:28:22.881090 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:28:22 crc kubenswrapper[4766]: E1002 12:28:22.882060 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.579626 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q6f4d"] Oct 02 12:28:23 crc kubenswrapper[4766]: E1002 12:28:23.580254 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="678b54d8-3db1-45ac-b2bc-85bbf874b697" containerName="mariadb-account-create" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.580280 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="678b54d8-3db1-45ac-b2bc-85bbf874b697" containerName="mariadb-account-create" Oct 02 12:28:23 crc kubenswrapper[4766]: E1002 12:28:23.580306 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c777bfde-3b16-4cd1-865a-910c60753ba3" containerName="mariadb-account-create" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.580313 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c777bfde-3b16-4cd1-865a-910c60753ba3" containerName="mariadb-account-create" Oct 02 12:28:23 crc kubenswrapper[4766]: E1002 12:28:23.580346 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0951018c-a2da-4b4e-9855-5f8c01e996d3" containerName="mariadb-account-create" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.580355 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0951018c-a2da-4b4e-9855-5f8c01e996d3" containerName="mariadb-account-create" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.580756 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c777bfde-3b16-4cd1-865a-910c60753ba3" containerName="mariadb-account-create" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.580818 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0951018c-a2da-4b4e-9855-5f8c01e996d3" containerName="mariadb-account-create" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.580836 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="678b54d8-3db1-45ac-b2bc-85bbf874b697" containerName="mariadb-account-create" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.581726 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-q6f4d" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.588356 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.588795 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rxn6g" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.589181 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.589220 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q6f4d"] Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.679658 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwp29\" (UniqueName: \"kubernetes.io/projected/38bacb80-6110-42c7-9923-95cebf834ef0-kube-api-access-gwp29\") pod \"nova-cell0-conductor-db-sync-q6f4d\" (UID: \"38bacb80-6110-42c7-9923-95cebf834ef0\") " pod="openstack/nova-cell0-conductor-db-sync-q6f4d" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.679754 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-config-data\") pod \"nova-cell0-conductor-db-sync-q6f4d\" (UID: \"38bacb80-6110-42c7-9923-95cebf834ef0\") " pod="openstack/nova-cell0-conductor-db-sync-q6f4d" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.679845 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-scripts\") pod \"nova-cell0-conductor-db-sync-q6f4d\" (UID: \"38bacb80-6110-42c7-9923-95cebf834ef0\") " pod="openstack/nova-cell0-conductor-db-sync-q6f4d" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.679883 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-q6f4d\" (UID: \"38bacb80-6110-42c7-9923-95cebf834ef0\") " pod="openstack/nova-cell0-conductor-db-sync-q6f4d" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.782009 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwp29\" (UniqueName: \"kubernetes.io/projected/38bacb80-6110-42c7-9923-95cebf834ef0-kube-api-access-gwp29\") pod \"nova-cell0-conductor-db-sync-q6f4d\" (UID: \"38bacb80-6110-42c7-9923-95cebf834ef0\") " pod="openstack/nova-cell0-conductor-db-sync-q6f4d" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.782093 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-config-data\") pod \"nova-cell0-conductor-db-sync-q6f4d\" (UID: \"38bacb80-6110-42c7-9923-95cebf834ef0\") " pod="openstack/nova-cell0-conductor-db-sync-q6f4d" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.782157 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-scripts\") pod \"nova-cell0-conductor-db-sync-q6f4d\" (UID: \"38bacb80-6110-42c7-9923-95cebf834ef0\") " pod="openstack/nova-cell0-conductor-db-sync-q6f4d" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.784332 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-q6f4d\" (UID: \"38bacb80-6110-42c7-9923-95cebf834ef0\") " pod="openstack/nova-cell0-conductor-db-sync-q6f4d" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.786500 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-scripts\") pod \"nova-cell0-conductor-db-sync-q6f4d\" (UID: \"38bacb80-6110-42c7-9923-95cebf834ef0\") " pod="openstack/nova-cell0-conductor-db-sync-q6f4d" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.788151 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-config-data\") pod \"nova-cell0-conductor-db-sync-q6f4d\" (UID: \"38bacb80-6110-42c7-9923-95cebf834ef0\") " pod="openstack/nova-cell0-conductor-db-sync-q6f4d" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.797389 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-q6f4d\" (UID: \"38bacb80-6110-42c7-9923-95cebf834ef0\") " pod="openstack/nova-cell0-conductor-db-sync-q6f4d" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.809919 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwp29\" (UniqueName: \"kubernetes.io/projected/38bacb80-6110-42c7-9923-95cebf834ef0-kube-api-access-gwp29\") pod \"nova-cell0-conductor-db-sync-q6f4d\" (UID: \"38bacb80-6110-42c7-9923-95cebf834ef0\") " pod="openstack/nova-cell0-conductor-db-sync-q6f4d" Oct 02 12:28:23 crc kubenswrapper[4766]: I1002 12:28:23.911038 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-q6f4d" Oct 02 12:28:24 crc kubenswrapper[4766]: I1002 12:28:24.539755 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q6f4d"] Oct 02 12:28:24 crc kubenswrapper[4766]: W1002 12:28:24.544224 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38bacb80_6110_42c7_9923_95cebf834ef0.slice/crio-0552b521e768358f249067733f76ecd85826a6c543b43c9df3e396d4dda30c25 WatchSource:0}: Error finding container 0552b521e768358f249067733f76ecd85826a6c543b43c9df3e396d4dda30c25: Status 404 returned error can't find the container with id 0552b521e768358f249067733f76ecd85826a6c543b43c9df3e396d4dda30c25 Oct 02 12:28:24 crc kubenswrapper[4766]: I1002 12:28:24.582653 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-q6f4d" event={"ID":"38bacb80-6110-42c7-9923-95cebf834ef0","Type":"ContainerStarted","Data":"0552b521e768358f249067733f76ecd85826a6c543b43c9df3e396d4dda30c25"} Oct 02 12:28:25 crc kubenswrapper[4766]: I1002 12:28:25.597926 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-q6f4d" event={"ID":"38bacb80-6110-42c7-9923-95cebf834ef0","Type":"ContainerStarted","Data":"34e4dafcd44066e3a05bfea9b089fa167ca5a4f7edec9a3214eea6ed2895a760"} Oct 02 12:28:25 crc kubenswrapper[4766]: I1002 12:28:25.631037 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-q6f4d" podStartSLOduration=2.631008478 podStartE2EDuration="2.631008478s" podCreationTimestamp="2025-10-02 12:28:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:25.620522802 +0000 UTC m=+5820.563393756" watchObservedRunningTime="2025-10-02 12:28:25.631008478 +0000 UTC m=+5820.573879432" Oct 02 12:28:31 crc kubenswrapper[4766]: I1002 12:28:31.665951 4766 generic.go:334] "Generic (PLEG): container finished" podID="38bacb80-6110-42c7-9923-95cebf834ef0" containerID="34e4dafcd44066e3a05bfea9b089fa167ca5a4f7edec9a3214eea6ed2895a760" exitCode=0 Oct 02 12:28:31 crc kubenswrapper[4766]: I1002 12:28:31.666062 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-q6f4d" event={"ID":"38bacb80-6110-42c7-9923-95cebf834ef0","Type":"ContainerDied","Data":"34e4dafcd44066e3a05bfea9b089fa167ca5a4f7edec9a3214eea6ed2895a760"} Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.036739 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-q6f4d" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.090810 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-config-data\") pod \"38bacb80-6110-42c7-9923-95cebf834ef0\" (UID: \"38bacb80-6110-42c7-9923-95cebf834ef0\") " Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.090981 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-combined-ca-bundle\") pod \"38bacb80-6110-42c7-9923-95cebf834ef0\" (UID: \"38bacb80-6110-42c7-9923-95cebf834ef0\") " Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.091073 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-scripts\") pod \"38bacb80-6110-42c7-9923-95cebf834ef0\" (UID: \"38bacb80-6110-42c7-9923-95cebf834ef0\") " Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.091180 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwp29\" (UniqueName: \"kubernetes.io/projected/38bacb80-6110-42c7-9923-95cebf834ef0-kube-api-access-gwp29\") pod \"38bacb80-6110-42c7-9923-95cebf834ef0\" (UID: \"38bacb80-6110-42c7-9923-95cebf834ef0\") " Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.103998 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-scripts" (OuterVolumeSpecName: "scripts") pod "38bacb80-6110-42c7-9923-95cebf834ef0" (UID: "38bacb80-6110-42c7-9923-95cebf834ef0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.108886 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bacb80-6110-42c7-9923-95cebf834ef0-kube-api-access-gwp29" (OuterVolumeSpecName: "kube-api-access-gwp29") pod "38bacb80-6110-42c7-9923-95cebf834ef0" (UID: "38bacb80-6110-42c7-9923-95cebf834ef0"). InnerVolumeSpecName "kube-api-access-gwp29". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.119710 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-config-data" (OuterVolumeSpecName: "config-data") pod "38bacb80-6110-42c7-9923-95cebf834ef0" (UID: "38bacb80-6110-42c7-9923-95cebf834ef0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.135782 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38bacb80-6110-42c7-9923-95cebf834ef0" (UID: "38bacb80-6110-42c7-9923-95cebf834ef0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.194399 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.194441 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.194454 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38bacb80-6110-42c7-9923-95cebf834ef0-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.194462 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwp29\" (UniqueName: \"kubernetes.io/projected/38bacb80-6110-42c7-9923-95cebf834ef0-kube-api-access-gwp29\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.695995 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-q6f4d" event={"ID":"38bacb80-6110-42c7-9923-95cebf834ef0","Type":"ContainerDied","Data":"0552b521e768358f249067733f76ecd85826a6c543b43c9df3e396d4dda30c25"} Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.696437 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0552b521e768358f249067733f76ecd85826a6c543b43c9df3e396d4dda30c25" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.696115 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-q6f4d" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.774877 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 12:28:33 crc kubenswrapper[4766]: E1002 12:28:33.777318 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38bacb80-6110-42c7-9923-95cebf834ef0" containerName="nova-cell0-conductor-db-sync" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.777346 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="38bacb80-6110-42c7-9923-95cebf834ef0" containerName="nova-cell0-conductor-db-sync" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.777582 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="38bacb80-6110-42c7-9923-95cebf834ef0" containerName="nova-cell0-conductor-db-sync" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.778375 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.780971 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.781052 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-rxn6g" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.796892 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.911328 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef91c640-4b08-416f-9ff0-d67fca7d5f22-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ef91c640-4b08-416f-9ff0-d67fca7d5f22\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.911781 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7pdz\" (UniqueName: \"kubernetes.io/projected/ef91c640-4b08-416f-9ff0-d67fca7d5f22-kube-api-access-z7pdz\") pod \"nova-cell0-conductor-0\" (UID: \"ef91c640-4b08-416f-9ff0-d67fca7d5f22\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:28:33 crc kubenswrapper[4766]: I1002 12:28:33.911975 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef91c640-4b08-416f-9ff0-d67fca7d5f22-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ef91c640-4b08-416f-9ff0-d67fca7d5f22\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:28:34 crc kubenswrapper[4766]: I1002 12:28:34.014535 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7pdz\" (UniqueName: \"kubernetes.io/projected/ef91c640-4b08-416f-9ff0-d67fca7d5f22-kube-api-access-z7pdz\") pod \"nova-cell0-conductor-0\" (UID: \"ef91c640-4b08-416f-9ff0-d67fca7d5f22\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:28:34 crc kubenswrapper[4766]: I1002 12:28:34.014623 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef91c640-4b08-416f-9ff0-d67fca7d5f22-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"ef91c640-4b08-416f-9ff0-d67fca7d5f22\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:28:34 crc kubenswrapper[4766]: I1002 12:28:34.014762 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef91c640-4b08-416f-9ff0-d67fca7d5f22-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ef91c640-4b08-416f-9ff0-d67fca7d5f22\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:28:34 crc kubenswrapper[4766]: I1002 12:28:34.019478 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef91c640-4b08-416f-9ff0-d67fca7d5f22-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"ef91c640-4b08-416f-9ff0-d67fca7d5f22\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:28:34 crc kubenswrapper[4766]: I1002 12:28:34.019554 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef91c640-4b08-416f-9ff0-d67fca7d5f22-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"ef91c640-4b08-416f-9ff0-d67fca7d5f22\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:28:34 crc kubenswrapper[4766]: I1002 12:28:34.032685 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7pdz\" (UniqueName: \"kubernetes.io/projected/ef91c640-4b08-416f-9ff0-d67fca7d5f22-kube-api-access-z7pdz\") pod \"nova-cell0-conductor-0\" (UID: \"ef91c640-4b08-416f-9ff0-d67fca7d5f22\") " pod="openstack/nova-cell0-conductor-0" Oct 02 12:28:34 crc kubenswrapper[4766]: I1002 12:28:34.100605 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 02 12:28:34 crc kubenswrapper[4766]: I1002 12:28:34.550798 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 12:28:34 crc kubenswrapper[4766]: I1002 12:28:34.707930 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ef91c640-4b08-416f-9ff0-d67fca7d5f22","Type":"ContainerStarted","Data":"597d60ddd2efd285977c9b039df25ec3af9a72bc6a070243c62492088a75f238"} Oct 02 12:28:35 crc kubenswrapper[4766]: I1002 12:28:35.724809 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ef91c640-4b08-416f-9ff0-d67fca7d5f22","Type":"ContainerStarted","Data":"f873a5f75544bc64020277313c7bd6a384b31cff1da482b92baa87b79a343ced"} Oct 02 12:28:35 crc kubenswrapper[4766]: I1002 12:28:35.728253 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 02 12:28:35 crc kubenswrapper[4766]: I1002 12:28:35.748767 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.748725076 podStartE2EDuration="2.748725076s" podCreationTimestamp="2025-10-02 12:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:35.74700789 +0000 UTC m=+5830.689878854" watchObservedRunningTime="2025-10-02 12:28:35.748725076 +0000 UTC m=+5830.691596020" Oct 02 12:28:35 crc kubenswrapper[4766]: I1002 12:28:35.929583 4766 scope.go:117] "RemoveContainer" containerID="851d271d3b577d2cd05e809b22f883abbdd3a463bfb55a2ccaf867cb76580e8e" Oct 02 12:28:35 crc kubenswrapper[4766]: I1002 12:28:35.980621 4766 scope.go:117] "RemoveContainer" containerID="e5aee622939845c48fb4972ca00ea09e0d15d693759974e33fe4b8b9f83162cb" Oct 02 12:28:36 crc kubenswrapper[4766]: I1002 12:28:36.005991 4766 scope.go:117] "RemoveContainer" containerID="8f1bf13d2285fd52e481a92f629e99b6e54210d8dfa78606bf4cd062ac7d88d8" Oct 02 12:28:36 crc kubenswrapper[4766]: I1002 12:28:36.882182 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:28:36 crc kubenswrapper[4766]: E1002 12:28:36.883026 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.133491 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 02 
12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.747998 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-gf5l8"] Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.749717 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gf5l8" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.753413 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.753976 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.760441 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gf5l8"] Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.848604 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-config-data\") pod \"nova-cell0-cell-mapping-gf5l8\" (UID: \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\") " pod="openstack/nova-cell0-cell-mapping-gf5l8" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.848779 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-scripts\") pod \"nova-cell0-cell-mapping-gf5l8\" (UID: \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\") " pod="openstack/nova-cell0-cell-mapping-gf5l8" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.848870 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnlz5\" (UniqueName: \"kubernetes.io/projected/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-kube-api-access-mnlz5\") pod \"nova-cell0-cell-mapping-gf5l8\" (UID: \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\") " pod="openstack/nova-cell0-cell-mapping-gf5l8" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.848908 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gf5l8\" (UID: \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\") " pod="openstack/nova-cell0-cell-mapping-gf5l8" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.897645 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.899375 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.902261 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.907389 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.952742 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnlz5\" (UniqueName: \"kubernetes.io/projected/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-kube-api-access-mnlz5\") pod \"nova-cell0-cell-mapping-gf5l8\" (UID: \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\") " pod="openstack/nova-cell0-cell-mapping-gf5l8" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.952812 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gf5l8\" (UID: \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\") " pod="openstack/nova-cell0-cell-mapping-gf5l8" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.952857 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5gg2\" (UniqueName: \"kubernetes.io/projected/3bfa9d25-deff-4655-a251-4583fdc8e04d-kube-api-access-f5gg2\") pod \"nova-scheduler-0\" (UID: \"3bfa9d25-deff-4655-a251-4583fdc8e04d\") " pod="openstack/nova-scheduler-0" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.952898 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfa9d25-deff-4655-a251-4583fdc8e04d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3bfa9d25-deff-4655-a251-4583fdc8e04d\") " pod="openstack/nova-scheduler-0" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.952977 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bfa9d25-deff-4655-a251-4583fdc8e04d-config-data\") pod \"nova-scheduler-0\" (UID: \"3bfa9d25-deff-4655-a251-4583fdc8e04d\") " pod="openstack/nova-scheduler-0" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.953013 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-config-data\") pod \"nova-cell0-cell-mapping-gf5l8\" (UID: \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\") " pod="openstack/nova-cell0-cell-mapping-gf5l8" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.953078 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-scripts\") pod \"nova-cell0-cell-mapping-gf5l8\" (UID: \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\") " pod="openstack/nova-cell0-cell-mapping-gf5l8" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.964880 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-scripts\") pod \"nova-cell0-cell-mapping-gf5l8\" (UID: \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\") " pod="openstack/nova-cell0-cell-mapping-gf5l8" Oct 02 12:28:39 crc kubenswrapper[4766]: I1002 12:28:39.989583 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnlz5\" (UniqueName: \"kubernetes.io/projected/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-kube-api-access-mnlz5\") pod \"nova-cell0-cell-mapping-gf5l8\" (UID: \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\") " pod="openstack/nova-cell0-cell-mapping-gf5l8" Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:39.997890 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-config-data\") pod \"nova-cell0-cell-mapping-gf5l8\" (UID: \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\") " pod="openstack/nova-cell0-cell-mapping-gf5l8" Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.005071 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.011440 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.013467 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gf5l8\" (UID: \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\") " pod="openstack/nova-cell0-cell-mapping-gf5l8" Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.021201 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.032247 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.055961 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxdg2\" (UniqueName: \"kubernetes.io/projected/4f37e167-d5f2-4e35-8727-2df39b70526d-kube-api-access-qxdg2\") pod \"nova-metadata-0\" (UID: \"4f37e167-d5f2-4e35-8727-2df39b70526d\") " pod="openstack/nova-metadata-0" Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.056061 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f37e167-d5f2-4e35-8727-2df39b70526d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f37e167-d5f2-4e35-8727-2df39b70526d\") " pod="openstack/nova-metadata-0" Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.056131 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5gg2\" (UniqueName: \"kubernetes.io/projected/3bfa9d25-deff-4655-a251-4583fdc8e04d-kube-api-access-f5gg2\") pod \"nova-scheduler-0\" (UID: \"3bfa9d25-deff-4655-a251-4583fdc8e04d\") " pod="openstack/nova-scheduler-0" Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.056198 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfa9d25-deff-4655-a251-4583fdc8e04d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3bfa9d25-deff-4655-a251-4583fdc8e04d\") " pod="openstack/nova-scheduler-0" Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.056257 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f37e167-d5f2-4e35-8727-2df39b70526d-logs\") pod \"nova-metadata-0\" (UID: 
\"4f37e167-d5f2-4e35-8727-2df39b70526d\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.056314 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bfa9d25-deff-4655-a251-4583fdc8e04d-config-data\") pod \"nova-scheduler-0\" (UID: \"3bfa9d25-deff-4655-a251-4583fdc8e04d\") " pod="openstack/nova-scheduler-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.056357 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f37e167-d5f2-4e35-8727-2df39b70526d-config-data\") pod \"nova-metadata-0\" (UID: \"4f37e167-d5f2-4e35-8727-2df39b70526d\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.079030 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.106005 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfa9d25-deff-4655-a251-4583fdc8e04d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3bfa9d25-deff-4655-a251-4583fdc8e04d\") " pod="openstack/nova-scheduler-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.110258 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bfa9d25-deff-4655-a251-4583fdc8e04d-config-data\") pod \"nova-scheduler-0\" (UID: \"3bfa9d25-deff-4655-a251-4583fdc8e04d\") " pod="openstack/nova-scheduler-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.139771 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gf5l8"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.173387 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f37e167-d5f2-4e35-8727-2df39b70526d-logs\") pod \"nova-metadata-0\" (UID: \"4f37e167-d5f2-4e35-8727-2df39b70526d\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.173514 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f37e167-d5f2-4e35-8727-2df39b70526d-config-data\") pod \"nova-metadata-0\" (UID: \"4f37e167-d5f2-4e35-8727-2df39b70526d\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.173574 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxdg2\" (UniqueName: \"kubernetes.io/projected/4f37e167-d5f2-4e35-8727-2df39b70526d-kube-api-access-qxdg2\") pod \"nova-metadata-0\" (UID: \"4f37e167-d5f2-4e35-8727-2df39b70526d\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.174596 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.175017 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f37e167-d5f2-4e35-8727-2df39b70526d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f37e167-d5f2-4e35-8727-2df39b70526d\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.180780 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f37e167-d5f2-4e35-8727-2df39b70526d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f37e167-d5f2-4e35-8727-2df39b70526d\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.181414 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f37e167-d5f2-4e35-8727-2df39b70526d-logs\") pod \"nova-metadata-0\" (UID: \"4f37e167-d5f2-4e35-8727-2df39b70526d\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.187040 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f37e167-d5f2-4e35-8727-2df39b70526d-config-data\") pod \"nova-metadata-0\" (UID: \"4f37e167-d5f2-4e35-8727-2df39b70526d\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.209892 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5gg2\" (UniqueName: \"kubernetes.io/projected/3bfa9d25-deff-4655-a251-4583fdc8e04d-kube-api-access-f5gg2\") pod \"nova-scheduler-0\" (UID: \"3bfa9d25-deff-4655-a251-4583fdc8e04d\") " pod="openstack/nova-scheduler-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.213373 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.252778 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.254231 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxdg2\" (UniqueName: \"kubernetes.io/projected/4f37e167-d5f2-4e35-8727-2df39b70526d-kube-api-access-qxdg2\") pod \"nova-metadata-0\" (UID: \"4f37e167-d5f2-4e35-8727-2df39b70526d\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.254791 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.278065 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.279279 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-logs\") pod \"nova-api-0\" (UID: \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\") " pod="openstack/nova-api-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.279410 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-config-data\") pod \"nova-api-0\" (UID: \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\") " pod="openstack/nova-api-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.279520 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\") " pod="openstack/nova-api-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.279625 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4d26\" (UniqueName: \"kubernetes.io/projected/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-kube-api-access-n4d26\") pod \"nova-api-0\" (UID: \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\") " pod="openstack/nova-api-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.297631 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.299160 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.306830 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.312057 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.325702 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-874ccdbdf-52m9h"]
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.328415 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.355994 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-874ccdbdf-52m9h"]
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.381649 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48da6d4f-236b-49df-a509-388267f8db55-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"48da6d4f-236b-49df-a509-388267f8db55\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.381772 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8pqd\" (UniqueName: \"kubernetes.io/projected/001f22a0-4fb3-4795-b1aa-9ac191cab329-kube-api-access-s8pqd\") pod \"dnsmasq-dns-874ccdbdf-52m9h\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.381810 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-logs\") pod \"nova-api-0\" (UID: \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\") " pod="openstack/nova-api-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.381839 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-ovsdbserver-nb\") pod \"dnsmasq-dns-874ccdbdf-52m9h\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.381873 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-config-data\") pod \"nova-api-0\" (UID: \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\") " pod="openstack/nova-api-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.381905 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\") " pod="openstack/nova-api-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.381928 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-dns-svc\") pod \"dnsmasq-dns-874ccdbdf-52m9h\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.381952 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4d26\" (UniqueName: \"kubernetes.io/projected/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-kube-api-access-n4d26\") pod \"nova-api-0\" (UID: \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\") " pod="openstack/nova-api-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.381970 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-ovsdbserver-sb\") pod \"dnsmasq-dns-874ccdbdf-52m9h\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.382002 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-config\") pod \"dnsmasq-dns-874ccdbdf-52m9h\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.382030 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2nwc\" (UniqueName: \"kubernetes.io/projected/48da6d4f-236b-49df-a509-388267f8db55-kube-api-access-h2nwc\") pod \"nova-cell1-novncproxy-0\" (UID: \"48da6d4f-236b-49df-a509-388267f8db55\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.382070 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48da6d4f-236b-49df-a509-388267f8db55-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"48da6d4f-236b-49df-a509-388267f8db55\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.384256 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-logs\") pod \"nova-api-0\" (UID: \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\") " pod="openstack/nova-api-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.388217 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\") " pod="openstack/nova-api-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.398461 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-config-data\") pod \"nova-api-0\" (UID: \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\") " pod="openstack/nova-api-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.409555 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4d26\" (UniqueName: \"kubernetes.io/projected/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-kube-api-access-n4d26\") pod \"nova-api-0\" (UID: \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\") " pod="openstack/nova-api-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.483481 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-ovsdbserver-nb\") pod \"dnsmasq-dns-874ccdbdf-52m9h\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.483731 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-dns-svc\") pod \"dnsmasq-dns-874ccdbdf-52m9h\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.483772 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-ovsdbserver-sb\") pod \"dnsmasq-dns-874ccdbdf-52m9h\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.483803 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-config\") pod \"dnsmasq-dns-874ccdbdf-52m9h\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.483830 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2nwc\" (UniqueName: \"kubernetes.io/projected/48da6d4f-236b-49df-a509-388267f8db55-kube-api-access-h2nwc\") pod \"nova-cell1-novncproxy-0\" (UID: \"48da6d4f-236b-49df-a509-388267f8db55\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.483872 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48da6d4f-236b-49df-a509-388267f8db55-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"48da6d4f-236b-49df-a509-388267f8db55\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.483919 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48da6d4f-236b-49df-a509-388267f8db55-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"48da6d4f-236b-49df-a509-388267f8db55\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.483986 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8pqd\" (UniqueName: \"kubernetes.io/projected/001f22a0-4fb3-4795-b1aa-9ac191cab329-kube-api-access-s8pqd\") pod \"dnsmasq-dns-874ccdbdf-52m9h\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.484877 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-ovsdbserver-nb\") pod \"dnsmasq-dns-874ccdbdf-52m9h\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.485717 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-dns-svc\") pod \"dnsmasq-dns-874ccdbdf-52m9h\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.485841 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-ovsdbserver-sb\") pod \"dnsmasq-dns-874ccdbdf-52m9h\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.487448 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-config\") pod \"dnsmasq-dns-874ccdbdf-52m9h\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.490739 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48da6d4f-236b-49df-a509-388267f8db55-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"48da6d4f-236b-49df-a509-388267f8db55\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.506035 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48da6d4f-236b-49df-a509-388267f8db55-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"48da6d4f-236b-49df-a509-388267f8db55\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.506470 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2nwc\" (UniqueName: \"kubernetes.io/projected/48da6d4f-236b-49df-a509-388267f8db55-kube-api-access-h2nwc\") pod \"nova-cell1-novncproxy-0\" (UID: \"48da6d4f-236b-49df-a509-388267f8db55\") " pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.507341 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8pqd\" (UniqueName: \"kubernetes.io/projected/001f22a0-4fb3-4795-b1aa-9ac191cab329-kube-api-access-s8pqd\") pod \"dnsmasq-dns-874ccdbdf-52m9h\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.594771 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.631729 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.676158 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.833128 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gf5l8"]
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.971274 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 02 12:28:40 crc kubenswrapper[4766]: I1002 12:28:40.991861 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 12:28:41 crc kubenswrapper[4766]: W1002 12:28:41.000944 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f37e167_d5f2_4e35_8727_2df39b70526d.slice/crio-91d38989a42d6e52518c42295ef7d79ecee942bb97c4bf21653cf4d378533000 WatchSource:0}: Error finding container 91d38989a42d6e52518c42295ef7d79ecee942bb97c4bf21653cf4d378533000: Status 404 returned error can't find the container with id 91d38989a42d6e52518c42295ef7d79ecee942bb97c4bf21653cf4d378533000
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.083369 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-874ccdbdf-52m9h"]
Oct 02 12:28:41 crc kubenswrapper[4766]: W1002 12:28:41.105866 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001f22a0_4fb3_4795_b1aa_9ac191cab329.slice/crio-a2349f05da5257edc3717afc222b6139472169407bd696198096046a281f66a8 WatchSource:0}: Error finding container a2349f05da5257edc3717afc222b6139472169407bd696198096046a281f66a8: Status 404 returned error can't find the container with id a2349f05da5257edc3717afc222b6139472169407bd696198096046a281f66a8
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.147539 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.366366 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.378514 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-chmnn"]
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.380649 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-chmnn"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.383678 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.383715 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.387791 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-chmnn"]
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.412146 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-scripts\") pod \"nova-cell1-conductor-db-sync-chmnn\" (UID: \"3dc41884-f51d-48d7-b8ef-61e1148759ea\") " pod="openstack/nova-cell1-conductor-db-sync-chmnn"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.412259 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-chmnn\" (UID: \"3dc41884-f51d-48d7-b8ef-61e1148759ea\") " pod="openstack/nova-cell1-conductor-db-sync-chmnn"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.412299 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-config-data\") pod \"nova-cell1-conductor-db-sync-chmnn\" (UID: \"3dc41884-f51d-48d7-b8ef-61e1148759ea\") " pod="openstack/nova-cell1-conductor-db-sync-chmnn"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.412335 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xdfs\" (UniqueName: \"kubernetes.io/projected/3dc41884-f51d-48d7-b8ef-61e1148759ea-kube-api-access-7xdfs\") pod \"nova-cell1-conductor-db-sync-chmnn\" (UID: \"3dc41884-f51d-48d7-b8ef-61e1148759ea\") " pod="openstack/nova-cell1-conductor-db-sync-chmnn"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.514859 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-chmnn\" (UID: \"3dc41884-f51d-48d7-b8ef-61e1148759ea\") " pod="openstack/nova-cell1-conductor-db-sync-chmnn"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.514934 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-config-data\") pod \"nova-cell1-conductor-db-sync-chmnn\" (UID: \"3dc41884-f51d-48d7-b8ef-61e1148759ea\") " pod="openstack/nova-cell1-conductor-db-sync-chmnn"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.514993 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xdfs\" (UniqueName: \"kubernetes.io/projected/3dc41884-f51d-48d7-b8ef-61e1148759ea-kube-api-access-7xdfs\") pod \"nova-cell1-conductor-db-sync-chmnn\" (UID: \"3dc41884-f51d-48d7-b8ef-61e1148759ea\") " pod="openstack/nova-cell1-conductor-db-sync-chmnn"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.515129 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-scripts\") pod \"nova-cell1-conductor-db-sync-chmnn\" (UID: \"3dc41884-f51d-48d7-b8ef-61e1148759ea\") " pod="openstack/nova-cell1-conductor-db-sync-chmnn"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.519235 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-scripts\") pod \"nova-cell1-conductor-db-sync-chmnn\" (UID: \"3dc41884-f51d-48d7-b8ef-61e1148759ea\") " pod="openstack/nova-cell1-conductor-db-sync-chmnn"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.520098 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-chmnn\" (UID: \"3dc41884-f51d-48d7-b8ef-61e1148759ea\") " pod="openstack/nova-cell1-conductor-db-sync-chmnn"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.521761 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-config-data\") pod \"nova-cell1-conductor-db-sync-chmnn\" (UID: \"3dc41884-f51d-48d7-b8ef-61e1148759ea\") " pod="openstack/nova-cell1-conductor-db-sync-chmnn"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.542194 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xdfs\" (UniqueName: \"kubernetes.io/projected/3dc41884-f51d-48d7-b8ef-61e1148759ea-kube-api-access-7xdfs\") pod \"nova-cell1-conductor-db-sync-chmnn\" (UID: \"3dc41884-f51d-48d7-b8ef-61e1148759ea\") " pod="openstack/nova-cell1-conductor-db-sync-chmnn"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.707795 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-chmnn"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.805343 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gf5l8" event={"ID":"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb","Type":"ContainerStarted","Data":"de52ec449a0dc9b9f0562a67f7bb244ff9b254a1943e27a97287839bb09fda58"}
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.805405 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gf5l8" event={"ID":"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb","Type":"ContainerStarted","Data":"5ee01d9e30b90c732c4c3efb18ce2cf18b6cabb6e6c3218e14ff1fec659b76f1"}
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.809069 4766 generic.go:334] "Generic (PLEG): container finished" podID="001f22a0-4fb3-4795-b1aa-9ac191cab329" containerID="d828fdc69c70e7629f0f14a4c915322009929fa63a8c06a71174a5a618f1fb9f" exitCode=0
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.809129 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-874ccdbdf-52m9h" event={"ID":"001f22a0-4fb3-4795-b1aa-9ac191cab329","Type":"ContainerDied","Data":"d828fdc69c70e7629f0f14a4c915322009929fa63a8c06a71174a5a618f1fb9f"}
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.809153 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-874ccdbdf-52m9h" event={"ID":"001f22a0-4fb3-4795-b1aa-9ac191cab329","Type":"ContainerStarted","Data":"a2349f05da5257edc3717afc222b6139472169407bd696198096046a281f66a8"}
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.816682 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"48da6d4f-236b-49df-a509-388267f8db55","Type":"ContainerStarted","Data":"830a8a311242ba410b0f069ddea9cea453171d8b10b81489df17647eefa821e9"}
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.816739 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"48da6d4f-236b-49df-a509-388267f8db55","Type":"ContainerStarted","Data":"69f243bdab651b442736572a1dcddc1e42df45f793eb4fc2393ada6f2d92c548"}
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.831671 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2","Type":"ContainerStarted","Data":"31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de"}
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.831733 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2","Type":"ContainerStarted","Data":"239e4ea3f3fbcd1e63cdb358af79defb43b28e1f706e37e17d55cd0be67d0f29"}
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.840914 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3bfa9d25-deff-4655-a251-4583fdc8e04d","Type":"ContainerStarted","Data":"fa2b9e811e9c0a1252eefd29ec906e477712a3e00dd7ecd33aada0a2edf7c830"}
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.840956 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3bfa9d25-deff-4655-a251-4583fdc8e04d","Type":"ContainerStarted","Data":"bb6411791c7d345131e0347fb38434fee5dc79a98f846147e3f6cbbe9cf7a3c0"}
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.841619 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-gf5l8" podStartSLOduration=2.841586169 podStartE2EDuration="2.841586169s" podCreationTimestamp="2025-10-02 12:28:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:41.831803336 +0000 UTC m=+5836.774674300" watchObservedRunningTime="2025-10-02 12:28:41.841586169 +0000 UTC m=+5836.784457103"
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.852034 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f37e167-d5f2-4e35-8727-2df39b70526d","Type":"ContainerStarted","Data":"05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d"}
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.852097 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f37e167-d5f2-4e35-8727-2df39b70526d","Type":"ContainerStarted","Data":"4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096"}
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.852111 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f37e167-d5f2-4e35-8727-2df39b70526d","Type":"ContainerStarted","Data":"91d38989a42d6e52518c42295ef7d79ecee942bb97c4bf21653cf4d378533000"}
Oct 02 12:28:41 crc kubenswrapper[4766]: I1002 12:28:41.912891 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.9128661710000001 podStartE2EDuration="1.912866171s" podCreationTimestamp="2025-10-02 12:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:41.907101057 +0000 UTC m=+5836.849971991" watchObservedRunningTime="2025-10-02 12:28:41.912866171 +0000 UTC m=+5836.855737115"
Oct 02 12:28:42 crc kubenswrapper[4766]: I1002 12:28:42.026412 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.026379105 podStartE2EDuration="2.026379105s" podCreationTimestamp="2025-10-02 12:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:41.938933896 +0000 UTC m=+5836.881804860" watchObservedRunningTime="2025-10-02 12:28:42.026379105 +0000 UTC m=+5836.969250049"
Oct 02 12:28:42 crc kubenswrapper[4766]: I1002 12:28:42.030901 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.03088358 podStartE2EDuration="3.03088358s" podCreationTimestamp="2025-10-02 12:28:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:42.021694996 +0000 UTC m=+5836.964565950" watchObservedRunningTime="2025-10-02 12:28:42.03088358 +0000 UTC m=+5836.973754524"
Oct 02 12:28:42 crc kubenswrapper[4766]: I1002 12:28:42.058120 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.058097421 podStartE2EDuration="3.058097421s" podCreationTimestamp="2025-10-02 12:28:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:42.041661395 +0000 UTC m=+5836.984532339" watchObservedRunningTime="2025-10-02 12:28:42.058097421 +0000 UTC m=+5837.000968365"
Oct 02 12:28:42 crc kubenswrapper[4766]: I1002 12:28:42.294721 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-chmnn"]
Oct 02 12:28:42 crc kubenswrapper[4766]: I1002 12:28:42.910915 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-874ccdbdf-52m9h" event={"ID":"001f22a0-4fb3-4795-b1aa-9ac191cab329","Type":"ContainerStarted","Data":"4c4127f2a45399f5cdee38356dc3c43bb6e30e98c139aeaaaff76896baed8b62"}
Oct 02 12:28:42 crc kubenswrapper[4766]: I1002 12:28:42.911877 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:42 crc kubenswrapper[4766]: I1002 12:28:42.914873 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2","Type":"ContainerStarted","Data":"2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88"}
Oct 02 12:28:42 crc kubenswrapper[4766]: I1002 12:28:42.918936 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-chmnn" event={"ID":"3dc41884-f51d-48d7-b8ef-61e1148759ea","Type":"ContainerStarted","Data":"3c8a324c9daf6cf9b26a35385d3377e86dcefd3c21151e2f937c8de222c83d0e"}
Oct 02 12:28:42 crc kubenswrapper[4766]: I1002 12:28:42.919035 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-chmnn" event={"ID":"3dc41884-f51d-48d7-b8ef-61e1148759ea","Type":"ContainerStarted","Data":"74a095ba2709a57528ac1732179ab1a752df80ccbd981d4a36a2a24c76b6557d"}
Oct 02 12:28:42 crc kubenswrapper[4766]: I1002 12:28:42.952924 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-874ccdbdf-52m9h" podStartSLOduration=2.952894809 podStartE2EDuration="2.952894809s" podCreationTimestamp="2025-10-02 12:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:42.938959433 +0000 UTC m=+5837.881830377" watchObservedRunningTime="2025-10-02 12:28:42.952894809 +0000 UTC m=+5837.895765753"
Oct 02 12:28:42 crc kubenswrapper[4766]: I1002 12:28:42.964451 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-chmnn" podStartSLOduration=1.964425679 podStartE2EDuration="1.964425679s" podCreationTimestamp="2025-10-02 12:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:42.963175649 +0000 UTC m=+5837.906046593" watchObservedRunningTime="2025-10-02 12:28:42.964425679 +0000 UTC m=+5837.907296623"
Oct 02 12:28:45 crc kubenswrapper[4766]: I1002 12:28:45.254192 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 02 12:28:45 crc kubenswrapper[4766]: I1002 12:28:45.278599 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 02 12:28:45 crc kubenswrapper[4766]: I1002 12:28:45.278649 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 02 12:28:45 crc kubenswrapper[4766]: I1002 12:28:45.632602 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:28:45 crc kubenswrapper[4766]: I1002 12:28:45.955250 4766 generic.go:334] "Generic (PLEG): container finished" podID="3dc41884-f51d-48d7-b8ef-61e1148759ea" containerID="3c8a324c9daf6cf9b26a35385d3377e86dcefd3c21151e2f937c8de222c83d0e" exitCode=0
Oct 02 12:28:45 crc kubenswrapper[4766]: I1002 12:28:45.955326 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-chmnn" event={"ID":"3dc41884-f51d-48d7-b8ef-61e1148759ea","Type":"ContainerDied","Data":"3c8a324c9daf6cf9b26a35385d3377e86dcefd3c21151e2f937c8de222c83d0e"}
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.372927 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-chmnn"
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.526439 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-scripts\") pod \"3dc41884-f51d-48d7-b8ef-61e1148759ea\" (UID: \"3dc41884-f51d-48d7-b8ef-61e1148759ea\") "
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.526743 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xdfs\" (UniqueName: \"kubernetes.io/projected/3dc41884-f51d-48d7-b8ef-61e1148759ea-kube-api-access-7xdfs\") pod \"3dc41884-f51d-48d7-b8ef-61e1148759ea\" (UID: \"3dc41884-f51d-48d7-b8ef-61e1148759ea\") "
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.526813 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-config-data\") pod \"3dc41884-f51d-48d7-b8ef-61e1148759ea\" (UID: \"3dc41884-f51d-48d7-b8ef-61e1148759ea\") "
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.526861 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-combined-ca-bundle\") pod \"3dc41884-f51d-48d7-b8ef-61e1148759ea\" (UID: \"3dc41884-f51d-48d7-b8ef-61e1148759ea\") "
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.534551 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-scripts" (OuterVolumeSpecName: "scripts") pod "3dc41884-f51d-48d7-b8ef-61e1148759ea" (UID: "3dc41884-f51d-48d7-b8ef-61e1148759ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.534667 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dc41884-f51d-48d7-b8ef-61e1148759ea-kube-api-access-7xdfs" (OuterVolumeSpecName: "kube-api-access-7xdfs") pod "3dc41884-f51d-48d7-b8ef-61e1148759ea" (UID: "3dc41884-f51d-48d7-b8ef-61e1148759ea"). InnerVolumeSpecName "kube-api-access-7xdfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.559751 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3dc41884-f51d-48d7-b8ef-61e1148759ea" (UID: "3dc41884-f51d-48d7-b8ef-61e1148759ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.561309 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-config-data" (OuterVolumeSpecName: "config-data") pod "3dc41884-f51d-48d7-b8ef-61e1148759ea" (UID: "3dc41884-f51d-48d7-b8ef-61e1148759ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.629538 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.629594 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xdfs\" (UniqueName: \"kubernetes.io/projected/3dc41884-f51d-48d7-b8ef-61e1148759ea-kube-api-access-7xdfs\") on node \"crc\" DevicePath \"\""
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.629610 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.629623 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dc41884-f51d-48d7-b8ef-61e1148759ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.979530 4766 generic.go:334] "Generic (PLEG): container finished" podID="f26d4d8b-0c18-4fb9-b57a-cd154cb211eb" containerID="de52ec449a0dc9b9f0562a67f7bb244ff9b254a1943e27a97287839bb09fda58" exitCode=0
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.979647 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gf5l8" event={"ID":"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb","Type":"ContainerDied","Data":"de52ec449a0dc9b9f0562a67f7bb244ff9b254a1943e27a97287839bb09fda58"}
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.987115 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-chmnn" event={"ID":"3dc41884-f51d-48d7-b8ef-61e1148759ea","Type":"ContainerDied","Data":"74a095ba2709a57528ac1732179ab1a752df80ccbd981d4a36a2a24c76b6557d"}
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.987166 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74a095ba2709a57528ac1732179ab1a752df80ccbd981d4a36a2a24c76b6557d"
Oct 02 12:28:47 crc kubenswrapper[4766]: I1002 12:28:47.987170 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-chmnn"
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.072934 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 02 12:28:48 crc kubenswrapper[4766]: E1002 12:28:48.075024 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dc41884-f51d-48d7-b8ef-61e1148759ea" containerName="nova-cell1-conductor-db-sync"
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.075081 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dc41884-f51d-48d7-b8ef-61e1148759ea" containerName="nova-cell1-conductor-db-sync"
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.075491 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dc41884-f51d-48d7-b8ef-61e1148759ea" containerName="nova-cell1-conductor-db-sync"
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.076456 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.079525 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.090212 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.141568 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f7aa24-c544-420c-89da-e6f907a8860c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a1f7aa24-c544-420c-89da-e6f907a8860c\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.142280 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f7aa24-c544-420c-89da-e6f907a8860c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a1f7aa24-c544-420c-89da-e6f907a8860c\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.142351 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfj6m\" (UniqueName: \"kubernetes.io/projected/a1f7aa24-c544-420c-89da-e6f907a8860c-kube-api-access-pfj6m\") pod \"nova-cell1-conductor-0\" (UID: \"a1f7aa24-c544-420c-89da-e6f907a8860c\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.244003 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f7aa24-c544-420c-89da-e6f907a8860c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a1f7aa24-c544-420c-89da-e6f907a8860c\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.244102 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f7aa24-c544-420c-89da-e6f907a8860c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a1f7aa24-c544-420c-89da-e6f907a8860c\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.244135 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfj6m\" (UniqueName: \"kubernetes.io/projected/a1f7aa24-c544-420c-89da-e6f907a8860c-kube-api-access-pfj6m\") pod \"nova-cell1-conductor-0\" (UID: \"a1f7aa24-c544-420c-89da-e6f907a8860c\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.249055 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f7aa24-c544-420c-89da-e6f907a8860c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a1f7aa24-c544-420c-89da-e6f907a8860c\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.249739 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f7aa24-c544-420c-89da-e6f907a8860c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a1f7aa24-c544-420c-89da-e6f907a8860c\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.261994 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfj6m\" (UniqueName: \"kubernetes.io/projected/a1f7aa24-c544-420c-89da-e6f907a8860c-kube-api-access-pfj6m\") pod \"nova-cell1-conductor-0\" (UID: \"a1f7aa24-c544-420c-89da-e6f907a8860c\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.420209 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.896998 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 02 12:28:48 crc kubenswrapper[4766]: W1002 12:28:48.903547 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1f7aa24_c544_420c_89da_e6f907a8860c.slice/crio-70f27bb4160fcb21db022619abf69bbb979844363c3458288f7980c2b0d03085 WatchSource:0}: Error finding container 70f27bb4160fcb21db022619abf69bbb979844363c3458288f7980c2b0d03085: Status 404 returned error can't find the container with id 70f27bb4160fcb21db022619abf69bbb979844363c3458288f7980c2b0d03085
Oct 02 12:28:48 crc kubenswrapper[4766]: I1002 12:28:48.998836 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a1f7aa24-c544-420c-89da-e6f907a8860c","Type":"ContainerStarted","Data":"70f27bb4160fcb21db022619abf69bbb979844363c3458288f7980c2b0d03085"}
Oct 02 12:28:49 crc kubenswrapper[4766]: I1002 12:28:49.283564 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gf5l8"
Oct 02 12:28:49 crc kubenswrapper[4766]: I1002 12:28:49.392831 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-scripts\") pod \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\" (UID: \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\") "
Oct 02 12:28:49 crc kubenswrapper[4766]: I1002 12:28:49.393085 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-config-data\") pod \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\" (UID: \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\") "
Oct 02 12:28:49 crc kubenswrapper[4766]: I1002 12:28:49.393136 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-combined-ca-bundle\") pod \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\" (UID: \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\") "
Oct 02 12:28:49 crc kubenswrapper[4766]: I1002 12:28:49.393172 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnlz5\" (UniqueName: \"kubernetes.io/projected/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-kube-api-access-mnlz5\") pod \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\" (UID: \"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb\") "
Oct 02 12:28:49 crc kubenswrapper[4766]: I1002 12:28:49.399805 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-scripts" (OuterVolumeSpecName: "scripts") pod "f26d4d8b-0c18-4fb9-b57a-cd154cb211eb" (UID: "f26d4d8b-0c18-4fb9-b57a-cd154cb211eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:28:49 crc kubenswrapper[4766]: I1002 12:28:49.404914 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-kube-api-access-mnlz5" (OuterVolumeSpecName: "kube-api-access-mnlz5") pod "f26d4d8b-0c18-4fb9-b57a-cd154cb211eb" (UID: "f26d4d8b-0c18-4fb9-b57a-cd154cb211eb"). InnerVolumeSpecName "kube-api-access-mnlz5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:28:49 crc kubenswrapper[4766]: I1002 12:28:49.426634 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-config-data" (OuterVolumeSpecName: "config-data") pod "f26d4d8b-0c18-4fb9-b57a-cd154cb211eb" (UID: "f26d4d8b-0c18-4fb9-b57a-cd154cb211eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:28:49 crc kubenswrapper[4766]: I1002 12:28:49.433716 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f26d4d8b-0c18-4fb9-b57a-cd154cb211eb" (UID: "f26d4d8b-0c18-4fb9-b57a-cd154cb211eb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:28:49 crc kubenswrapper[4766]: I1002 12:28:49.495642 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 12:28:49 crc kubenswrapper[4766]: I1002 12:28:49.495702 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 12:28:49 crc kubenswrapper[4766]: I1002 12:28:49.495712 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 12:28:49 crc kubenswrapper[4766]: I1002 12:28:49.495723 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnlz5\" (UniqueName: \"kubernetes.io/projected/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb-kube-api-access-mnlz5\") on node \"crc\" DevicePath \"\""
Oct 02 12:28:49 crc kubenswrapper[4766]: I1002 12:28:49.882361 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984"
Oct 02 12:28:49 crc kubenswrapper[4766]: E1002 12:28:49.882854 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.011179 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a1f7aa24-c544-420c-89da-e6f907a8860c","Type":"ContainerStarted","Data":"d09bdddf38f003fdc9558842b008b6301a3840a1560659c352fb547814605bc0"}
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.011434 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.014900 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gf5l8" event={"ID":"f26d4d8b-0c18-4fb9-b57a-cd154cb211eb","Type":"ContainerDied","Data":"5ee01d9e30b90c732c4c3efb18ce2cf18b6cabb6e6c3218e14ff1fec659b76f1"}
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.014940 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ee01d9e30b90c732c4c3efb18ce2cf18b6cabb6e6c3218e14ff1fec659b76f1"
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.014997 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gf5l8"
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.041469 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.041446702 podStartE2EDuration="2.041446702s" podCreationTimestamp="2025-10-02 12:28:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:50.034454399 +0000 UTC m=+5844.977325363" watchObservedRunningTime="2025-10-02 12:28:50.041446702 +0000 UTC m=+5844.984317646"
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.215616 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.216519 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2" containerName="nova-api-log" containerID="cri-o://31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de" gracePeriod=30
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.216587 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2" containerName="nova-api-api" containerID="cri-o://2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88" gracePeriod=30
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.237609 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.237885 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3bfa9d25-deff-4655-a251-4583fdc8e04d" containerName="nova-scheduler-scheduler" containerID="cri-o://fa2b9e811e9c0a1252eefd29ec906e477712a3e00dd7ecd33aada0a2edf7c830" gracePeriod=30
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.259006 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.259286 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4f37e167-d5f2-4e35-8727-2df39b70526d" containerName="nova-metadata-log" containerID="cri-o://4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096" gracePeriod=30
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.259361 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4f37e167-d5f2-4e35-8727-2df39b70526d" containerName="nova-metadata-metadata" containerID="cri-o://05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d" gracePeriod=30
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.632668 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.651163 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.680958 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-874ccdbdf-52m9h"
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.767005 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74ccf9c867-gkpf8"]
Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.767334 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" podUID="4d650726-0f54-4320-9764-c984cefe3c0b" containerName="dnsmasq-dns" containerID="cri-o://955d07bccfacf32cfba7467d0352069df2017de5a075aea66bb626c66b7e69d4" gracePeriod=10
crc kubenswrapper[4766]: I1002 12:28:50.767334 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" podUID="4d650726-0f54-4320-9764-c984cefe3c0b" containerName="dnsmasq-dns" containerID="cri-o://955d07bccfacf32cfba7467d0352069df2017de5a075aea66bb626c66b7e69d4" gracePeriod=10 Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.953344 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:28:50 crc kubenswrapper[4766]: I1002 12:28:50.974282 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.036938 4766 generic.go:334] "Generic (PLEG): container finished" podID="4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2" containerID="2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88" exitCode=0 Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.037298 4766 generic.go:334] "Generic (PLEG): container finished" podID="4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2" containerID="31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de" exitCode=143 Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.037392 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2","Type":"ContainerDied","Data":"2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88"} Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.037530 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2","Type":"ContainerDied","Data":"31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de"} Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.037598 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2","Type":"ContainerDied","Data":"239e4ea3f3fbcd1e63cdb358af79defb43b28e1f706e37e17d55cd0be67d0f29"} Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.037721 4766 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.037721 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.037754 4766 scope.go:117] "RemoveContainer" containerID="2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.049392 4766 generic.go:334] "Generic (PLEG): container finished" podID="4f37e167-d5f2-4e35-8727-2df39b70526d" containerID="05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d" exitCode=0
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.049458 4766 generic.go:334] "Generic (PLEG): container finished" podID="4f37e167-d5f2-4e35-8727-2df39b70526d" containerID="4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096" exitCode=143
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.049568 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f37e167-d5f2-4e35-8727-2df39b70526d","Type":"ContainerDied","Data":"05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d"}
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.049608 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f37e167-d5f2-4e35-8727-2df39b70526d","Type":"ContainerDied","Data":"4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096"}
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.049620 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f37e167-d5f2-4e35-8727-2df39b70526d","Type":"ContainerDied","Data":"91d38989a42d6e52518c42295ef7d79ecee942bb97c4bf21653cf4d378533000"}
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.049696 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.071035 4766 generic.go:334] "Generic (PLEG): container finished" podID="4d650726-0f54-4320-9764-c984cefe3c0b" containerID="955d07bccfacf32cfba7467d0352069df2017de5a075aea66bb626c66b7e69d4" exitCode=0
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.071310 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" event={"ID":"4d650726-0f54-4320-9764-c984cefe3c0b","Type":"ContainerDied","Data":"955d07bccfacf32cfba7467d0352069df2017de5a075aea66bb626c66b7e69d4"}
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.124342 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.134625 4766 scope.go:117] "RemoveContainer" containerID="31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.141406 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f37e167-d5f2-4e35-8727-2df39b70526d-logs\") pod \"4f37e167-d5f2-4e35-8727-2df39b70526d\" (UID: \"4f37e167-d5f2-4e35-8727-2df39b70526d\") "
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.141657 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f37e167-d5f2-4e35-8727-2df39b70526d-combined-ca-bundle\") pod \"4f37e167-d5f2-4e35-8727-2df39b70526d\" (UID: \"4f37e167-d5f2-4e35-8727-2df39b70526d\") "
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-config-data\") pod \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\" (UID: \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\") " Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.141746 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4d26\" (UniqueName: \"kubernetes.io/projected/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-kube-api-access-n4d26\") pod \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\" (UID: \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\") " Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.141808 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-combined-ca-bundle\") pod \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\" (UID: \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\") " Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.141857 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-logs\") pod \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\" (UID: \"4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2\") " Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.141901 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f37e167-d5f2-4e35-8727-2df39b70526d-config-data\") pod \"4f37e167-d5f2-4e35-8727-2df39b70526d\" (UID: \"4f37e167-d5f2-4e35-8727-2df39b70526d\") " Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.141929 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxdg2\" (UniqueName: \"kubernetes.io/projected/4f37e167-d5f2-4e35-8727-2df39b70526d-kube-api-access-qxdg2\") pod \"4f37e167-d5f2-4e35-8727-2df39b70526d\" (UID: \"4f37e167-d5f2-4e35-8727-2df39b70526d\") " Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.142306 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f37e167-d5f2-4e35-8727-2df39b70526d-logs" (OuterVolumeSpecName: "logs") pod "4f37e167-d5f2-4e35-8727-2df39b70526d" (UID: "4f37e167-d5f2-4e35-8727-2df39b70526d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.143111 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-logs" (OuterVolumeSpecName: "logs") pod "4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2" (UID: "4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.173748 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f37e167-d5f2-4e35-8727-2df39b70526d-kube-api-access-qxdg2" (OuterVolumeSpecName: "kube-api-access-qxdg2") pod "4f37e167-d5f2-4e35-8727-2df39b70526d" (UID: "4f37e167-d5f2-4e35-8727-2df39b70526d"). InnerVolumeSpecName "kube-api-access-qxdg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.188839 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-kube-api-access-n4d26" (OuterVolumeSpecName: "kube-api-access-n4d26") pod "4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2" (UID: "4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2"). InnerVolumeSpecName "kube-api-access-n4d26". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.245906 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f37e167-d5f2-4e35-8727-2df39b70526d-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.246295 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4d26\" (UniqueName: \"kubernetes.io/projected/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-kube-api-access-n4d26\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.246691 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.246801 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxdg2\" (UniqueName: \"kubernetes.io/projected/4f37e167-d5f2-4e35-8727-2df39b70526d-kube-api-access-qxdg2\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.247689 4766 scope.go:117] "RemoveContainer" containerID="2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.281061 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2" (UID: "4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.282716 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-config-data" (OuterVolumeSpecName: "config-data") pod "4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2" (UID: "4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:28:51 crc kubenswrapper[4766]: E1002 12:28:51.282969 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88\": container with ID starting with 2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88 not found: ID does not exist" containerID="2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.283012 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88"} err="failed to get container status \"2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88\": rpc error: code = NotFound desc = could not find container \"2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88\": container with ID starting with 2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88 not found: ID does not exist" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.283048 4766 scope.go:117] "RemoveContainer" containerID="31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.297756 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f37e167-d5f2-4e35-8727-2df39b70526d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f37e167-d5f2-4e35-8727-2df39b70526d" (UID: "4f37e167-d5f2-4e35-8727-2df39b70526d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:28:51 crc kubenswrapper[4766]: E1002 12:28:51.298165 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de\": container with ID starting with 31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de not found: ID does not exist" containerID="31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.298301 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de"} err="failed to get container status \"31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de\": rpc error: code = NotFound desc = could not find container \"31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de\": container with ID starting with 31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de not found: ID does not exist" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.298448 4766 scope.go:117] "RemoveContainer" containerID="2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.299537 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88"} err="failed to get container status \"2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88\": rpc error: code = NotFound desc = could not find container \"2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88\": container with ID starting with 2d2ee6fa5647c7437633a723e7393ca4ec7418fb9dd0d9dfc58eceeeb4482d88 not found: ID 
does not exist" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.299656 4766 scope.go:117] "RemoveContainer" containerID="31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.303255 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de"} err="failed to get container status \"31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de\": rpc error: code = NotFound desc = could not find container \"31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de\": container with ID starting with 31457e229eb7a099c081e49657debb29cd96274b130be38cd30a3e065a1be5de not found: ID does not exist" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.313230 4766 scope.go:117] "RemoveContainer" containerID="05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.306782 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f37e167-d5f2-4e35-8727-2df39b70526d-config-data" (OuterVolumeSpecName: "config-data") pod "4f37e167-d5f2-4e35-8727-2df39b70526d" (UID: "4f37e167-d5f2-4e35-8727-2df39b70526d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.356188 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f37e167-d5f2-4e35-8727-2df39b70526d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.356233 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.356245 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.356254 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f37e167-d5f2-4e35-8727-2df39b70526d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.448821 4766 scope.go:117] "RemoveContainer" containerID="4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.462485 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.482447 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.508345 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.523062 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 12:28:51 crc kubenswrapper[4766]: E1002 12:28:51.523924 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2" containerName="nova-api-log" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.523951 4766 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2" containerName="nova-api-log" Oct 02 12:28:51 crc kubenswrapper[4766]: E1002 12:28:51.523973 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f26d4d8b-0c18-4fb9-b57a-cd154cb211eb" containerName="nova-manage" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.523979 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f26d4d8b-0c18-4fb9-b57a-cd154cb211eb" containerName="nova-manage" Oct 02 12:28:51 crc kubenswrapper[4766]: E1002 12:28:51.524006 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2" containerName="nova-api-api" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.524012 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2" containerName="nova-api-api" Oct 02 12:28:51 crc kubenswrapper[4766]: E1002 12:28:51.524023 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f37e167-d5f2-4e35-8727-2df39b70526d" containerName="nova-metadata-metadata" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.524029 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f37e167-d5f2-4e35-8727-2df39b70526d" containerName="nova-metadata-metadata" Oct 02 12:28:51 crc kubenswrapper[4766]: E1002 12:28:51.524047 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f37e167-d5f2-4e35-8727-2df39b70526d" containerName="nova-metadata-log" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.524053 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f37e167-d5f2-4e35-8727-2df39b70526d" containerName="nova-metadata-log" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.524224 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2" containerName="nova-api-api" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.524240 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2" containerName="nova-api-log" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.524271 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f37e167-d5f2-4e35-8727-2df39b70526d" containerName="nova-metadata-log" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.524283 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f37e167-d5f2-4e35-8727-2df39b70526d" containerName="nova-metadata-metadata" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.524289 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f26d4d8b-0c18-4fb9-b57a-cd154cb211eb" containerName="nova-manage" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.525659 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.531400 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.536807 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.559347 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.563737 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f71c32de-a23d-4207-9648-dacb057372ab-logs\") pod \"nova-api-0\" (UID: \"f71c32de-a23d-4207-9648-dacb057372ab\") " pod="openstack/nova-api-0" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.563836 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71c32de-a23d-4207-9648-dacb057372ab-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f71c32de-a23d-4207-9648-dacb057372ab\") " pod="openstack/nova-api-0" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.563892 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71c32de-a23d-4207-9648-dacb057372ab-config-data\") pod \"nova-api-0\" (UID: \"f71c32de-a23d-4207-9648-dacb057372ab\") " pod="openstack/nova-api-0" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.563943 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd5nb\" (UniqueName: \"kubernetes.io/projected/f71c32de-a23d-4207-9648-dacb057372ab-kube-api-access-sd5nb\") pod \"nova-api-0\" (UID: \"f71c32de-a23d-4207-9648-dacb057372ab\") " pod="openstack/nova-api-0" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.584064 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.605815 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:28:51 crc kubenswrapper[4766]: E1002 12:28:51.607097 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d650726-0f54-4320-9764-c984cefe3c0b" containerName="init" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.607168 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d650726-0f54-4320-9764-c984cefe3c0b" containerName="init" Oct 02 12:28:51 crc kubenswrapper[4766]: E1002 12:28:51.607328 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d650726-0f54-4320-9764-c984cefe3c0b" containerName="dnsmasq-dns" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.607339 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d650726-0f54-4320-9764-c984cefe3c0b" containerName="dnsmasq-dns" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.609641 4766 scope.go:117] "RemoveContainer" containerID="05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d" Oct 02 12:28:51 crc kubenswrapper[4766]: E1002 12:28:51.612159 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d\": container with ID starting with 05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d not found: ID does not exist" containerID="05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.612337 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d"} err="failed to get container status \"05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d\": rpc error: code = NotFound desc = could not find container \"05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d\": container with ID starting with 05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d not found: ID does not exist" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.612436 4766 scope.go:117] "RemoveContainer" containerID="4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.612859 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d650726-0f54-4320-9764-c984cefe3c0b" containerName="dnsmasq-dns" Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.615562 4766 util.go:30] "No sandbox for pod can be found. 
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.615562 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: E1002 12:28:51.619075 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096\": container with ID starting with 4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096 not found: ID does not exist" containerID="4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.620215 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096"} err="failed to get container status \"4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096\": rpc error: code = NotFound desc = could not find container \"4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096\": container with ID starting with 4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096 not found: ID does not exist"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.620347 4766 scope.go:117] "RemoveContainer" containerID="05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.621399 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.622402 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d"} err="failed to get container status \"05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d\": rpc error: code = NotFound desc = could not find container \"05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d\": container with ID starting with 05080c49007f3273d5f2c92f630ec11624122a0169a41645c21e0a013f8e394d not found: ID does not exist"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.622461 4766 scope.go:117] "RemoveContainer" containerID="4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.623856 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096"} err="failed to get container status \"4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096\": rpc error: code = NotFound desc = could not find container \"4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096\": container with ID starting with 4f4a2593317270791149a09779afc05a8111a21159b35ed35063d963806bc096 not found: ID does not exist"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.657317 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.664688 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-ovsdbserver-nb\") pod \"4d650726-0f54-4320-9764-c984cefe3c0b\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") "
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.664741 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-config\") pod \"4d650726-0f54-4320-9764-c984cefe3c0b\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") "
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.664776 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgzhn\" (UniqueName: \"kubernetes.io/projected/4d650726-0f54-4320-9764-c984cefe3c0b-kube-api-access-wgzhn\") pod \"4d650726-0f54-4320-9764-c984cefe3c0b\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") "
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.664796 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-ovsdbserver-sb\") pod \"4d650726-0f54-4320-9764-c984cefe3c0b\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") "
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.664825 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-dns-svc\") pod \"4d650726-0f54-4320-9764-c984cefe3c0b\" (UID: \"4d650726-0f54-4320-9764-c984cefe3c0b\") "
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.664972 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.665017 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f71c32de-a23d-4207-9648-dacb057372ab-logs\") pod \"nova-api-0\" (UID: \"f71c32de-a23d-4207-9648-dacb057372ab\") " pod="openstack/nova-api-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.665059 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71c32de-a23d-4207-9648-dacb057372ab-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f71c32de-a23d-4207-9648-dacb057372ab\") " pod="openstack/nova-api-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.665104 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-logs\") pod \"nova-metadata-0\" (UID: \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.665127 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71c32de-a23d-4207-9648-dacb057372ab-config-data\") pod \"nova-api-0\" (UID: \"f71c32de-a23d-4207-9648-dacb057372ab\") " pod="openstack/nova-api-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.665166 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd5nb\" (UniqueName: \"kubernetes.io/projected/f71c32de-a23d-4207-9648-dacb057372ab-kube-api-access-sd5nb\") pod \"nova-api-0\" (UID: \"f71c32de-a23d-4207-9648-dacb057372ab\") " pod="openstack/nova-api-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.665199 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g95x\" (UniqueName: \"kubernetes.io/projected/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-kube-api-access-9g95x\") pod \"nova-metadata-0\" (UID: \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.665240 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-config-data\") pod \"nova-metadata-0\" (UID: \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.667877 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f71c32de-a23d-4207-9648-dacb057372ab-logs\") pod \"nova-api-0\" (UID: \"f71c32de-a23d-4207-9648-dacb057372ab\") " pod="openstack/nova-api-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.670913 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d650726-0f54-4320-9764-c984cefe3c0b-kube-api-access-wgzhn" (OuterVolumeSpecName: "kube-api-access-wgzhn") pod "4d650726-0f54-4320-9764-c984cefe3c0b" (UID: "4d650726-0f54-4320-9764-c984cefe3c0b"). InnerVolumeSpecName "kube-api-access-wgzhn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.673247 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71c32de-a23d-4207-9648-dacb057372ab-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f71c32de-a23d-4207-9648-dacb057372ab\") " pod="openstack/nova-api-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.686175 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd5nb\" (UniqueName: \"kubernetes.io/projected/f71c32de-a23d-4207-9648-dacb057372ab-kube-api-access-sd5nb\") pod \"nova-api-0\" (UID: \"f71c32de-a23d-4207-9648-dacb057372ab\") " pod="openstack/nova-api-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.690137 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71c32de-a23d-4207-9648-dacb057372ab-config-data\") pod \"nova-api-0\" (UID: \"f71c32de-a23d-4207-9648-dacb057372ab\") " pod="openstack/nova-api-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.725325 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-config" (OuterVolumeSpecName: "config") pod "4d650726-0f54-4320-9764-c984cefe3c0b" (UID: "4d650726-0f54-4320-9764-c984cefe3c0b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.730151 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4d650726-0f54-4320-9764-c984cefe3c0b" (UID: "4d650726-0f54-4320-9764-c984cefe3c0b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.731043 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4d650726-0f54-4320-9764-c984cefe3c0b" (UID: "4d650726-0f54-4320-9764-c984cefe3c0b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.732455 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4d650726-0f54-4320-9764-c984cefe3c0b" (UID: "4d650726-0f54-4320-9764-c984cefe3c0b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.767436 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-logs\") pod \"nova-metadata-0\" (UID: \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.767614 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g95x\" (UniqueName: \"kubernetes.io/projected/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-kube-api-access-9g95x\") pod \"nova-metadata-0\" (UID: \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.767701 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-config-data\") pod \"nova-metadata-0\" (UID: \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.767769 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.767856 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.767874 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-config\") on node \"crc\" DevicePath \"\""
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.767886 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgzhn\" (UniqueName: \"kubernetes.io/projected/4d650726-0f54-4320-9764-c984cefe3c0b-kube-api-access-wgzhn\") on node \"crc\" DevicePath \"\""
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.767898 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.767910 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4d650726-0f54-4320-9764-c984cefe3c0b-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.768243 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-logs\") pod \"nova-metadata-0\" (UID: \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.775053 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-config-data\") pod \"nova-metadata-0\" (UID: \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.776257 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.787649 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g95x\" (UniqueName: \"kubernetes.io/projected/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-kube-api-access-9g95x\") pod \"nova-metadata-0\" (UID: \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\") " pod="openstack/nova-metadata-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.892343 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2" path="/var/lib/kubelet/pods/4a90fd7c-ead6-4ff4-a38c-a4d7c69883c2/volumes"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.893245 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f37e167-d5f2-4e35-8727-2df39b70526d" path="/var/lib/kubelet/pods/4f37e167-d5f2-4e35-8727-2df39b70526d/volumes"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.933150 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 02 12:28:51 crc kubenswrapper[4766]: I1002 12:28:51.937350 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Oct 02 12:28:52 crc kubenswrapper[4766]: I1002 12:28:52.095097 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8" event={"ID":"4d650726-0f54-4320-9764-c984cefe3c0b","Type":"ContainerDied","Data":"b62bea09b5d559540a03aa24d7a2762acf71e53491e28e5b06b9f3adb130d39b"}
Oct 02 12:28:52 crc kubenswrapper[4766]: I1002 12:28:52.095606 4766 scope.go:117] "RemoveContainer" containerID="955d07bccfacf32cfba7467d0352069df2017de5a075aea66bb626c66b7e69d4"
Oct 02 12:28:52 crc kubenswrapper[4766]: I1002 12:28:52.095766 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74ccf9c867-gkpf8"
Oct 02 12:28:52 crc kubenswrapper[4766]: I1002 12:28:52.129240 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74ccf9c867-gkpf8"]
Oct 02 12:28:52 crc kubenswrapper[4766]: I1002 12:28:52.129382 4766 scope.go:117] "RemoveContainer" containerID="6d4e51408c276cffcfaf3adbac84c562aa256c47f785133dc063318fa50dff73"
Oct 02 12:28:52 crc kubenswrapper[4766]: I1002 12:28:52.153317 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74ccf9c867-gkpf8"]
Oct 02 12:28:52 crc kubenswrapper[4766]: I1002 12:28:52.447760 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 12:28:52 crc kubenswrapper[4766]: I1002 12:28:52.505796 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.123624 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3","Type":"ContainerStarted","Data":"2dbbf87cd680c0441e52407eb9022a4b6b45b40496472a1ff0b0691ea8c0a842"}
Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.125336 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3","Type":"ContainerStarted","Data":"19d2be01ac6fc0caa09ed23ae05ee777a6458a86de9b727fc9947c3567ae932e"}
Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.125414 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3","Type":"ContainerStarted","Data":"9081658f344ece9d55b1c13d6b640d23bab4e0d61ac10ef543b1e3fa8e1f9bb3"}
Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.127113 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f71c32de-a23d-4207-9648-dacb057372ab","Type":"ContainerStarted","Data":"c81aec9f93b6dfaec9d03b8e7192a2311d180d2caa58888c9ae7068044df95e0"}
Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.127201 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f71c32de-a23d-4207-9648-dacb057372ab","Type":"ContainerStarted","Data":"6ab85b4ea68d3c75e105af156fe9760ae4b784f14ec274e66a2e6604b8fac127"}
Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.127306 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f71c32de-a23d-4207-9648-dacb057372ab","Type":"ContainerStarted","Data":"e114a962a06c5406d7345bb4de0c48dbce92dedbbb5673c05cace4fe161596f7"}
Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.151726 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.151706602 podStartE2EDuration="2.151706602s" podCreationTimestamp="2025-10-02 12:28:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:53.148203051 +0000 UTC m=+5848.091073995" watchObservedRunningTime="2025-10-02 12:28:53.151706602 +0000 UTC m=+5848.094577536"
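In the startup-latency entry above, podStartSLOduration is simply observedRunningTime minus podCreationTimestamp; the zero-valued "0001-01-01" pull timestamps just mean no image pull was needed. A quick check of the arithmetic using the entry's own timestamps:

```go
// Hedged sketch: reproducing podStartSLOduration from the
// pod_startup_latency_tracker entry above.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, _ := time.Parse(layout, "2025-10-02 12:28:51 +0000 UTC")
	running, _ := time.Parse(layout, "2025-10-02 12:28:53.151706602 +0000 UTC")
	fmt.Println(running.Sub(created)) // 2.151706602s, matching podStartSLOduration
}
```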
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:28:53.173066667 +0000 UTC m=+5848.115937611" watchObservedRunningTime="2025-10-02 12:28:53.178246183 +0000 UTC m=+5848.121117137" Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.706264 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.814354 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfa9d25-deff-4655-a251-4583fdc8e04d-combined-ca-bundle\") pod \"3bfa9d25-deff-4655-a251-4583fdc8e04d\" (UID: \"3bfa9d25-deff-4655-a251-4583fdc8e04d\") " Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.814555 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5gg2\" (UniqueName: \"kubernetes.io/projected/3bfa9d25-deff-4655-a251-4583fdc8e04d-kube-api-access-f5gg2\") pod \"3bfa9d25-deff-4655-a251-4583fdc8e04d\" (UID: \"3bfa9d25-deff-4655-a251-4583fdc8e04d\") " Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.815799 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bfa9d25-deff-4655-a251-4583fdc8e04d-config-data\") pod \"3bfa9d25-deff-4655-a251-4583fdc8e04d\" (UID: \"3bfa9d25-deff-4655-a251-4583fdc8e04d\") " Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.822053 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bfa9d25-deff-4655-a251-4583fdc8e04d-kube-api-access-f5gg2" (OuterVolumeSpecName: "kube-api-access-f5gg2") pod "3bfa9d25-deff-4655-a251-4583fdc8e04d" (UID: "3bfa9d25-deff-4655-a251-4583fdc8e04d"). InnerVolumeSpecName "kube-api-access-f5gg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.842824 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bfa9d25-deff-4655-a251-4583fdc8e04d-config-data" (OuterVolumeSpecName: "config-data") pod "3bfa9d25-deff-4655-a251-4583fdc8e04d" (UID: "3bfa9d25-deff-4655-a251-4583fdc8e04d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.845794 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bfa9d25-deff-4655-a251-4583fdc8e04d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bfa9d25-deff-4655-a251-4583fdc8e04d" (UID: "3bfa9d25-deff-4655-a251-4583fdc8e04d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.912432 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d650726-0f54-4320-9764-c984cefe3c0b" path="/var/lib/kubelet/pods/4d650726-0f54-4320-9764-c984cefe3c0b/volumes" Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.931578 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfa9d25-deff-4655-a251-4583fdc8e04d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.931859 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5gg2\" (UniqueName: \"kubernetes.io/projected/3bfa9d25-deff-4655-a251-4583fdc8e04d-kube-api-access-f5gg2\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:53 crc kubenswrapper[4766]: I1002 12:28:53.931903 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bfa9d25-deff-4655-a251-4583fdc8e04d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.141086 4766 generic.go:334] "Generic (PLEG): container finished" podID="3bfa9d25-deff-4655-a251-4583fdc8e04d" containerID="fa2b9e811e9c0a1252eefd29ec906e477712a3e00dd7ecd33aada0a2edf7c830" exitCode=0 Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.141190 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.141220 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3bfa9d25-deff-4655-a251-4583fdc8e04d","Type":"ContainerDied","Data":"fa2b9e811e9c0a1252eefd29ec906e477712a3e00dd7ecd33aada0a2edf7c830"} Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.141346 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3bfa9d25-deff-4655-a251-4583fdc8e04d","Type":"ContainerDied","Data":"bb6411791c7d345131e0347fb38434fee5dc79a98f846147e3f6cbbe9cf7a3c0"} Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.141370 4766 scope.go:117] "RemoveContainer" containerID="fa2b9e811e9c0a1252eefd29ec906e477712a3e00dd7ecd33aada0a2edf7c830" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.174096 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.184427 4766 scope.go:117] "RemoveContainer" containerID="fa2b9e811e9c0a1252eefd29ec906e477712a3e00dd7ecd33aada0a2edf7c830" Oct 02 12:28:54 crc kubenswrapper[4766]: E1002 12:28:54.185406 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa2b9e811e9c0a1252eefd29ec906e477712a3e00dd7ecd33aada0a2edf7c830\": container with ID starting with fa2b9e811e9c0a1252eefd29ec906e477712a3e00dd7ecd33aada0a2edf7c830 not found: ID does not exist" containerID="fa2b9e811e9c0a1252eefd29ec906e477712a3e00dd7ecd33aada0a2edf7c830" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.185594 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa2b9e811e9c0a1252eefd29ec906e477712a3e00dd7ecd33aada0a2edf7c830"} err="failed to get container status \"fa2b9e811e9c0a1252eefd29ec906e477712a3e00dd7ecd33aada0a2edf7c830\": rpc error: code = NotFound desc = could not find container 
\"fa2b9e811e9c0a1252eefd29ec906e477712a3e00dd7ecd33aada0a2edf7c830\": container with ID starting with fa2b9e811e9c0a1252eefd29ec906e477712a3e00dd7ecd33aada0a2edf7c830 not found: ID does not exist" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.192006 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.204969 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:28:54 crc kubenswrapper[4766]: E1002 12:28:54.205669 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bfa9d25-deff-4655-a251-4583fdc8e04d" containerName="nova-scheduler-scheduler" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.205696 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bfa9d25-deff-4655-a251-4583fdc8e04d" containerName="nova-scheduler-scheduler" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.205975 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bfa9d25-deff-4655-a251-4583fdc8e04d" containerName="nova-scheduler-scheduler" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.206927 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.209645 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.225001 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.340173 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pf8n\" (UniqueName: \"kubernetes.io/projected/320f3be6-47d3-41b0-b3ac-8cb064be97f5-kube-api-access-5pf8n\") pod \"nova-scheduler-0\" (UID: \"320f3be6-47d3-41b0-b3ac-8cb064be97f5\") " pod="openstack/nova-scheduler-0" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.341018 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320f3be6-47d3-41b0-b3ac-8cb064be97f5-config-data\") pod \"nova-scheduler-0\" (UID: \"320f3be6-47d3-41b0-b3ac-8cb064be97f5\") " pod="openstack/nova-scheduler-0" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.341054 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/320f3be6-47d3-41b0-b3ac-8cb064be97f5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"320f3be6-47d3-41b0-b3ac-8cb064be97f5\") " pod="openstack/nova-scheduler-0" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.443045 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320f3be6-47d3-41b0-b3ac-8cb064be97f5-config-data\") pod \"nova-scheduler-0\" (UID: \"320f3be6-47d3-41b0-b3ac-8cb064be97f5\") " pod="openstack/nova-scheduler-0" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.443093 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/320f3be6-47d3-41b0-b3ac-8cb064be97f5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"320f3be6-47d3-41b0-b3ac-8cb064be97f5\") " pod="openstack/nova-scheduler-0" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 
12:28:54.443185 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pf8n\" (UniqueName: \"kubernetes.io/projected/320f3be6-47d3-41b0-b3ac-8cb064be97f5-kube-api-access-5pf8n\") pod \"nova-scheduler-0\" (UID: \"320f3be6-47d3-41b0-b3ac-8cb064be97f5\") " pod="openstack/nova-scheduler-0" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.448741 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320f3be6-47d3-41b0-b3ac-8cb064be97f5-config-data\") pod \"nova-scheduler-0\" (UID: \"320f3be6-47d3-41b0-b3ac-8cb064be97f5\") " pod="openstack/nova-scheduler-0" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.449917 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/320f3be6-47d3-41b0-b3ac-8cb064be97f5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"320f3be6-47d3-41b0-b3ac-8cb064be97f5\") " pod="openstack/nova-scheduler-0" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.462472 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pf8n\" (UniqueName: \"kubernetes.io/projected/320f3be6-47d3-41b0-b3ac-8cb064be97f5-kube-api-access-5pf8n\") pod \"nova-scheduler-0\" (UID: \"320f3be6-47d3-41b0-b3ac-8cb064be97f5\") " pod="openstack/nova-scheduler-0" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.537303 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:28:54 crc kubenswrapper[4766]: I1002 12:28:54.985478 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:28:54 crc kubenswrapper[4766]: W1002 12:28:54.993452 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod320f3be6_47d3_41b0_b3ac_8cb064be97f5.slice/crio-d3442e9de5856ae752bd8dee1531f8c252bbb23113764608353f887bafc5109b WatchSource:0}: Error finding container d3442e9de5856ae752bd8dee1531f8c252bbb23113764608353f887bafc5109b: Status 404 returned error can't find the container with id d3442e9de5856ae752bd8dee1531f8c252bbb23113764608353f887bafc5109b Oct 02 12:28:55 crc kubenswrapper[4766]: I1002 12:28:55.157619 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"320f3be6-47d3-41b0-b3ac-8cb064be97f5","Type":"ContainerStarted","Data":"d3442e9de5856ae752bd8dee1531f8c252bbb23113764608353f887bafc5109b"} Oct 02 12:28:55 crc kubenswrapper[4766]: I1002 12:28:55.893984 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bfa9d25-deff-4655-a251-4583fdc8e04d" path="/var/lib/kubelet/pods/3bfa9d25-deff-4655-a251-4583fdc8e04d/volumes" Oct 02 12:28:56 crc kubenswrapper[4766]: I1002 12:28:56.170264 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"320f3be6-47d3-41b0-b3ac-8cb064be97f5","Type":"ContainerStarted","Data":"5cfa4b6f1ac006f0bd42c8b30ce9578ad6df0955dc2818cf09df77744f842f02"} Oct 02 12:28:56 crc kubenswrapper[4766]: I1002 12:28:56.202911 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.2028409 podStartE2EDuration="2.2028409s" podCreationTimestamp="2025-10-02 12:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 
12:28:56.189521434 +0000 UTC m=+5851.132392388" watchObservedRunningTime="2025-10-02 12:28:56.2028409 +0000 UTC m=+5851.145711844" Oct 02 12:28:56 crc kubenswrapper[4766]: I1002 12:28:56.938394 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 12:28:56 crc kubenswrapper[4766]: I1002 12:28:56.938463 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 12:28:58 crc kubenswrapper[4766]: I1002 12:28:58.456137 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.037837 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-pd994"] Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.039320 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pd994" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.048707 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.048819 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.059844 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pd994"] Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.153076 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-scripts\") pod \"nova-cell1-cell-mapping-pd994\" (UID: \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\") " pod="openstack/nova-cell1-cell-mapping-pd994" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.153149 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pd994\" (UID: \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\") " pod="openstack/nova-cell1-cell-mapping-pd994" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.153183 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-config-data\") pod \"nova-cell1-cell-mapping-pd994\" (UID: \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\") " pod="openstack/nova-cell1-cell-mapping-pd994" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.153388 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpz55\" (UniqueName: \"kubernetes.io/projected/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-kube-api-access-lpz55\") pod \"nova-cell1-cell-mapping-pd994\" (UID: \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\") " pod="openstack/nova-cell1-cell-mapping-pd994" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.255747 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-config-data\") pod \"nova-cell1-cell-mapping-pd994\" (UID: \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\") " pod="openstack/nova-cell1-cell-mapping-pd994" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 
12:28:59.255873 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpz55\" (UniqueName: \"kubernetes.io/projected/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-kube-api-access-lpz55\") pod \"nova-cell1-cell-mapping-pd994\" (UID: \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\") " pod="openstack/nova-cell1-cell-mapping-pd994" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.255957 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-scripts\") pod \"nova-cell1-cell-mapping-pd994\" (UID: \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\") " pod="openstack/nova-cell1-cell-mapping-pd994" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.255991 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pd994\" (UID: \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\") " pod="openstack/nova-cell1-cell-mapping-pd994" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.263674 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-scripts\") pod \"nova-cell1-cell-mapping-pd994\" (UID: \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\") " pod="openstack/nova-cell1-cell-mapping-pd994" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.266561 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pd994\" (UID: \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\") " pod="openstack/nova-cell1-cell-mapping-pd994" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.279343 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-config-data\") pod \"nova-cell1-cell-mapping-pd994\" (UID: \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\") " pod="openstack/nova-cell1-cell-mapping-pd994" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.283371 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpz55\" (UniqueName: \"kubernetes.io/projected/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-kube-api-access-lpz55\") pod \"nova-cell1-cell-mapping-pd994\" (UID: \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\") " pod="openstack/nova-cell1-cell-mapping-pd994" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.379861 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pd994" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.537964 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 12:28:59 crc kubenswrapper[4766]: I1002 12:28:59.898174 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pd994"] Oct 02 12:28:59 crc kubenswrapper[4766]: W1002 12:28:59.908191 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b15aa28_bb1c_4f14_9aeb_06c0033ae58b.slice/crio-5cafbfe0f8747bb18527fce4e93ed92c891d4d2d07d9961d8f12901474bf628b WatchSource:0}: Error finding container 5cafbfe0f8747bb18527fce4e93ed92c891d4d2d07d9961d8f12901474bf628b: Status 404 returned error can't find the container with id 5cafbfe0f8747bb18527fce4e93ed92c891d4d2d07d9961d8f12901474bf628b Oct 02 12:29:00 crc kubenswrapper[4766]: I1002 12:29:00.230320 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pd994" event={"ID":"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b","Type":"ContainerStarted","Data":"f82976d8a1d07384533a107bd6daba5c9fe638af34a1fea8fa64be1c6c278ad2"} Oct 02 12:29:00 crc kubenswrapper[4766]: I1002 12:29:00.230758 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pd994" event={"ID":"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b","Type":"ContainerStarted","Data":"5cafbfe0f8747bb18527fce4e93ed92c891d4d2d07d9961d8f12901474bf628b"} Oct 02 12:29:00 crc kubenswrapper[4766]: I1002 12:29:00.255529 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-pd994" podStartSLOduration=1.2554869339999999 podStartE2EDuration="1.255486934s" podCreationTimestamp="2025-10-02 12:28:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:29:00.246616089 +0000 UTC m=+5855.189487053" watchObservedRunningTime="2025-10-02 12:29:00.255486934 +0000 UTC m=+5855.198357878" Oct 02 12:29:01 crc kubenswrapper[4766]: I1002 12:29:01.935022 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 12:29:01 crc kubenswrapper[4766]: I1002 12:29:01.936190 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 12:29:01 crc kubenswrapper[4766]: I1002 12:29:01.937924 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 12:29:01 crc kubenswrapper[4766]: I1002 12:29:01.938006 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 02 12:29:03 crc kubenswrapper[4766]: I1002 12:29:03.099893 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.67:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 12:29:03 crc kubenswrapper[4766]: I1002 12:29:03.099964 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f71c32de-a23d-4207-9648-dacb057372ab" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.66:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" Oct 02 12:29:03 crc kubenswrapper[4766]: I1002 12:29:03.100122 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.67:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 12:29:03 crc kubenswrapper[4766]: I1002 12:29:03.099895 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f71c32de-a23d-4207-9648-dacb057372ab" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.66:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 02 12:29:03 crc kubenswrapper[4766]: I1002 12:29:03.882071 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:29:03 crc kubenswrapper[4766]: E1002 12:29:03.883567 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:29:04 crc kubenswrapper[4766]: I1002 12:29:04.538317 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 12:29:04 crc kubenswrapper[4766]: I1002 12:29:04.572989 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 12:29:05 crc kubenswrapper[4766]: I1002 12:29:05.325611 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 02 12:29:06 crc kubenswrapper[4766]: I1002 12:29:06.305386 4766 generic.go:334] "Generic (PLEG): container finished" podID="7b15aa28-bb1c-4f14-9aeb-06c0033ae58b" containerID="f82976d8a1d07384533a107bd6daba5c9fe638af34a1fea8fa64be1c6c278ad2" exitCode=0 Oct 02 12:29:06 crc kubenswrapper[4766]: I1002 12:29:06.305563 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pd994" event={"ID":"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b","Type":"ContainerDied","Data":"f82976d8a1d07384533a107bd6daba5c9fe638af34a1fea8fa64be1c6c278ad2"} Oct 02 12:29:07 crc kubenswrapper[4766]: I1002 12:29:07.725200 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pd994" Oct 02 12:29:07 crc kubenswrapper[4766]: I1002 12:29:07.853330 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpz55\" (UniqueName: \"kubernetes.io/projected/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-kube-api-access-lpz55\") pod \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\" (UID: \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\") " Oct 02 12:29:07 crc kubenswrapper[4766]: I1002 12:29:07.853799 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-scripts\") pod \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\" (UID: \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\") " Oct 02 12:29:07 crc kubenswrapper[4766]: I1002 12:29:07.853910 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-config-data\") pod \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\" (UID: \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\") " Oct 02 12:29:07 crc kubenswrapper[4766]: I1002 12:29:07.854104 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-combined-ca-bundle\") pod \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\" (UID: \"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b\") " Oct 02 12:29:07 crc kubenswrapper[4766]: I1002 12:29:07.861563 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-scripts" (OuterVolumeSpecName: "scripts") pod "7b15aa28-bb1c-4f14-9aeb-06c0033ae58b" (UID: "7b15aa28-bb1c-4f14-9aeb-06c0033ae58b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:29:07 crc kubenswrapper[4766]: I1002 12:29:07.863555 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-kube-api-access-lpz55" (OuterVolumeSpecName: "kube-api-access-lpz55") pod "7b15aa28-bb1c-4f14-9aeb-06c0033ae58b" (UID: "7b15aa28-bb1c-4f14-9aeb-06c0033ae58b"). InnerVolumeSpecName "kube-api-access-lpz55". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:29:07 crc kubenswrapper[4766]: I1002 12:29:07.889822 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b15aa28-bb1c-4f14-9aeb-06c0033ae58b" (UID: "7b15aa28-bb1c-4f14-9aeb-06c0033ae58b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:29:07 crc kubenswrapper[4766]: I1002 12:29:07.890031 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-config-data" (OuterVolumeSpecName: "config-data") pod "7b15aa28-bb1c-4f14-9aeb-06c0033ae58b" (UID: "7b15aa28-bb1c-4f14-9aeb-06c0033ae58b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:29:07 crc kubenswrapper[4766]: I1002 12:29:07.957373 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:07 crc kubenswrapper[4766]: I1002 12:29:07.957435 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpz55\" (UniqueName: \"kubernetes.io/projected/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-kube-api-access-lpz55\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:07 crc kubenswrapper[4766]: I1002 12:29:07.957455 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:07 crc kubenswrapper[4766]: I1002 12:29:07.957469 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:08 crc kubenswrapper[4766]: I1002 12:29:08.353120 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pd994" event={"ID":"7b15aa28-bb1c-4f14-9aeb-06c0033ae58b","Type":"ContainerDied","Data":"5cafbfe0f8747bb18527fce4e93ed92c891d4d2d07d9961d8f12901474bf628b"} Oct 02 12:29:08 crc kubenswrapper[4766]: I1002 12:29:08.353623 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cafbfe0f8747bb18527fce4e93ed92c891d4d2d07d9961d8f12901474bf628b" Oct 02 12:29:08 crc kubenswrapper[4766]: I1002 12:29:08.353303 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pd994" Oct 02 12:29:08 crc kubenswrapper[4766]: I1002 12:29:08.543786 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:29:08 crc kubenswrapper[4766]: I1002 12:29:08.544243 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f71c32de-a23d-4207-9648-dacb057372ab" containerName="nova-api-log" containerID="cri-o://6ab85b4ea68d3c75e105af156fe9760ae4b784f14ec274e66a2e6604b8fac127" gracePeriod=30 Oct 02 12:29:08 crc kubenswrapper[4766]: I1002 12:29:08.544339 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f71c32de-a23d-4207-9648-dacb057372ab" containerName="nova-api-api" containerID="cri-o://c81aec9f93b6dfaec9d03b8e7192a2311d180d2caa58888c9ae7068044df95e0" gracePeriod=30 Oct 02 12:29:08 crc kubenswrapper[4766]: I1002 12:29:08.554433 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:29:08 crc kubenswrapper[4766]: I1002 12:29:08.554784 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="320f3be6-47d3-41b0-b3ac-8cb064be97f5" containerName="nova-scheduler-scheduler" containerID="cri-o://5cfa4b6f1ac006f0bd42c8b30ce9578ad6df0955dc2818cf09df77744f842f02" gracePeriod=30 Oct 02 12:29:08 crc kubenswrapper[4766]: I1002 12:29:08.568590 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:29:08 crc kubenswrapper[4766]: I1002 12:29:08.568866 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" 
containerName="nova-metadata-log" containerID="cri-o://19d2be01ac6fc0caa09ed23ae05ee777a6458a86de9b727fc9947c3567ae932e" gracePeriod=30 Oct 02 12:29:08 crc kubenswrapper[4766]: I1002 12:29:08.568997 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" containerName="nova-metadata-metadata" containerID="cri-o://2dbbf87cd680c0441e52407eb9022a4b6b45b40496472a1ff0b0691ea8c0a842" gracePeriod=30 Oct 02 12:29:09 crc kubenswrapper[4766]: I1002 12:29:09.373143 4766 generic.go:334] "Generic (PLEG): container finished" podID="f71c32de-a23d-4207-9648-dacb057372ab" containerID="6ab85b4ea68d3c75e105af156fe9760ae4b784f14ec274e66a2e6604b8fac127" exitCode=143 Oct 02 12:29:09 crc kubenswrapper[4766]: I1002 12:29:09.373203 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f71c32de-a23d-4207-9648-dacb057372ab","Type":"ContainerDied","Data":"6ab85b4ea68d3c75e105af156fe9760ae4b784f14ec274e66a2e6604b8fac127"} Oct 02 12:29:09 crc kubenswrapper[4766]: I1002 12:29:09.377476 4766 generic.go:334] "Generic (PLEG): container finished" podID="35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" containerID="19d2be01ac6fc0caa09ed23ae05ee777a6458a86de9b727fc9947c3567ae932e" exitCode=143 Oct 02 12:29:09 crc kubenswrapper[4766]: I1002 12:29:09.377552 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3","Type":"ContainerDied","Data":"19d2be01ac6fc0caa09ed23ae05ee777a6458a86de9b727fc9947c3567ae932e"} Oct 02 12:29:09 crc kubenswrapper[4766]: E1002 12:29:09.541371 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5cfa4b6f1ac006f0bd42c8b30ce9578ad6df0955dc2818cf09df77744f842f02" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 12:29:09 crc kubenswrapper[4766]: E1002 12:29:09.543497 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5cfa4b6f1ac006f0bd42c8b30ce9578ad6df0955dc2818cf09df77744f842f02" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 12:29:09 crc kubenswrapper[4766]: E1002 12:29:09.545163 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5cfa4b6f1ac006f0bd42c8b30ce9578ad6df0955dc2818cf09df77744f842f02" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 02 12:29:09 crc kubenswrapper[4766]: E1002 12:29:09.545200 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="320f3be6-47d3-41b0-b3ac-8cb064be97f5" containerName="nova-scheduler-scheduler" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.217102 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.315113 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.319086 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.368765 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pf8n\" (UniqueName: \"kubernetes.io/projected/320f3be6-47d3-41b0-b3ac-8cb064be97f5-kube-api-access-5pf8n\") pod \"320f3be6-47d3-41b0-b3ac-8cb064be97f5\" (UID: \"320f3be6-47d3-41b0-b3ac-8cb064be97f5\") " Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.369070 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320f3be6-47d3-41b0-b3ac-8cb064be97f5-config-data\") pod \"320f3be6-47d3-41b0-b3ac-8cb064be97f5\" (UID: \"320f3be6-47d3-41b0-b3ac-8cb064be97f5\") " Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.369271 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/320f3be6-47d3-41b0-b3ac-8cb064be97f5-combined-ca-bundle\") pod \"320f3be6-47d3-41b0-b3ac-8cb064be97f5\" (UID: \"320f3be6-47d3-41b0-b3ac-8cb064be97f5\") " Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.376303 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320f3be6-47d3-41b0-b3ac-8cb064be97f5-kube-api-access-5pf8n" (OuterVolumeSpecName: "kube-api-access-5pf8n") pod "320f3be6-47d3-41b0-b3ac-8cb064be97f5" (UID: "320f3be6-47d3-41b0-b3ac-8cb064be97f5"). InnerVolumeSpecName "kube-api-access-5pf8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.400273 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/320f3be6-47d3-41b0-b3ac-8cb064be97f5-config-data" (OuterVolumeSpecName: "config-data") pod "320f3be6-47d3-41b0-b3ac-8cb064be97f5" (UID: "320f3be6-47d3-41b0-b3ac-8cb064be97f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.409348 4766 generic.go:334] "Generic (PLEG): container finished" podID="f71c32de-a23d-4207-9648-dacb057372ab" containerID="c81aec9f93b6dfaec9d03b8e7192a2311d180d2caa58888c9ae7068044df95e0" exitCode=0 Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.409445 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f71c32de-a23d-4207-9648-dacb057372ab","Type":"ContainerDied","Data":"c81aec9f93b6dfaec9d03b8e7192a2311d180d2caa58888c9ae7068044df95e0"} Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.409482 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f71c32de-a23d-4207-9648-dacb057372ab","Type":"ContainerDied","Data":"e114a962a06c5406d7345bb4de0c48dbce92dedbbb5673c05cace4fe161596f7"} Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.409510 4766 scope.go:117] "RemoveContainer" containerID="c81aec9f93b6dfaec9d03b8e7192a2311d180d2caa58888c9ae7068044df95e0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.409745 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.413866 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/320f3be6-47d3-41b0-b3ac-8cb064be97f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "320f3be6-47d3-41b0-b3ac-8cb064be97f5" (UID: "320f3be6-47d3-41b0-b3ac-8cb064be97f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.415609 4766 generic.go:334] "Generic (PLEG): container finished" podID="35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" containerID="2dbbf87cd680c0441e52407eb9022a4b6b45b40496472a1ff0b0691ea8c0a842" exitCode=0 Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.415715 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3","Type":"ContainerDied","Data":"2dbbf87cd680c0441e52407eb9022a4b6b45b40496472a1ff0b0691ea8c0a842"} Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.415748 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3","Type":"ContainerDied","Data":"9081658f344ece9d55b1c13d6b640d23bab4e0d61ac10ef543b1e3fa8e1f9bb3"} Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.415833 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.423228 4766 generic.go:334] "Generic (PLEG): container finished" podID="320f3be6-47d3-41b0-b3ac-8cb064be97f5" containerID="5cfa4b6f1ac006f0bd42c8b30ce9578ad6df0955dc2818cf09df77744f842f02" exitCode=0 Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.423685 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"320f3be6-47d3-41b0-b3ac-8cb064be97f5","Type":"ContainerDied","Data":"5cfa4b6f1ac006f0bd42c8b30ce9578ad6df0955dc2818cf09df77744f842f02"} Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.423717 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"320f3be6-47d3-41b0-b3ac-8cb064be97f5","Type":"ContainerDied","Data":"d3442e9de5856ae752bd8dee1531f8c252bbb23113764608353f887bafc5109b"} Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.423783 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.445991 4766 scope.go:117] "RemoveContainer" containerID="6ab85b4ea68d3c75e105af156fe9760ae4b784f14ec274e66a2e6604b8fac127" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.466672 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.472112 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f71c32de-a23d-4207-9648-dacb057372ab-logs\") pod \"f71c32de-a23d-4207-9648-dacb057372ab\" (UID: \"f71c32de-a23d-4207-9648-dacb057372ab\") " Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.472169 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd5nb\" (UniqueName: \"kubernetes.io/projected/f71c32de-a23d-4207-9648-dacb057372ab-kube-api-access-sd5nb\") pod \"f71c32de-a23d-4207-9648-dacb057372ab\" (UID: \"f71c32de-a23d-4207-9648-dacb057372ab\") " Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.472234 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-combined-ca-bundle\") pod \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\" (UID: \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\") " Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.472337 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71c32de-a23d-4207-9648-dacb057372ab-config-data\") pod \"f71c32de-a23d-4207-9648-dacb057372ab\" (UID: \"f71c32de-a23d-4207-9648-dacb057372ab\") " Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.472434 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-logs\") pod \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\" (UID: \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\") " Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.472571 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71c32de-a23d-4207-9648-dacb057372ab-combined-ca-bundle\") pod \"f71c32de-a23d-4207-9648-dacb057372ab\" (UID: \"f71c32de-a23d-4207-9648-dacb057372ab\") " Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.472617 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-config-data\") pod \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\" (UID: \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\") " Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.472774 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g95x\" (UniqueName: \"kubernetes.io/projected/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-kube-api-access-9g95x\") pod \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\" (UID: \"35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3\") " Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.472767 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f71c32de-a23d-4207-9648-dacb057372ab-logs" (OuterVolumeSpecName: "logs") pod "f71c32de-a23d-4207-9648-dacb057372ab" (UID: "f71c32de-a23d-4207-9648-dacb057372ab"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.473795 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f71c32de-a23d-4207-9648-dacb057372ab-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.473816 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/320f3be6-47d3-41b0-b3ac-8cb064be97f5-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.473828 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/320f3be6-47d3-41b0-b3ac-8cb064be97f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.473840 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pf8n\" (UniqueName: \"kubernetes.io/projected/320f3be6-47d3-41b0-b3ac-8cb064be97f5-kube-api-access-5pf8n\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.476059 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-logs" (OuterVolumeSpecName: "logs") pod "35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" (UID: "35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.481804 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.485401 4766 scope.go:117] "RemoveContainer" containerID="c81aec9f93b6dfaec9d03b8e7192a2311d180d2caa58888c9ae7068044df95e0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.487416 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-kube-api-access-9g95x" (OuterVolumeSpecName: "kube-api-access-9g95x") pod "35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" (UID: "35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3"). InnerVolumeSpecName "kube-api-access-9g95x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.492303 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f71c32de-a23d-4207-9648-dacb057372ab-kube-api-access-sd5nb" (OuterVolumeSpecName: "kube-api-access-sd5nb") pod "f71c32de-a23d-4207-9648-dacb057372ab" (UID: "f71c32de-a23d-4207-9648-dacb057372ab"). InnerVolumeSpecName "kube-api-access-sd5nb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:29:12 crc kubenswrapper[4766]: E1002 12:29:12.493238 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c81aec9f93b6dfaec9d03b8e7192a2311d180d2caa58888c9ae7068044df95e0\": container with ID starting with c81aec9f93b6dfaec9d03b8e7192a2311d180d2caa58888c9ae7068044df95e0 not found: ID does not exist" containerID="c81aec9f93b6dfaec9d03b8e7192a2311d180d2caa58888c9ae7068044df95e0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.493304 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c81aec9f93b6dfaec9d03b8e7192a2311d180d2caa58888c9ae7068044df95e0"} err="failed to get container status \"c81aec9f93b6dfaec9d03b8e7192a2311d180d2caa58888c9ae7068044df95e0\": rpc error: code = NotFound desc = could not find container \"c81aec9f93b6dfaec9d03b8e7192a2311d180d2caa58888c9ae7068044df95e0\": container with ID starting with c81aec9f93b6dfaec9d03b8e7192a2311d180d2caa58888c9ae7068044df95e0 not found: ID does not exist" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.493338 4766 scope.go:117] "RemoveContainer" containerID="6ab85b4ea68d3c75e105af156fe9760ae4b784f14ec274e66a2e6604b8fac127" Oct 02 12:29:12 crc kubenswrapper[4766]: E1002 12:29:12.493721 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab85b4ea68d3c75e105af156fe9760ae4b784f14ec274e66a2e6604b8fac127\": container with ID starting with 6ab85b4ea68d3c75e105af156fe9760ae4b784f14ec274e66a2e6604b8fac127 not found: ID does not exist" containerID="6ab85b4ea68d3c75e105af156fe9760ae4b784f14ec274e66a2e6604b8fac127" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.493742 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab85b4ea68d3c75e105af156fe9760ae4b784f14ec274e66a2e6604b8fac127"} err="failed to get container status \"6ab85b4ea68d3c75e105af156fe9760ae4b784f14ec274e66a2e6604b8fac127\": rpc error: code = NotFound desc = could not find container \"6ab85b4ea68d3c75e105af156fe9760ae4b784f14ec274e66a2e6604b8fac127\": container with ID starting with 6ab85b4ea68d3c75e105af156fe9760ae4b784f14ec274e66a2e6604b8fac127 not found: ID does not exist" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.493755 4766 scope.go:117] "RemoveContainer" containerID="2dbbf87cd680c0441e52407eb9022a4b6b45b40496472a1ff0b0691ea8c0a842" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.501536 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:29:12 crc kubenswrapper[4766]: E1002 12:29:12.502100 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b15aa28-bb1c-4f14-9aeb-06c0033ae58b" containerName="nova-manage" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.502126 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b15aa28-bb1c-4f14-9aeb-06c0033ae58b" containerName="nova-manage" Oct 02 12:29:12 crc kubenswrapper[4766]: E1002 12:29:12.502155 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71c32de-a23d-4207-9648-dacb057372ab" containerName="nova-api-log" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.502164 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71c32de-a23d-4207-9648-dacb057372ab" containerName="nova-api-log" Oct 02 12:29:12 crc kubenswrapper[4766]: E1002 12:29:12.502195 4766 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" containerName="nova-metadata-log" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.502204 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" containerName="nova-metadata-log" Oct 02 12:29:12 crc kubenswrapper[4766]: E1002 12:29:12.502226 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71c32de-a23d-4207-9648-dacb057372ab" containerName="nova-api-api" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.502234 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71c32de-a23d-4207-9648-dacb057372ab" containerName="nova-api-api" Oct 02 12:29:12 crc kubenswrapper[4766]: E1002 12:29:12.502243 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" containerName="nova-metadata-metadata" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.502251 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" containerName="nova-metadata-metadata" Oct 02 12:29:12 crc kubenswrapper[4766]: E1002 12:29:12.502259 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320f3be6-47d3-41b0-b3ac-8cb064be97f5" containerName="nova-scheduler-scheduler" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.502267 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="320f3be6-47d3-41b0-b3ac-8cb064be97f5" containerName="nova-scheduler-scheduler" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.502548 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" containerName="nova-metadata-log" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.502571 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b15aa28-bb1c-4f14-9aeb-06c0033ae58b" containerName="nova-manage" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.502585 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="320f3be6-47d3-41b0-b3ac-8cb064be97f5" containerName="nova-scheduler-scheduler" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.502606 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" containerName="nova-metadata-metadata" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.502620 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f71c32de-a23d-4207-9648-dacb057372ab" containerName="nova-api-api" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.502632 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f71c32de-a23d-4207-9648-dacb057372ab" containerName="nova-api-log" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.505976 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.508177 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-config-data" (OuterVolumeSpecName: "config-data") pod "35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" (UID: "35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.510944 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.516431 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71c32de-a23d-4207-9648-dacb057372ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f71c32de-a23d-4207-9648-dacb057372ab" (UID: "f71c32de-a23d-4207-9648-dacb057372ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.522489 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71c32de-a23d-4207-9648-dacb057372ab-config-data" (OuterVolumeSpecName: "config-data") pod "f71c32de-a23d-4207-9648-dacb057372ab" (UID: "f71c32de-a23d-4207-9648-dacb057372ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.533932 4766 scope.go:117] "RemoveContainer" containerID="19d2be01ac6fc0caa09ed23ae05ee777a6458a86de9b727fc9947c3567ae932e" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.534400 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.536314 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" (UID: "35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.562418 4766 scope.go:117] "RemoveContainer" containerID="2dbbf87cd680c0441e52407eb9022a4b6b45b40496472a1ff0b0691ea8c0a842" Oct 02 12:29:12 crc kubenswrapper[4766]: E1002 12:29:12.563338 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dbbf87cd680c0441e52407eb9022a4b6b45b40496472a1ff0b0691ea8c0a842\": container with ID starting with 2dbbf87cd680c0441e52407eb9022a4b6b45b40496472a1ff0b0691ea8c0a842 not found: ID does not exist" containerID="2dbbf87cd680c0441e52407eb9022a4b6b45b40496472a1ff0b0691ea8c0a842" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.563378 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dbbf87cd680c0441e52407eb9022a4b6b45b40496472a1ff0b0691ea8c0a842"} err="failed to get container status \"2dbbf87cd680c0441e52407eb9022a4b6b45b40496472a1ff0b0691ea8c0a842\": rpc error: code = NotFound desc = could not find container \"2dbbf87cd680c0441e52407eb9022a4b6b45b40496472a1ff0b0691ea8c0a842\": container with ID starting with 2dbbf87cd680c0441e52407eb9022a4b6b45b40496472a1ff0b0691ea8c0a842 not found: ID does not exist" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.563411 4766 scope.go:117] "RemoveContainer" containerID="19d2be01ac6fc0caa09ed23ae05ee777a6458a86de9b727fc9947c3567ae932e" Oct 02 12:29:12 crc kubenswrapper[4766]: E1002 12:29:12.563799 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19d2be01ac6fc0caa09ed23ae05ee777a6458a86de9b727fc9947c3567ae932e\": container with ID starting with 19d2be01ac6fc0caa09ed23ae05ee777a6458a86de9b727fc9947c3567ae932e not found: ID does not exist" containerID="19d2be01ac6fc0caa09ed23ae05ee777a6458a86de9b727fc9947c3567ae932e" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.563847 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d2be01ac6fc0caa09ed23ae05ee777a6458a86de9b727fc9947c3567ae932e"} err="failed to get container status \"19d2be01ac6fc0caa09ed23ae05ee777a6458a86de9b727fc9947c3567ae932e\": rpc error: code = NotFound desc = could not find container \"19d2be01ac6fc0caa09ed23ae05ee777a6458a86de9b727fc9947c3567ae932e\": container with ID starting with 19d2be01ac6fc0caa09ed23ae05ee777a6458a86de9b727fc9947c3567ae932e not found: ID does not exist" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.563879 4766 scope.go:117] "RemoveContainer" containerID="5cfa4b6f1ac006f0bd42c8b30ce9578ad6df0955dc2818cf09df77744f842f02" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.575786 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71c32de-a23d-4207-9648-dacb057372ab-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.575819 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.575833 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71c32de-a23d-4207-9648-dacb057372ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.575846 4766 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.575858 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g95x\" (UniqueName: \"kubernetes.io/projected/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-kube-api-access-9g95x\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.575869 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd5nb\" (UniqueName: \"kubernetes.io/projected/f71c32de-a23d-4207-9648-dacb057372ab-kube-api-access-sd5nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.575879 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.591375 4766 scope.go:117] "RemoveContainer" containerID="5cfa4b6f1ac006f0bd42c8b30ce9578ad6df0955dc2818cf09df77744f842f02" Oct 02 12:29:12 crc kubenswrapper[4766]: E1002 12:29:12.591871 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfa4b6f1ac006f0bd42c8b30ce9578ad6df0955dc2818cf09df77744f842f02\": container with ID starting with 5cfa4b6f1ac006f0bd42c8b30ce9578ad6df0955dc2818cf09df77744f842f02 not found: ID does not exist" containerID="5cfa4b6f1ac006f0bd42c8b30ce9578ad6df0955dc2818cf09df77744f842f02" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.591901 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfa4b6f1ac006f0bd42c8b30ce9578ad6df0955dc2818cf09df77744f842f02"} err="failed to get container status \"5cfa4b6f1ac006f0bd42c8b30ce9578ad6df0955dc2818cf09df77744f842f02\": rpc error: code = NotFound desc = could not find container \"5cfa4b6f1ac006f0bd42c8b30ce9578ad6df0955dc2818cf09df77744f842f02\": container with ID starting with 5cfa4b6f1ac006f0bd42c8b30ce9578ad6df0955dc2818cf09df77744f842f02 not found: ID does not exist" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.677771 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz5hw\" (UniqueName: \"kubernetes.io/projected/fa2f5e53-8f06-4248-a150-6a98286af063-kube-api-access-gz5hw\") pod \"nova-scheduler-0\" (UID: \"fa2f5e53-8f06-4248-a150-6a98286af063\") " pod="openstack/nova-scheduler-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.678215 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa2f5e53-8f06-4248-a150-6a98286af063-config-data\") pod \"nova-scheduler-0\" (UID: \"fa2f5e53-8f06-4248-a150-6a98286af063\") " pod="openstack/nova-scheduler-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.678316 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2f5e53-8f06-4248-a150-6a98286af063-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fa2f5e53-8f06-4248-a150-6a98286af063\") " pod="openstack/nova-scheduler-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.748550 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-0"] Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.766015 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.780416 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.781622 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2f5e53-8f06-4248-a150-6a98286af063-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fa2f5e53-8f06-4248-a150-6a98286af063\") " pod="openstack/nova-scheduler-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.781682 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gz5hw\" (UniqueName: \"kubernetes.io/projected/fa2f5e53-8f06-4248-a150-6a98286af063-kube-api-access-gz5hw\") pod \"nova-scheduler-0\" (UID: \"fa2f5e53-8f06-4248-a150-6a98286af063\") " pod="openstack/nova-scheduler-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.781804 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa2f5e53-8f06-4248-a150-6a98286af063-config-data\") pod \"nova-scheduler-0\" (UID: \"fa2f5e53-8f06-4248-a150-6a98286af063\") " pod="openstack/nova-scheduler-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.790800 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa2f5e53-8f06-4248-a150-6a98286af063-config-data\") pod \"nova-scheduler-0\" (UID: \"fa2f5e53-8f06-4248-a150-6a98286af063\") " pod="openstack/nova-scheduler-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.790900 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.798420 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2f5e53-8f06-4248-a150-6a98286af063-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fa2f5e53-8f06-4248-a150-6a98286af063\") " pod="openstack/nova-scheduler-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.812277 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gz5hw\" (UniqueName: \"kubernetes.io/projected/fa2f5e53-8f06-4248-a150-6a98286af063-kube-api-access-gz5hw\") pod \"nova-scheduler-0\" (UID: \"fa2f5e53-8f06-4248-a150-6a98286af063\") " pod="openstack/nova-scheduler-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.814305 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.821921 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.825153 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.827857 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.839913 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.845320 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.847040 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.852895 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.890161 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.986742 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-logs\") pod \"nova-metadata-0\" (UID: \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\") " pod="openstack/nova-metadata-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.995651 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwtm2\" (UniqueName: \"kubernetes.io/projected/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-kube-api-access-jwtm2\") pod \"nova-metadata-0\" (UID: \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\") " pod="openstack/nova-metadata-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.995767 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a0fea6f-e88f-4bd5-935a-b55869a434d0-logs\") pod \"nova-api-0\" (UID: \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\") " pod="openstack/nova-api-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.995866 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\") " pod="openstack/nova-metadata-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.995929 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a0fea6f-e88f-4bd5-935a-b55869a434d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\") " pod="openstack/nova-api-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.995974 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a0fea6f-e88f-4bd5-935a-b55869a434d0-config-data\") pod \"nova-api-0\" (UID: \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\") " pod="openstack/nova-api-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.996093 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bsrs\" (UniqueName: \"kubernetes.io/projected/1a0fea6f-e88f-4bd5-935a-b55869a434d0-kube-api-access-7bsrs\") pod \"nova-api-0\" (UID: \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\") " pod="openstack/nova-api-0" Oct 02 12:29:12 crc kubenswrapper[4766]: I1002 12:29:12.996132 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-config-data\") pod \"nova-metadata-0\" (UID: \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\") " pod="openstack/nova-metadata-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.097456 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bsrs\" (UniqueName: \"kubernetes.io/projected/1a0fea6f-e88f-4bd5-935a-b55869a434d0-kube-api-access-7bsrs\") pod \"nova-api-0\" (UID: \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\") " pod="openstack/nova-api-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.097579 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-config-data\") pod \"nova-metadata-0\" (UID: \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\") " pod="openstack/nova-metadata-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.097629 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-logs\") pod \"nova-metadata-0\" (UID: \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\") " pod="openstack/nova-metadata-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.097648 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwtm2\" (UniqueName: \"kubernetes.io/projected/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-kube-api-access-jwtm2\") pod \"nova-metadata-0\" (UID: \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\") " pod="openstack/nova-metadata-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.097686 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a0fea6f-e88f-4bd5-935a-b55869a434d0-logs\") pod \"nova-api-0\" (UID: \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\") " pod="openstack/nova-api-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.097724 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\") " pod="openstack/nova-metadata-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.097800 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a0fea6f-e88f-4bd5-935a-b55869a434d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\") " pod="openstack/nova-api-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.097857 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a0fea6f-e88f-4bd5-935a-b55869a434d0-config-data\") pod \"nova-api-0\" (UID: \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\") " pod="openstack/nova-api-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.100334 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a0fea6f-e88f-4bd5-935a-b55869a434d0-logs\") pod \"nova-api-0\" (UID: \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\") " pod="openstack/nova-api-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.100769 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-logs\") pod \"nova-metadata-0\" (UID: \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\") " pod="openstack/nova-metadata-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.105614 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\") " pod="openstack/nova-metadata-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.109254 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a0fea6f-e88f-4bd5-935a-b55869a434d0-config-data\") pod \"nova-api-0\" (UID: \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\") " pod="openstack/nova-api-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.115838 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwtm2\" (UniqueName: \"kubernetes.io/projected/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-kube-api-access-jwtm2\") pod \"nova-metadata-0\" (UID: \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\") " pod="openstack/nova-metadata-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.117186 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-config-data\") pod \"nova-metadata-0\" (UID: \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\") " pod="openstack/nova-metadata-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.118371 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a0fea6f-e88f-4bd5-935a-b55869a434d0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\") " pod="openstack/nova-api-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.122108 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bsrs\" (UniqueName: \"kubernetes.io/projected/1a0fea6f-e88f-4bd5-935a-b55869a434d0-kube-api-access-7bsrs\") pod \"nova-api-0\" (UID: \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\") " pod="openstack/nova-api-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.296628 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.311203 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.311634 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.445736 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fa2f5e53-8f06-4248-a150-6a98286af063","Type":"ContainerStarted","Data":"176a5a58778acd88cd46145dc01f6fe8462034af85101f400fd00576702af9e0"} Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.793720 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.803402 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.897472 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="320f3be6-47d3-41b0-b3ac-8cb064be97f5" path="/var/lib/kubelet/pods/320f3be6-47d3-41b0-b3ac-8cb064be97f5/volumes" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.898827 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3" path="/var/lib/kubelet/pods/35cf8e4d-8239-4ff7-9b2d-4a964f2e53d3/volumes" Oct 02 12:29:13 crc kubenswrapper[4766]: I1002 12:29:13.902152 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f71c32de-a23d-4207-9648-dacb057372ab" path="/var/lib/kubelet/pods/f71c32de-a23d-4207-9648-dacb057372ab/volumes" Oct 02 12:29:14 crc kubenswrapper[4766]: I1002 12:29:14.459937 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5","Type":"ContainerStarted","Data":"8cdaa629e8ce2f86c0bda824a55d1668e40e7731b60f0274cf01b989dd2acbf8"} Oct 02 12:29:14 crc kubenswrapper[4766]: I1002 12:29:14.460367 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5","Type":"ContainerStarted","Data":"96afa1d405dd9285accc70f49b7ece06af972846d482100dfcf7868f75129cfc"} Oct 02 12:29:14 crc kubenswrapper[4766]: I1002 12:29:14.460381 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5","Type":"ContainerStarted","Data":"0ebda6bb4e136c60917c5d589d1bdd2248ed060426aa8ec1150e8b25da1b6819"} Oct 02 12:29:14 crc kubenswrapper[4766]: I1002 12:29:14.464936 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fa2f5e53-8f06-4248-a150-6a98286af063","Type":"ContainerStarted","Data":"a247b068558cdfd3750ed66585d2668ad6a4593006df480bb2d08dd9f7ae1aae"} Oct 02 12:29:14 crc kubenswrapper[4766]: I1002 12:29:14.468393 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a0fea6f-e88f-4bd5-935a-b55869a434d0","Type":"ContainerStarted","Data":"052f6c1890c8d89aa97184536b47f034997d104512e606207e1b720b6be62a39"} Oct 02 12:29:14 crc kubenswrapper[4766]: I1002 12:29:14.468464 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a0fea6f-e88f-4bd5-935a-b55869a434d0","Type":"ContainerStarted","Data":"e5db32e650fc8416620a5b9ac5433257777a38aef44ecfa4d19022695c9bc169"} Oct 02 12:29:14 crc kubenswrapper[4766]: I1002 12:29:14.468479 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"1a0fea6f-e88f-4bd5-935a-b55869a434d0","Type":"ContainerStarted","Data":"e3188d2d994a0d7c2fe9386304fde0f24460bbc196e1e5e0a6cae480c84a2f65"} Oct 02 12:29:14 crc kubenswrapper[4766]: I1002 12:29:14.493559 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.493509449 podStartE2EDuration="2.493509449s" podCreationTimestamp="2025-10-02 12:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:29:14.478074526 +0000 UTC m=+5869.420945480" watchObservedRunningTime="2025-10-02 12:29:14.493509449 +0000 UTC m=+5869.436380393" Oct 02 12:29:14 crc kubenswrapper[4766]: I1002 12:29:14.503797 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.503776308 podStartE2EDuration="2.503776308s" podCreationTimestamp="2025-10-02 12:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:29:14.497626252 +0000 UTC m=+5869.440497206" watchObservedRunningTime="2025-10-02 12:29:14.503776308 +0000 UTC m=+5869.446647252" Oct 02 12:29:17 crc kubenswrapper[4766]: I1002 12:29:17.840926 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 02 12:29:17 crc kubenswrapper[4766]: I1002 12:29:17.887240 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:29:17 crc kubenswrapper[4766]: E1002 12:29:17.887524 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:29:18 crc kubenswrapper[4766]: I1002 12:29:18.312362 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 12:29:18 crc kubenswrapper[4766]: I1002 12:29:18.312415 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 02 12:29:22 crc kubenswrapper[4766]: I1002 12:29:22.841443 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 02 12:29:22 crc kubenswrapper[4766]: I1002 12:29:22.868302 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 02 12:29:22 crc kubenswrapper[4766]: I1002 12:29:22.888888 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=10.888864671 podStartE2EDuration="10.888864671s" podCreationTimestamp="2025-10-02 12:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:29:14.512127275 +0000 UTC m=+5869.454998239" watchObservedRunningTime="2025-10-02 12:29:22.888864671 +0000 UTC m=+5877.831735615" Oct 02 12:29:23 crc kubenswrapper[4766]: I1002 12:29:23.297594 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 02 12:29:23 crc kubenswrapper[4766]: I1002 
Oct 02 12:29:23 crc kubenswrapper[4766]: I1002 12:29:23.297746 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 02 12:29:23 crc kubenswrapper[4766]: I1002 12:29:23.312290 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 02 12:29:23 crc kubenswrapper[4766]: I1002 12:29:23.312356 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 02 12:29:23 crc kubenswrapper[4766]: I1002 12:29:23.580587 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 02 12:29:24 crc kubenswrapper[4766]: I1002 12:29:24.462851 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.72:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 12:29:24 crc kubenswrapper[4766]: I1002 12:29:24.462858 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1a0fea6f-e88f-4bd5-935a-b55869a434d0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 12:29:24 crc kubenswrapper[4766]: I1002 12:29:24.462941 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1a0fea6f-e88f-4bd5-935a-b55869a434d0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.71:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 12:29:24 crc kubenswrapper[4766]: I1002 12:29:24.462979 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.72:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 12:29:30 crc kubenswrapper[4766]: I1002 12:29:30.881719 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984"
Oct 02 12:29:30 crc kubenswrapper[4766]: E1002 12:29:30.882661 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 12:29:33 crc kubenswrapper[4766]: I1002 12:29:33.302756 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 02 12:29:33 crc kubenswrapper[4766]: I1002 12:29:33.304220 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 02 12:29:33 crc kubenswrapper[4766]: I1002 12:29:33.307551 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Oct 02 12:29:33 crc kubenswrapper[4766]: I1002 12:29:33.307935 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 02 12:29:33 crc kubenswrapper[4766]: I1002 12:29:33.316636 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 02 12:29:33 crc kubenswrapper[4766]: I1002 12:29:33.318870 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Oct 02 12:29:33 crc kubenswrapper[4766]: I1002 12:29:33.322967 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 02 12:29:33 crc kubenswrapper[4766]: I1002 12:29:33.659933 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Oct 02 12:29:33 crc kubenswrapper[4766]: I1002 12:29:33.662850 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Oct 02 12:29:33 crc kubenswrapper[4766]: I1002 12:29:33.665258 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Oct 02 12:29:33 crc kubenswrapper[4766]: I1002 12:29:33.930530 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5db468fcd9-hx5kk"]
Oct 02 12:29:33 crc kubenswrapper[4766]: I1002 12:29:33.933004 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk"
Oct 02 12:29:33 crc kubenswrapper[4766]: I1002 12:29:33.955337 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5db468fcd9-hx5kk"]
Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.046810 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chbvm\" (UniqueName: \"kubernetes.io/projected/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-kube-api-access-chbvm\") pod \"dnsmasq-dns-5db468fcd9-hx5kk\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk"
Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.046875 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-dns-svc\") pod \"dnsmasq-dns-5db468fcd9-hx5kk\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk"
Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.046924 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-config\") pod \"dnsmasq-dns-5db468fcd9-hx5kk\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk"
Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.047333 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-ovsdbserver-nb\") pod \"dnsmasq-dns-5db468fcd9-hx5kk\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk"
Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.047381 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-ovsdbserver-sb\") pod \"dnsmasq-dns-5db468fcd9-hx5kk\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk"
Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.149309 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-ovsdbserver-nb\") pod \"dnsmasq-dns-5db468fcd9-hx5kk\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk"
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-ovsdbserver-nb\") pod \"dnsmasq-dns-5db468fcd9-hx5kk\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.149715 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-ovsdbserver-sb\") pod \"dnsmasq-dns-5db468fcd9-hx5kk\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.149781 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chbvm\" (UniqueName: \"kubernetes.io/projected/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-kube-api-access-chbvm\") pod \"dnsmasq-dns-5db468fcd9-hx5kk\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.149810 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-dns-svc\") pod \"dnsmasq-dns-5db468fcd9-hx5kk\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.149838 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-config\") pod \"dnsmasq-dns-5db468fcd9-hx5kk\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.150587 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-ovsdbserver-nb\") pod \"dnsmasq-dns-5db468fcd9-hx5kk\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.150741 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-config\") pod \"dnsmasq-dns-5db468fcd9-hx5kk\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.151515 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-dns-svc\") pod \"dnsmasq-dns-5db468fcd9-hx5kk\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.151590 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-ovsdbserver-sb\") pod \"dnsmasq-dns-5db468fcd9-hx5kk\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.174893 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chbvm\" (UniqueName: \"kubernetes.io/projected/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-kube-api-access-chbvm\") pod 
\"dnsmasq-dns-5db468fcd9-hx5kk\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.270805 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" Oct 02 12:29:34 crc kubenswrapper[4766]: I1002 12:29:34.701393 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5db468fcd9-hx5kk"] Oct 02 12:29:35 crc kubenswrapper[4766]: I1002 12:29:35.691122 4766 generic.go:334] "Generic (PLEG): container finished" podID="2c81e7a7-498f-47d8-9a53-6e25bbbcb19d" containerID="8bce77a94b6ebd8897ddfa6d6d3fee3d3c1df00330b09945ee8dd896c6b85ab7" exitCode=0 Oct 02 12:29:35 crc kubenswrapper[4766]: I1002 12:29:35.691287 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" event={"ID":"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d","Type":"ContainerDied","Data":"8bce77a94b6ebd8897ddfa6d6d3fee3d3c1df00330b09945ee8dd896c6b85ab7"} Oct 02 12:29:35 crc kubenswrapper[4766]: I1002 12:29:35.691626 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" event={"ID":"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d","Type":"ContainerStarted","Data":"37e3953bab6b7be17079dde1983a385ae266d2fb854dfc2480a635a12a03c4dd"} Oct 02 12:29:36 crc kubenswrapper[4766]: I1002 12:29:36.704827 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" event={"ID":"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d","Type":"ContainerStarted","Data":"da44561f601cd1c3bff4ac35561fd2fb1764e34645da4a7eb2fa036e31f05826"} Oct 02 12:29:36 crc kubenswrapper[4766]: I1002 12:29:36.705149 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" Oct 02 12:29:36 crc kubenswrapper[4766]: I1002 12:29:36.732666 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" podStartSLOduration=3.732638504 podStartE2EDuration="3.732638504s" podCreationTimestamp="2025-10-02 12:29:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:29:36.722671066 +0000 UTC m=+5891.665542030" watchObservedRunningTime="2025-10-02 12:29:36.732638504 +0000 UTC m=+5891.675509448" Oct 02 12:29:42 crc kubenswrapper[4766]: I1002 12:29:42.882672 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:29:42 crc kubenswrapper[4766]: E1002 12:29:42.883801 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:29:44 crc kubenswrapper[4766]: I1002 12:29:44.272884 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" Oct 02 12:29:44 crc kubenswrapper[4766]: I1002 12:29:44.337254 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-874ccdbdf-52m9h"] Oct 02 12:29:44 crc kubenswrapper[4766]: I1002 12:29:44.341001 4766 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/dnsmasq-dns-874ccdbdf-52m9h" podUID="001f22a0-4fb3-4795-b1aa-9ac191cab329" containerName="dnsmasq-dns" containerID="cri-o://4c4127f2a45399f5cdee38356dc3c43bb6e30e98c139aeaaaff76896baed8b62" gracePeriod=10 Oct 02 12:29:44 crc kubenswrapper[4766]: I1002 12:29:44.823902 4766 generic.go:334] "Generic (PLEG): container finished" podID="001f22a0-4fb3-4795-b1aa-9ac191cab329" containerID="4c4127f2a45399f5cdee38356dc3c43bb6e30e98c139aeaaaff76896baed8b62" exitCode=0 Oct 02 12:29:44 crc kubenswrapper[4766]: I1002 12:29:44.824313 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-874ccdbdf-52m9h" event={"ID":"001f22a0-4fb3-4795-b1aa-9ac191cab329","Type":"ContainerDied","Data":"4c4127f2a45399f5cdee38356dc3c43bb6e30e98c139aeaaaff76896baed8b62"} Oct 02 12:29:44 crc kubenswrapper[4766]: I1002 12:29:44.824347 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-874ccdbdf-52m9h" event={"ID":"001f22a0-4fb3-4795-b1aa-9ac191cab329","Type":"ContainerDied","Data":"a2349f05da5257edc3717afc222b6139472169407bd696198096046a281f66a8"} Oct 02 12:29:44 crc kubenswrapper[4766]: I1002 12:29:44.824360 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2349f05da5257edc3717afc222b6139472169407bd696198096046a281f66a8" Oct 02 12:29:44 crc kubenswrapper[4766]: I1002 12:29:44.843536 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-874ccdbdf-52m9h" Oct 02 12:29:44 crc kubenswrapper[4766]: I1002 12:29:44.988907 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-dns-svc\") pod \"001f22a0-4fb3-4795-b1aa-9ac191cab329\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " Oct 02 12:29:44 crc kubenswrapper[4766]: I1002 12:29:44.989086 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-ovsdbserver-nb\") pod \"001f22a0-4fb3-4795-b1aa-9ac191cab329\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " Oct 02 12:29:44 crc kubenswrapper[4766]: I1002 12:29:44.989116 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-config\") pod \"001f22a0-4fb3-4795-b1aa-9ac191cab329\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " Oct 02 12:29:44 crc kubenswrapper[4766]: I1002 12:29:44.989179 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-ovsdbserver-sb\") pod \"001f22a0-4fb3-4795-b1aa-9ac191cab329\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " Oct 02 12:29:44 crc kubenswrapper[4766]: I1002 12:29:44.989267 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8pqd\" (UniqueName: \"kubernetes.io/projected/001f22a0-4fb3-4795-b1aa-9ac191cab329-kube-api-access-s8pqd\") pod \"001f22a0-4fb3-4795-b1aa-9ac191cab329\" (UID: \"001f22a0-4fb3-4795-b1aa-9ac191cab329\") " Oct 02 12:29:44 crc kubenswrapper[4766]: I1002 12:29:44.996461 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/001f22a0-4fb3-4795-b1aa-9ac191cab329-kube-api-access-s8pqd" (OuterVolumeSpecName: "kube-api-access-s8pqd") 
pod "001f22a0-4fb3-4795-b1aa-9ac191cab329" (UID: "001f22a0-4fb3-4795-b1aa-9ac191cab329"). InnerVolumeSpecName "kube-api-access-s8pqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:29:45 crc kubenswrapper[4766]: I1002 12:29:45.044676 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "001f22a0-4fb3-4795-b1aa-9ac191cab329" (UID: "001f22a0-4fb3-4795-b1aa-9ac191cab329"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:29:45 crc kubenswrapper[4766]: I1002 12:29:45.052469 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-config" (OuterVolumeSpecName: "config") pod "001f22a0-4fb3-4795-b1aa-9ac191cab329" (UID: "001f22a0-4fb3-4795-b1aa-9ac191cab329"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:29:45 crc kubenswrapper[4766]: I1002 12:29:45.054831 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "001f22a0-4fb3-4795-b1aa-9ac191cab329" (UID: "001f22a0-4fb3-4795-b1aa-9ac191cab329"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:29:45 crc kubenswrapper[4766]: I1002 12:29:45.056145 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "001f22a0-4fb3-4795-b1aa-9ac191cab329" (UID: "001f22a0-4fb3-4795-b1aa-9ac191cab329"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:29:45 crc kubenswrapper[4766]: I1002 12:29:45.092468 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8pqd\" (UniqueName: \"kubernetes.io/projected/001f22a0-4fb3-4795-b1aa-9ac191cab329-kube-api-access-s8pqd\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:45 crc kubenswrapper[4766]: I1002 12:29:45.092560 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:45 crc kubenswrapper[4766]: I1002 12:29:45.092580 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:45 crc kubenswrapper[4766]: I1002 12:29:45.092607 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:45 crc kubenswrapper[4766]: I1002 12:29:45.092617 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/001f22a0-4fb3-4795-b1aa-9ac191cab329-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:45 crc kubenswrapper[4766]: I1002 12:29:45.836781 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-874ccdbdf-52m9h" Oct 02 12:29:45 crc kubenswrapper[4766]: I1002 12:29:45.871892 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-874ccdbdf-52m9h"] Oct 02 12:29:45 crc kubenswrapper[4766]: I1002 12:29:45.879877 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-874ccdbdf-52m9h"] Oct 02 12:29:45 crc kubenswrapper[4766]: I1002 12:29:45.895382 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="001f22a0-4fb3-4795-b1aa-9ac191cab329" path="/var/lib/kubelet/pods/001f22a0-4fb3-4795-b1aa-9ac191cab329/volumes" Oct 02 12:29:48 crc kubenswrapper[4766]: I1002 12:29:48.077786 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-f2tmv"] Oct 02 12:29:48 crc kubenswrapper[4766]: E1002 12:29:48.078434 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001f22a0-4fb3-4795-b1aa-9ac191cab329" containerName="init" Oct 02 12:29:48 crc kubenswrapper[4766]: I1002 12:29:48.078456 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="001f22a0-4fb3-4795-b1aa-9ac191cab329" containerName="init" Oct 02 12:29:48 crc kubenswrapper[4766]: E1002 12:29:48.078477 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001f22a0-4fb3-4795-b1aa-9ac191cab329" containerName="dnsmasq-dns" Oct 02 12:29:48 crc kubenswrapper[4766]: I1002 12:29:48.078485 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="001f22a0-4fb3-4795-b1aa-9ac191cab329" containerName="dnsmasq-dns" Oct 02 12:29:48 crc kubenswrapper[4766]: I1002 12:29:48.078799 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="001f22a0-4fb3-4795-b1aa-9ac191cab329" containerName="dnsmasq-dns" Oct 02 12:29:48 crc kubenswrapper[4766]: I1002 12:29:48.079746 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-f2tmv" Oct 02 12:29:48 crc kubenswrapper[4766]: I1002 12:29:48.101776 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-f2tmv"] Oct 02 12:29:48 crc kubenswrapper[4766]: I1002 12:29:48.164643 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnbp5\" (UniqueName: \"kubernetes.io/projected/39c9842b-d4c5-4942-832c-568207d18446-kube-api-access-wnbp5\") pod \"cinder-db-create-f2tmv\" (UID: \"39c9842b-d4c5-4942-832c-568207d18446\") " pod="openstack/cinder-db-create-f2tmv" Oct 02 12:29:48 crc kubenswrapper[4766]: I1002 12:29:48.266847 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnbp5\" (UniqueName: \"kubernetes.io/projected/39c9842b-d4c5-4942-832c-568207d18446-kube-api-access-wnbp5\") pod \"cinder-db-create-f2tmv\" (UID: \"39c9842b-d4c5-4942-832c-568207d18446\") " pod="openstack/cinder-db-create-f2tmv" Oct 02 12:29:48 crc kubenswrapper[4766]: I1002 12:29:48.290209 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnbp5\" (UniqueName: \"kubernetes.io/projected/39c9842b-d4c5-4942-832c-568207d18446-kube-api-access-wnbp5\") pod \"cinder-db-create-f2tmv\" (UID: \"39c9842b-d4c5-4942-832c-568207d18446\") " pod="openstack/cinder-db-create-f2tmv" Oct 02 12:29:48 crc kubenswrapper[4766]: I1002 12:29:48.411526 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-f2tmv" Oct 02 12:29:48 crc kubenswrapper[4766]: I1002 12:29:48.916969 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-f2tmv"] Oct 02 12:29:49 crc kubenswrapper[4766]: I1002 12:29:49.883033 4766 generic.go:334] "Generic (PLEG): container finished" podID="39c9842b-d4c5-4942-832c-568207d18446" containerID="92478fb37e7efbc7d0842cb54c7b2f68a2cad7eacb4afe2ffbe5518aefe7e0a0" exitCode=0 Oct 02 12:29:49 crc kubenswrapper[4766]: I1002 12:29:49.895637 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-f2tmv" event={"ID":"39c9842b-d4c5-4942-832c-568207d18446","Type":"ContainerDied","Data":"92478fb37e7efbc7d0842cb54c7b2f68a2cad7eacb4afe2ffbe5518aefe7e0a0"} Oct 02 12:29:49 crc kubenswrapper[4766]: I1002 12:29:49.895692 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-f2tmv" event={"ID":"39c9842b-d4c5-4942-832c-568207d18446","Type":"ContainerStarted","Data":"9daac95c8f3837a944f14e698586ba3de41fc42f37d426938f0091e01ddd0c47"} Oct 02 12:29:51 crc kubenswrapper[4766]: I1002 12:29:51.314994 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-f2tmv" Oct 02 12:29:51 crc kubenswrapper[4766]: I1002 12:29:51.441122 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnbp5\" (UniqueName: \"kubernetes.io/projected/39c9842b-d4c5-4942-832c-568207d18446-kube-api-access-wnbp5\") pod \"39c9842b-d4c5-4942-832c-568207d18446\" (UID: \"39c9842b-d4c5-4942-832c-568207d18446\") " Oct 02 12:29:51 crc kubenswrapper[4766]: I1002 12:29:51.451490 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39c9842b-d4c5-4942-832c-568207d18446-kube-api-access-wnbp5" (OuterVolumeSpecName: "kube-api-access-wnbp5") pod "39c9842b-d4c5-4942-832c-568207d18446" (UID: "39c9842b-d4c5-4942-832c-568207d18446"). InnerVolumeSpecName "kube-api-access-wnbp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:29:51 crc kubenswrapper[4766]: I1002 12:29:51.545200 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnbp5\" (UniqueName: \"kubernetes.io/projected/39c9842b-d4c5-4942-832c-568207d18446-kube-api-access-wnbp5\") on node \"crc\" DevicePath \"\"" Oct 02 12:29:51 crc kubenswrapper[4766]: I1002 12:29:51.907153 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-f2tmv" event={"ID":"39c9842b-d4c5-4942-832c-568207d18446","Type":"ContainerDied","Data":"9daac95c8f3837a944f14e698586ba3de41fc42f37d426938f0091e01ddd0c47"} Oct 02 12:29:51 crc kubenswrapper[4766]: I1002 12:29:51.907209 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9daac95c8f3837a944f14e698586ba3de41fc42f37d426938f0091e01ddd0c47" Oct 02 12:29:51 crc kubenswrapper[4766]: I1002 12:29:51.907278 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-f2tmv" Oct 02 12:29:53 crc kubenswrapper[4766]: I1002 12:29:53.882081 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:29:53 crc kubenswrapper[4766]: E1002 12:29:53.882782 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:29:58 crc kubenswrapper[4766]: I1002 12:29:58.175672 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-bc2d-account-create-9j459"] Oct 02 12:29:58 crc kubenswrapper[4766]: E1002 12:29:58.176757 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39c9842b-d4c5-4942-832c-568207d18446" containerName="mariadb-database-create" Oct 02 12:29:58 crc kubenswrapper[4766]: I1002 12:29:58.176774 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="39c9842b-d4c5-4942-832c-568207d18446" containerName="mariadb-database-create" Oct 02 12:29:58 crc kubenswrapper[4766]: I1002 12:29:58.176990 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="39c9842b-d4c5-4942-832c-568207d18446" containerName="mariadb-database-create" Oct 02 12:29:58 crc kubenswrapper[4766]: I1002 12:29:58.177757 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bc2d-account-create-9j459" Oct 02 12:29:58 crc kubenswrapper[4766]: I1002 12:29:58.179848 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 02 12:29:58 crc kubenswrapper[4766]: I1002 12:29:58.198936 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-bc2d-account-create-9j459"] Oct 02 12:29:58 crc kubenswrapper[4766]: I1002 12:29:58.277973 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg6mz\" (UniqueName: \"kubernetes.io/projected/2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01-kube-api-access-fg6mz\") pod \"cinder-bc2d-account-create-9j459\" (UID: \"2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01\") " pod="openstack/cinder-bc2d-account-create-9j459" Oct 02 12:29:58 crc kubenswrapper[4766]: I1002 12:29:58.379643 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg6mz\" (UniqueName: \"kubernetes.io/projected/2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01-kube-api-access-fg6mz\") pod \"cinder-bc2d-account-create-9j459\" (UID: \"2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01\") " pod="openstack/cinder-bc2d-account-create-9j459" Oct 02 12:29:58 crc kubenswrapper[4766]: I1002 12:29:58.403047 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg6mz\" (UniqueName: \"kubernetes.io/projected/2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01-kube-api-access-fg6mz\") pod \"cinder-bc2d-account-create-9j459\" (UID: \"2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01\") " pod="openstack/cinder-bc2d-account-create-9j459" Oct 02 12:29:58 crc kubenswrapper[4766]: I1002 12:29:58.499563 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-bc2d-account-create-9j459" Oct 02 12:29:58 crc kubenswrapper[4766]: I1002 12:29:58.992296 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-bc2d-account-create-9j459"] Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.005752 4766 generic.go:334] "Generic (PLEG): container finished" podID="2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01" containerID="c3cb913deafbca13578a59bbc38999d89f70caf971421486233a1d745ea80365" exitCode=0 Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.005896 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bc2d-account-create-9j459" event={"ID":"2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01","Type":"ContainerDied","Data":"c3cb913deafbca13578a59bbc38999d89f70caf971421486233a1d745ea80365"} Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.006458 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bc2d-account-create-9j459" event={"ID":"2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01","Type":"ContainerStarted","Data":"3c1f18b3af0aaabf225ac53c3850266b4d7966e6c9031a3cbff373e0acc4789f"} Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.154173 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz"] Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.156473 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz" Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.159812 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.160082 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.171837 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz"] Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.219348 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1948a4f1-8655-488c-8519-4e5e05806567-secret-volume\") pod \"collect-profiles-29323470-7vnzz\" (UID: \"1948a4f1-8655-488c-8519-4e5e05806567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz" Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.219864 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkcpq\" (UniqueName: \"kubernetes.io/projected/1948a4f1-8655-488c-8519-4e5e05806567-kube-api-access-wkcpq\") pod \"collect-profiles-29323470-7vnzz\" (UID: \"1948a4f1-8655-488c-8519-4e5e05806567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz" Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.219929 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1948a4f1-8655-488c-8519-4e5e05806567-config-volume\") pod \"collect-profiles-29323470-7vnzz\" (UID: \"1948a4f1-8655-488c-8519-4e5e05806567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz" Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.322465 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1948a4f1-8655-488c-8519-4e5e05806567-secret-volume\") pod \"collect-profiles-29323470-7vnzz\" (UID: \"1948a4f1-8655-488c-8519-4e5e05806567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz" Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.322957 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkcpq\" (UniqueName: \"kubernetes.io/projected/1948a4f1-8655-488c-8519-4e5e05806567-kube-api-access-wkcpq\") pod \"collect-profiles-29323470-7vnzz\" (UID: \"1948a4f1-8655-488c-8519-4e5e05806567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz" Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.323004 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1948a4f1-8655-488c-8519-4e5e05806567-config-volume\") pod \"collect-profiles-29323470-7vnzz\" (UID: \"1948a4f1-8655-488c-8519-4e5e05806567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz" Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.324897 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1948a4f1-8655-488c-8519-4e5e05806567-config-volume\") pod \"collect-profiles-29323470-7vnzz\" (UID: \"1948a4f1-8655-488c-8519-4e5e05806567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz" Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.335393 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1948a4f1-8655-488c-8519-4e5e05806567-secret-volume\") pod \"collect-profiles-29323470-7vnzz\" (UID: \"1948a4f1-8655-488c-8519-4e5e05806567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz" Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.344210 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkcpq\" (UniqueName: \"kubernetes.io/projected/1948a4f1-8655-488c-8519-4e5e05806567-kube-api-access-wkcpq\") pod \"collect-profiles-29323470-7vnzz\" (UID: \"1948a4f1-8655-488c-8519-4e5e05806567\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz" Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.487917 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz" Oct 02 12:30:00 crc kubenswrapper[4766]: I1002 12:30:00.946047 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz"] Oct 02 12:30:00 crc kubenswrapper[4766]: W1002 12:30:00.949841 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1948a4f1_8655_488c_8519_4e5e05806567.slice/crio-cbd1936edbd73f87c93aad6d131fdda3d6cd4a5ba658a48edd31a0638ab5410f WatchSource:0}: Error finding container cbd1936edbd73f87c93aad6d131fdda3d6cd4a5ba658a48edd31a0638ab5410f: Status 404 returned error can't find the container with id cbd1936edbd73f87c93aad6d131fdda3d6cd4a5ba658a48edd31a0638ab5410f Oct 02 12:30:01 crc kubenswrapper[4766]: I1002 12:30:01.034155 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz" event={"ID":"1948a4f1-8655-488c-8519-4e5e05806567","Type":"ContainerStarted","Data":"cbd1936edbd73f87c93aad6d131fdda3d6cd4a5ba658a48edd31a0638ab5410f"} Oct 02 12:30:01 crc kubenswrapper[4766]: I1002 12:30:01.331748 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bc2d-account-create-9j459" Oct 02 12:30:01 crc kubenswrapper[4766]: I1002 12:30:01.446618 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fg6mz\" (UniqueName: \"kubernetes.io/projected/2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01-kube-api-access-fg6mz\") pod \"2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01\" (UID: \"2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01\") " Oct 02 12:30:01 crc kubenswrapper[4766]: I1002 12:30:01.454682 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01-kube-api-access-fg6mz" (OuterVolumeSpecName: "kube-api-access-fg6mz") pod "2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01" (UID: "2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01"). InnerVolumeSpecName "kube-api-access-fg6mz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:01 crc kubenswrapper[4766]: I1002 12:30:01.550180 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fg6mz\" (UniqueName: \"kubernetes.io/projected/2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01-kube-api-access-fg6mz\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:02 crc kubenswrapper[4766]: I1002 12:30:02.049089 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-bc2d-account-create-9j459" Oct 02 12:30:02 crc kubenswrapper[4766]: I1002 12:30:02.049075 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bc2d-account-create-9j459" event={"ID":"2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01","Type":"ContainerDied","Data":"3c1f18b3af0aaabf225ac53c3850266b4d7966e6c9031a3cbff373e0acc4789f"} Oct 02 12:30:02 crc kubenswrapper[4766]: I1002 12:30:02.049222 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c1f18b3af0aaabf225ac53c3850266b4d7966e6c9031a3cbff373e0acc4789f" Oct 02 12:30:02 crc kubenswrapper[4766]: I1002 12:30:02.053309 4766 generic.go:334] "Generic (PLEG): container finished" podID="1948a4f1-8655-488c-8519-4e5e05806567" containerID="206d025528e4fbb40e49dd73b157c1bf2fdc662d99e87adc30308eb2fc0b237b" exitCode=0 Oct 02 12:30:02 crc kubenswrapper[4766]: I1002 12:30:02.053374 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz" event={"ID":"1948a4f1-8655-488c-8519-4e5e05806567","Type":"ContainerDied","Data":"206d025528e4fbb40e49dd73b157c1bf2fdc662d99e87adc30308eb2fc0b237b"} Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.426457 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-84v6j"] Oct 02 12:30:03 crc kubenswrapper[4766]: E1002 12:30:03.427494 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01" containerName="mariadb-account-create" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.428010 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01" containerName="mariadb-account-create" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.428225 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01" containerName="mariadb-account-create" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.428919 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.433363 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.433573 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.433772 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xpzgz" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.439056 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-84v6j"] Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.464006 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.596369 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkcpq\" (UniqueName: \"kubernetes.io/projected/1948a4f1-8655-488c-8519-4e5e05806567-kube-api-access-wkcpq\") pod \"1948a4f1-8655-488c-8519-4e5e05806567\" (UID: \"1948a4f1-8655-488c-8519-4e5e05806567\") " Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.596932 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1948a4f1-8655-488c-8519-4e5e05806567-secret-volume\") pod \"1948a4f1-8655-488c-8519-4e5e05806567\" (UID: \"1948a4f1-8655-488c-8519-4e5e05806567\") " Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.597010 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1948a4f1-8655-488c-8519-4e5e05806567-config-volume\") pod \"1948a4f1-8655-488c-8519-4e5e05806567\" (UID: \"1948a4f1-8655-488c-8519-4e5e05806567\") " Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.598069 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1948a4f1-8655-488c-8519-4e5e05806567-config-volume" (OuterVolumeSpecName: "config-volume") pod "1948a4f1-8655-488c-8519-4e5e05806567" (UID: "1948a4f1-8655-488c-8519-4e5e05806567"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.598468 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38f0193d-efaf-4515-9101-767548e3e007-etc-machine-id\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.598561 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-config-data\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.598664 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-db-sync-config-data\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.598739 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-combined-ca-bundle\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.598824 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-scripts\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" 
Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.598874 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j76sd\" (UniqueName: \"kubernetes.io/projected/38f0193d-efaf-4515-9101-767548e3e007-kube-api-access-j76sd\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.598961 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1948a4f1-8655-488c-8519-4e5e05806567-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.611953 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1948a4f1-8655-488c-8519-4e5e05806567-kube-api-access-wkcpq" (OuterVolumeSpecName: "kube-api-access-wkcpq") pod "1948a4f1-8655-488c-8519-4e5e05806567" (UID: "1948a4f1-8655-488c-8519-4e5e05806567"). InnerVolumeSpecName "kube-api-access-wkcpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.615181 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1948a4f1-8655-488c-8519-4e5e05806567-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1948a4f1-8655-488c-8519-4e5e05806567" (UID: "1948a4f1-8655-488c-8519-4e5e05806567"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.701491 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-db-sync-config-data\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.701600 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-combined-ca-bundle\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.701633 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-scripts\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.701672 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j76sd\" (UniqueName: \"kubernetes.io/projected/38f0193d-efaf-4515-9101-767548e3e007-kube-api-access-j76sd\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.702613 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38f0193d-efaf-4515-9101-767548e3e007-etc-machine-id\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.702662 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-config-data\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.702686 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38f0193d-efaf-4515-9101-767548e3e007-etc-machine-id\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.702779 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1948a4f1-8655-488c-8519-4e5e05806567-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.702803 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkcpq\" (UniqueName: \"kubernetes.io/projected/1948a4f1-8655-488c-8519-4e5e05806567-kube-api-access-wkcpq\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.706616 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-scripts\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.706857 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-combined-ca-bundle\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.706864 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-db-sync-config-data\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.710181 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-config-data\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.723353 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j76sd\" (UniqueName: \"kubernetes.io/projected/38f0193d-efaf-4515-9101-767548e3e007-kube-api-access-j76sd\") pod \"cinder-db-sync-84v6j\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:03 crc kubenswrapper[4766]: I1002 12:30:03.780694 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:04 crc kubenswrapper[4766]: I1002 12:30:04.080668 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz" event={"ID":"1948a4f1-8655-488c-8519-4e5e05806567","Type":"ContainerDied","Data":"cbd1936edbd73f87c93aad6d131fdda3d6cd4a5ba658a48edd31a0638ab5410f"} Oct 02 12:30:04 crc kubenswrapper[4766]: I1002 12:30:04.081095 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbd1936edbd73f87c93aad6d131fdda3d6cd4a5ba658a48edd31a0638ab5410f" Oct 02 12:30:04 crc kubenswrapper[4766]: I1002 12:30:04.080772 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz" Oct 02 12:30:04 crc kubenswrapper[4766]: I1002 12:30:04.331552 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-84v6j"] Oct 02 12:30:04 crc kubenswrapper[4766]: W1002 12:30:04.346400 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38f0193d_efaf_4515_9101_767548e3e007.slice/crio-c6b0269618a82274c45d63b19dd3b49041f0b6b92a424e18650b717a0d9b82d4 WatchSource:0}: Error finding container c6b0269618a82274c45d63b19dd3b49041f0b6b92a424e18650b717a0d9b82d4: Status 404 returned error can't find the container with id c6b0269618a82274c45d63b19dd3b49041f0b6b92a424e18650b717a0d9b82d4 Oct 02 12:30:04 crc kubenswrapper[4766]: I1002 12:30:04.551684 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp"] Oct 02 12:30:04 crc kubenswrapper[4766]: I1002 12:30:04.564575 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323425-z7vrp"] Oct 02 12:30:05 crc kubenswrapper[4766]: I1002 12:30:05.092811 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-84v6j" event={"ID":"38f0193d-efaf-4515-9101-767548e3e007","Type":"ContainerStarted","Data":"4d87032f43fab594831fd93d70bcbae93900e68d70d9a41f5e2eeb3549e3ccc2"} Oct 02 12:30:05 crc kubenswrapper[4766]: I1002 12:30:05.093179 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-84v6j" event={"ID":"38f0193d-efaf-4515-9101-767548e3e007","Type":"ContainerStarted","Data":"c6b0269618a82274c45d63b19dd3b49041f0b6b92a424e18650b717a0d9b82d4"} Oct 02 12:30:05 crc kubenswrapper[4766]: I1002 12:30:05.122909 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-84v6j" podStartSLOduration=2.1228845 podStartE2EDuration="2.1228845s" podCreationTimestamp="2025-10-02 12:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:30:05.113110118 +0000 UTC m=+5920.055981062" watchObservedRunningTime="2025-10-02 12:30:05.1228845 +0000 UTC m=+5920.065755444" Oct 02 12:30:05 crc kubenswrapper[4766]: I1002 12:30:05.887350 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:30:05 crc kubenswrapper[4766]: E1002 12:30:05.887723 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Oct 02 12:30:05 crc kubenswrapper[4766]: E1002 12:30:05.887723 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:30:05 crc kubenswrapper[4766]: I1002 12:30:05.897221 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dd6eb36-2135-475c-9b70-610546403d3c" path="/var/lib/kubelet/pods/1dd6eb36-2135-475c-9b70-610546403d3c/volumes" Oct 02 12:30:08 crc kubenswrapper[4766]: I1002 12:30:08.124193 4766 generic.go:334] "Generic (PLEG): container finished" podID="38f0193d-efaf-4515-9101-767548e3e007" containerID="4d87032f43fab594831fd93d70bcbae93900e68d70d9a41f5e2eeb3549e3ccc2" exitCode=0 Oct 02 12:30:08 crc kubenswrapper[4766]: I1002 12:30:08.124326 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-84v6j" event={"ID":"38f0193d-efaf-4515-9101-767548e3e007","Type":"ContainerDied","Data":"4d87032f43fab594831fd93d70bcbae93900e68d70d9a41f5e2eeb3549e3ccc2"} Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.498624 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.542140 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j76sd\" (UniqueName: \"kubernetes.io/projected/38f0193d-efaf-4515-9101-767548e3e007-kube-api-access-j76sd\") pod \"38f0193d-efaf-4515-9101-767548e3e007\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.542235 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38f0193d-efaf-4515-9101-767548e3e007-etc-machine-id\") pod \"38f0193d-efaf-4515-9101-767548e3e007\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.542273 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-config-data\") pod \"38f0193d-efaf-4515-9101-767548e3e007\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.542365 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/38f0193d-efaf-4515-9101-767548e3e007-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "38f0193d-efaf-4515-9101-767548e3e007" (UID: "38f0193d-efaf-4515-9101-767548e3e007"). InnerVolumeSpecName "etc-machine-id".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.542392 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-scripts\") pod \"38f0193d-efaf-4515-9101-767548e3e007\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.542656 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-db-sync-config-data\") pod \"38f0193d-efaf-4515-9101-767548e3e007\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.542768 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-combined-ca-bundle\") pod \"38f0193d-efaf-4515-9101-767548e3e007\" (UID: \"38f0193d-efaf-4515-9101-767548e3e007\") " Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.543951 4766 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/38f0193d-efaf-4515-9101-767548e3e007-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.548440 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-scripts" (OuterVolumeSpecName: "scripts") pod "38f0193d-efaf-4515-9101-767548e3e007" (UID: "38f0193d-efaf-4515-9101-767548e3e007"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.549400 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38f0193d-efaf-4515-9101-767548e3e007-kube-api-access-j76sd" (OuterVolumeSpecName: "kube-api-access-j76sd") pod "38f0193d-efaf-4515-9101-767548e3e007" (UID: "38f0193d-efaf-4515-9101-767548e3e007"). InnerVolumeSpecName "kube-api-access-j76sd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.559231 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "38f0193d-efaf-4515-9101-767548e3e007" (UID: "38f0193d-efaf-4515-9101-767548e3e007"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.577416 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38f0193d-efaf-4515-9101-767548e3e007" (UID: "38f0193d-efaf-4515-9101-767548e3e007"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.593891 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-config-data" (OuterVolumeSpecName: "config-data") pod "38f0193d-efaf-4515-9101-767548e3e007" (UID: "38f0193d-efaf-4515-9101-767548e3e007"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.646406 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.646521 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j76sd\" (UniqueName: \"kubernetes.io/projected/38f0193d-efaf-4515-9101-767548e3e007-kube-api-access-j76sd\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.646537 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.646545 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:09 crc kubenswrapper[4766]: I1002 12:30:09.646559 4766 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/38f0193d-efaf-4515-9101-767548e3e007-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.152084 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-84v6j" event={"ID":"38f0193d-efaf-4515-9101-767548e3e007","Type":"ContainerDied","Data":"c6b0269618a82274c45d63b19dd3b49041f0b6b92a424e18650b717a0d9b82d4"} Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.152135 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-84v6j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.152163 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6b0269618a82274c45d63b19dd3b49041f0b6b92a424e18650b717a0d9b82d4" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.472180 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c7d7b9bc7-27r4j"] Oct 02 12:30:10 crc kubenswrapper[4766]: E1002 12:30:10.473671 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38f0193d-efaf-4515-9101-767548e3e007" containerName="cinder-db-sync" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.473698 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="38f0193d-efaf-4515-9101-767548e3e007" containerName="cinder-db-sync" Oct 02 12:30:10 crc kubenswrapper[4766]: E1002 12:30:10.473735 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1948a4f1-8655-488c-8519-4e5e05806567" containerName="collect-profiles" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.473743 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1948a4f1-8655-488c-8519-4e5e05806567" containerName="collect-profiles" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.473980 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1948a4f1-8655-488c-8519-4e5e05806567" containerName="collect-profiles" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.474014 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="38f0193d-efaf-4515-9101-767548e3e007" containerName="cinder-db-sync" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.475770 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.501451 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c7d7b9bc7-27r4j"] Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.585143 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-ovsdbserver-nb\") pod \"dnsmasq-dns-6c7d7b9bc7-27r4j\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.585226 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-ovsdbserver-sb\") pod \"dnsmasq-dns-6c7d7b9bc7-27r4j\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.585289 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt4wx\" (UniqueName: \"kubernetes.io/projected/1ef60cf9-8ee9-452b-b8c5-87e84783901d-kube-api-access-tt4wx\") pod \"dnsmasq-dns-6c7d7b9bc7-27r4j\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.585342 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-dns-svc\") pod \"dnsmasq-dns-6c7d7b9bc7-27r4j\" (UID: 
\"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.585421 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-config\") pod \"dnsmasq-dns-6c7d7b9bc7-27r4j\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.678199 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.680229 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.684305 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.684600 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.686007 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xpzgz" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.687472 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-ovsdbserver-nb\") pod \"dnsmasq-dns-6c7d7b9bc7-27r4j\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.687519 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-ovsdbserver-sb\") pod \"dnsmasq-dns-6c7d7b9bc7-27r4j\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.687576 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt4wx\" (UniqueName: \"kubernetes.io/projected/1ef60cf9-8ee9-452b-b8c5-87e84783901d-kube-api-access-tt4wx\") pod \"dnsmasq-dns-6c7d7b9bc7-27r4j\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.687611 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-dns-svc\") pod \"dnsmasq-dns-6c7d7b9bc7-27r4j\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.687676 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-config\") pod \"dnsmasq-dns-6c7d7b9bc7-27r4j\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.689889 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-ovsdbserver-sb\") pod \"dnsmasq-dns-6c7d7b9bc7-27r4j\" (UID: 
\"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.690655 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.691282 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-config\") pod \"dnsmasq-dns-6c7d7b9bc7-27r4j\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.691490 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-dns-svc\") pod \"dnsmasq-dns-6c7d7b9bc7-27r4j\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.691947 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-ovsdbserver-nb\") pod \"dnsmasq-dns-6c7d7b9bc7-27r4j\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.705406 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.717964 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt4wx\" (UniqueName: \"kubernetes.io/projected/1ef60cf9-8ee9-452b-b8c5-87e84783901d-kube-api-access-tt4wx\") pod \"dnsmasq-dns-6c7d7b9bc7-27r4j\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.792857 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.793014 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdq59\" (UniqueName: \"kubernetes.io/projected/06194cbc-5fd7-4288-9046-51a2575fd81b-kube-api-access-hdq59\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.793049 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-scripts\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.793119 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06194cbc-5fd7-4288-9046-51a2575fd81b-logs\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.793224 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06194cbc-5fd7-4288-9046-51a2575fd81b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.793256 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-config-data-custom\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.793332 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-config-data\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.820012 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.896018 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-config-data-custom\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.896106 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-config-data\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.896141 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.896176 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdq59\" (UniqueName: \"kubernetes.io/projected/06194cbc-5fd7-4288-9046-51a2575fd81b-kube-api-access-hdq59\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.896201 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-scripts\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.896248 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06194cbc-5fd7-4288-9046-51a2575fd81b-logs\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.896309 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06194cbc-5fd7-4288-9046-51a2575fd81b-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.896389 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06194cbc-5fd7-4288-9046-51a2575fd81b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.897089 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06194cbc-5fd7-4288-9046-51a2575fd81b-logs\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.901184 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.903424 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-scripts\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.906722 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-config-data\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.918166 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-config-data-custom\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:10 crc kubenswrapper[4766]: I1002 12:30:10.921011 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdq59\" (UniqueName: \"kubernetes.io/projected/06194cbc-5fd7-4288-9046-51a2575fd81b-kube-api-access-hdq59\") pod \"cinder-api-0\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " pod="openstack/cinder-api-0" Oct 02 12:30:11 crc kubenswrapper[4766]: I1002 12:30:11.006164 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 12:30:11 crc kubenswrapper[4766]: I1002 12:30:11.176154 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c7d7b9bc7-27r4j"] Oct 02 12:30:11 crc kubenswrapper[4766]: I1002 12:30:11.635143 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 12:30:11 crc kubenswrapper[4766]: W1002 12:30:11.677741 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06194cbc_5fd7_4288_9046_51a2575fd81b.slice/crio-4acf3708b897d359ef9a0f2486c1aa20f3ee1f41c2bdbf39a28c1c5954ae3570 WatchSource:0}: Error finding container 4acf3708b897d359ef9a0f2486c1aa20f3ee1f41c2bdbf39a28c1c5954ae3570: Status 404 returned error can't find the container with id 4acf3708b897d359ef9a0f2486c1aa20f3ee1f41c2bdbf39a28c1c5954ae3570 Oct 02 12:30:12 crc kubenswrapper[4766]: I1002 12:30:12.198222 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"06194cbc-5fd7-4288-9046-51a2575fd81b","Type":"ContainerStarted","Data":"4acf3708b897d359ef9a0f2486c1aa20f3ee1f41c2bdbf39a28c1c5954ae3570"} Oct 02 12:30:12 crc kubenswrapper[4766]: I1002 12:30:12.209995 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ef60cf9-8ee9-452b-b8c5-87e84783901d" containerID="06520c87df7275dbd01fedb073480d0eb6a019a46f0914474f56e655dc00f70a" exitCode=0 Oct 02 12:30:12 crc kubenswrapper[4766]: I1002 12:30:12.210055 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" event={"ID":"1ef60cf9-8ee9-452b-b8c5-87e84783901d","Type":"ContainerDied","Data":"06520c87df7275dbd01fedb073480d0eb6a019a46f0914474f56e655dc00f70a"} Oct 02 12:30:12 crc kubenswrapper[4766]: I1002 12:30:12.210090 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" event={"ID":"1ef60cf9-8ee9-452b-b8c5-87e84783901d","Type":"ContainerStarted","Data":"1c8bfb820ab13a6aab658b3be51117c85b9c91bf250be37b3380325dee17cbde"} Oct 02 12:30:13 crc kubenswrapper[4766]: I1002 12:30:13.229352 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" event={"ID":"1ef60cf9-8ee9-452b-b8c5-87e84783901d","Type":"ContainerStarted","Data":"cca9cf9e180da68992d03a262ee7adbd5fe6bdfda996d2fa195b8546edb0f76d"} Oct 02 12:30:13 crc kubenswrapper[4766]: I1002 12:30:13.229892 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:13 crc kubenswrapper[4766]: I1002 12:30:13.233911 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"06194cbc-5fd7-4288-9046-51a2575fd81b","Type":"ContainerStarted","Data":"2e10ca896edeef17259192e71db5885fda6fc3221e0fb147f51d08380d10555a"} Oct 02 12:30:13 crc kubenswrapper[4766]: I1002 12:30:13.249911 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" podStartSLOduration=3.249867291 podStartE2EDuration="3.249867291s" podCreationTimestamp="2025-10-02 12:30:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:30:13.247740252 +0000 UTC m=+5928.190611216" watchObservedRunningTime="2025-10-02 12:30:13.249867291 +0000 UTC m=+5928.192738235" Oct 02 12:30:14 crc kubenswrapper[4766]: I1002 12:30:14.251420 4766 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/cinder-api-0" event={"ID":"06194cbc-5fd7-4288-9046-51a2575fd81b","Type":"ContainerStarted","Data":"f0f308e1eaa8c833c089618b3aa2d7d12798fa4050da5b98e9cb97dcdbcad6ba"} Oct 02 12:30:14 crc kubenswrapper[4766]: I1002 12:30:14.281000 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.280973903 podStartE2EDuration="4.280973903s" podCreationTimestamp="2025-10-02 12:30:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:30:14.271432288 +0000 UTC m=+5929.214303252" watchObservedRunningTime="2025-10-02 12:30:14.280973903 +0000 UTC m=+5929.223844847" Oct 02 12:30:15 crc kubenswrapper[4766]: I1002 12:30:15.262368 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 02 12:30:19 crc kubenswrapper[4766]: I1002 12:30:19.881555 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:30:19 crc kubenswrapper[4766]: E1002 12:30:19.882077 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:30:20 crc kubenswrapper[4766]: I1002 12:30:20.821796 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:30:20 crc kubenswrapper[4766]: I1002 12:30:20.921374 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5db468fcd9-hx5kk"] Oct 02 12:30:20 crc kubenswrapper[4766]: I1002 12:30:20.922313 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" podUID="2c81e7a7-498f-47d8-9a53-6e25bbbcb19d" containerName="dnsmasq-dns" containerID="cri-o://da44561f601cd1c3bff4ac35561fd2fb1764e34645da4a7eb2fa036e31f05826" gracePeriod=10 Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.391734 4766 generic.go:334] "Generic (PLEG): container finished" podID="2c81e7a7-498f-47d8-9a53-6e25bbbcb19d" containerID="da44561f601cd1c3bff4ac35561fd2fb1764e34645da4a7eb2fa036e31f05826" exitCode=0 Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.391922 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" event={"ID":"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d","Type":"ContainerDied","Data":"da44561f601cd1c3bff4ac35561fd2fb1764e34645da4a7eb2fa036e31f05826"} Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.556703 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.666579 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-ovsdbserver-sb\") pod \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.666687 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-dns-svc\") pod \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.666828 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-config\") pod \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.666886 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chbvm\" (UniqueName: \"kubernetes.io/projected/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-kube-api-access-chbvm\") pod \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.666973 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-ovsdbserver-nb\") pod \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\" (UID: \"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d\") " Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.709701 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-kube-api-access-chbvm" (OuterVolumeSpecName: "kube-api-access-chbvm") pod "2c81e7a7-498f-47d8-9a53-6e25bbbcb19d" (UID: "2c81e7a7-498f-47d8-9a53-6e25bbbcb19d"). InnerVolumeSpecName "kube-api-access-chbvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.740687 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2c81e7a7-498f-47d8-9a53-6e25bbbcb19d" (UID: "2c81e7a7-498f-47d8-9a53-6e25bbbcb19d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.769537 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.769593 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chbvm\" (UniqueName: \"kubernetes.io/projected/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-kube-api-access-chbvm\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.773664 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2c81e7a7-498f-47d8-9a53-6e25bbbcb19d" (UID: "2c81e7a7-498f-47d8-9a53-6e25bbbcb19d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.797577 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c81e7a7-498f-47d8-9a53-6e25bbbcb19d" (UID: "2c81e7a7-498f-47d8-9a53-6e25bbbcb19d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.808077 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-config" (OuterVolumeSpecName: "config") pod "2c81e7a7-498f-47d8-9a53-6e25bbbcb19d" (UID: "2c81e7a7-498f-47d8-9a53-6e25bbbcb19d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.870992 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.871032 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:21 crc kubenswrapper[4766]: I1002 12:30:21.871045 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:22 crc kubenswrapper[4766]: I1002 12:30:22.404224 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" event={"ID":"2c81e7a7-498f-47d8-9a53-6e25bbbcb19d","Type":"ContainerDied","Data":"37e3953bab6b7be17079dde1983a385ae266d2fb854dfc2480a635a12a03c4dd"} Oct 02 12:30:22 crc kubenswrapper[4766]: I1002 12:30:22.404304 4766 scope.go:117] "RemoveContainer" containerID="da44561f601cd1c3bff4ac35561fd2fb1764e34645da4a7eb2fa036e31f05826" Oct 02 12:30:22 crc kubenswrapper[4766]: I1002 12:30:22.405453 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5db468fcd9-hx5kk" Oct 02 12:30:22 crc kubenswrapper[4766]: I1002 12:30:22.432013 4766 scope.go:117] "RemoveContainer" containerID="8bce77a94b6ebd8897ddfa6d6d3fee3d3c1df00330b09945ee8dd896c6b85ab7" Oct 02 12:30:22 crc kubenswrapper[4766]: I1002 12:30:22.438463 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5db468fcd9-hx5kk"] Oct 02 12:30:22 crc kubenswrapper[4766]: I1002 12:30:22.449988 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5db468fcd9-hx5kk"] Oct 02 12:30:23 crc kubenswrapper[4766]: I1002 12:30:23.031072 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 02 12:30:23 crc kubenswrapper[4766]: I1002 12:30:23.351910 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:30:23 crc kubenswrapper[4766]: I1002 12:30:23.352211 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fa2f5e53-8f06-4248-a150-6a98286af063" containerName="nova-scheduler-scheduler" containerID="cri-o://a247b068558cdfd3750ed66585d2668ad6a4593006df480bb2d08dd9f7ae1aae" gracePeriod=30 Oct 02 12:30:23 crc kubenswrapper[4766]: I1002 12:30:23.363085 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:30:23 crc kubenswrapper[4766]: I1002 12:30:23.363382 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" containerName="nova-metadata-log" containerID="cri-o://96afa1d405dd9285accc70f49b7ece06af972846d482100dfcf7868f75129cfc" gracePeriod=30 Oct 02 12:30:23 crc kubenswrapper[4766]: I1002 12:30:23.363454 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" containerName="nova-metadata-metadata" containerID="cri-o://8cdaa629e8ce2f86c0bda824a55d1668e40e7731b60f0274cf01b989dd2acbf8" gracePeriod=30 Oct 02 12:30:23 crc kubenswrapper[4766]: I1002 12:30:23.383048 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 02 12:30:23 crc kubenswrapper[4766]: I1002 12:30:23.383402 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="ef91c640-4b08-416f-9ff0-d67fca7d5f22" containerName="nova-cell0-conductor-conductor" containerID="cri-o://f873a5f75544bc64020277313c7bd6a384b31cff1da482b92baa87b79a343ced" gracePeriod=30 Oct 02 12:30:23 crc kubenswrapper[4766]: I1002 12:30:23.391889 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:30:23 crc kubenswrapper[4766]: I1002 12:30:23.392142 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1a0fea6f-e88f-4bd5-935a-b55869a434d0" containerName="nova-api-log" containerID="cri-o://e5db32e650fc8416620a5b9ac5433257777a38aef44ecfa4d19022695c9bc169" gracePeriod=30 Oct 02 12:30:23 crc kubenswrapper[4766]: I1002 12:30:23.392637 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1a0fea6f-e88f-4bd5-935a-b55869a434d0" containerName="nova-api-api" containerID="cri-o://052f6c1890c8d89aa97184536b47f034997d104512e606207e1b720b6be62a39" gracePeriod=30 Oct 02 12:30:23 crc kubenswrapper[4766]: I1002 12:30:23.445752 4766 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 12:30:23 crc kubenswrapper[4766]: I1002 12:30:23.445967 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="48da6d4f-236b-49df-a509-388267f8db55" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://830a8a311242ba410b0f069ddea9cea453171d8b10b81489df17647eefa821e9" gracePeriod=30 Oct 02 12:30:23 crc kubenswrapper[4766]: I1002 12:30:23.896048 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c81e7a7-498f-47d8-9a53-6e25bbbcb19d" path="/var/lib/kubelet/pods/2c81e7a7-498f-47d8-9a53-6e25bbbcb19d/volumes" Oct 02 12:30:24 crc kubenswrapper[4766]: E1002 12:30:24.103946 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f873a5f75544bc64020277313c7bd6a384b31cff1da482b92baa87b79a343ced" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 12:30:24 crc kubenswrapper[4766]: E1002 12:30:24.106435 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f873a5f75544bc64020277313c7bd6a384b31cff1da482b92baa87b79a343ced" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 12:30:24 crc kubenswrapper[4766]: E1002 12:30:24.108261 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f873a5f75544bc64020277313c7bd6a384b31cff1da482b92baa87b79a343ced" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Oct 02 12:30:24 crc kubenswrapper[4766]: E1002 12:30:24.108311 4766 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="ef91c640-4b08-416f-9ff0-d67fca7d5f22" containerName="nova-cell0-conductor-conductor" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.415195 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.453768 4766 generic.go:334] "Generic (PLEG): container finished" podID="48da6d4f-236b-49df-a509-388267f8db55" containerID="830a8a311242ba410b0f069ddea9cea453171d8b10b81489df17647eefa821e9" exitCode=0 Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.453851 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"48da6d4f-236b-49df-a509-388267f8db55","Type":"ContainerDied","Data":"830a8a311242ba410b0f069ddea9cea453171d8b10b81489df17647eefa821e9"} Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.453878 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"48da6d4f-236b-49df-a509-388267f8db55","Type":"ContainerDied","Data":"69f243bdab651b442736572a1dcddc1e42df45f793eb4fc2393ada6f2d92c548"} Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.453895 4766 scope.go:117] "RemoveContainer" containerID="830a8a311242ba410b0f069ddea9cea453171d8b10b81489df17647eefa821e9" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.454005 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.471274 4766 generic.go:334] "Generic (PLEG): container finished" podID="1a0fea6f-e88f-4bd5-935a-b55869a434d0" containerID="e5db32e650fc8416620a5b9ac5433257777a38aef44ecfa4d19022695c9bc169" exitCode=143 Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.471326 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a0fea6f-e88f-4bd5-935a-b55869a434d0","Type":"ContainerDied","Data":"e5db32e650fc8416620a5b9ac5433257777a38aef44ecfa4d19022695c9bc169"} Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.475717 4766 generic.go:334] "Generic (PLEG): container finished" podID="d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" containerID="96afa1d405dd9285accc70f49b7ece06af972846d482100dfcf7868f75129cfc" exitCode=143 Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.475758 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5","Type":"ContainerDied","Data":"96afa1d405dd9285accc70f49b7ece06af972846d482100dfcf7868f75129cfc"} Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.510213 4766 scope.go:117] "RemoveContainer" containerID="830a8a311242ba410b0f069ddea9cea453171d8b10b81489df17647eefa821e9" Oct 02 12:30:24 crc kubenswrapper[4766]: E1002 12:30:24.510842 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"830a8a311242ba410b0f069ddea9cea453171d8b10b81489df17647eefa821e9\": container with ID starting with 830a8a311242ba410b0f069ddea9cea453171d8b10b81489df17647eefa821e9 not found: ID does not exist" containerID="830a8a311242ba410b0f069ddea9cea453171d8b10b81489df17647eefa821e9" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.510885 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"830a8a311242ba410b0f069ddea9cea453171d8b10b81489df17647eefa821e9"} err="failed to get container status \"830a8a311242ba410b0f069ddea9cea453171d8b10b81489df17647eefa821e9\": rpc error: code = NotFound desc = could not find container \"830a8a311242ba410b0f069ddea9cea453171d8b10b81489df17647eefa821e9\": container with ID starting with 
830a8a311242ba410b0f069ddea9cea453171d8b10b81489df17647eefa821e9 not found: ID does not exist" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.534073 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48da6d4f-236b-49df-a509-388267f8db55-config-data\") pod \"48da6d4f-236b-49df-a509-388267f8db55\" (UID: \"48da6d4f-236b-49df-a509-388267f8db55\") " Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.534167 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48da6d4f-236b-49df-a509-388267f8db55-combined-ca-bundle\") pod \"48da6d4f-236b-49df-a509-388267f8db55\" (UID: \"48da6d4f-236b-49df-a509-388267f8db55\") " Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.534383 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2nwc\" (UniqueName: \"kubernetes.io/projected/48da6d4f-236b-49df-a509-388267f8db55-kube-api-access-h2nwc\") pod \"48da6d4f-236b-49df-a509-388267f8db55\" (UID: \"48da6d4f-236b-49df-a509-388267f8db55\") " Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.541406 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48da6d4f-236b-49df-a509-388267f8db55-kube-api-access-h2nwc" (OuterVolumeSpecName: "kube-api-access-h2nwc") pod "48da6d4f-236b-49df-a509-388267f8db55" (UID: "48da6d4f-236b-49df-a509-388267f8db55"). InnerVolumeSpecName "kube-api-access-h2nwc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.570813 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48da6d4f-236b-49df-a509-388267f8db55-config-data" (OuterVolumeSpecName: "config-data") pod "48da6d4f-236b-49df-a509-388267f8db55" (UID: "48da6d4f-236b-49df-a509-388267f8db55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.571462 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48da6d4f-236b-49df-a509-388267f8db55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48da6d4f-236b-49df-a509-388267f8db55" (UID: "48da6d4f-236b-49df-a509-388267f8db55"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.637420 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48da6d4f-236b-49df-a509-388267f8db55-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.637469 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48da6d4f-236b-49df-a509-388267f8db55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.637489 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2nwc\" (UniqueName: \"kubernetes.io/projected/48da6d4f-236b-49df-a509-388267f8db55-kube-api-access-h2nwc\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.808574 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.825888 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.841229 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 12:30:24 crc kubenswrapper[4766]: E1002 12:30:24.841967 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c81e7a7-498f-47d8-9a53-6e25bbbcb19d" containerName="init" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.841998 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c81e7a7-498f-47d8-9a53-6e25bbbcb19d" containerName="init" Oct 02 12:30:24 crc kubenswrapper[4766]: E1002 12:30:24.842055 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48da6d4f-236b-49df-a509-388267f8db55" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.842064 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="48da6d4f-236b-49df-a509-388267f8db55" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 12:30:24 crc kubenswrapper[4766]: E1002 12:30:24.842080 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c81e7a7-498f-47d8-9a53-6e25bbbcb19d" containerName="dnsmasq-dns" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.842087 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c81e7a7-498f-47d8-9a53-6e25bbbcb19d" containerName="dnsmasq-dns" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.842347 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c81e7a7-498f-47d8-9a53-6e25bbbcb19d" containerName="dnsmasq-dns" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.842380 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="48da6d4f-236b-49df-a509-388267f8db55" containerName="nova-cell1-novncproxy-novncproxy" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.843697 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.852704 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.853780 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.945162 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/972be125-9e9e-4bc0-b89b-70813ccd3f53-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"972be125-9e9e-4bc0-b89b-70813ccd3f53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.945447 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjrqm\" (UniqueName: \"kubernetes.io/projected/972be125-9e9e-4bc0-b89b-70813ccd3f53-kube-api-access-bjrqm\") pod \"nova-cell1-novncproxy-0\" (UID: \"972be125-9e9e-4bc0-b89b-70813ccd3f53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:30:24 crc kubenswrapper[4766]: I1002 12:30:24.945494 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972be125-9e9e-4bc0-b89b-70813ccd3f53-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"972be125-9e9e-4bc0-b89b-70813ccd3f53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:30:25 crc kubenswrapper[4766]: I1002 12:30:25.047037 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/972be125-9e9e-4bc0-b89b-70813ccd3f53-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"972be125-9e9e-4bc0-b89b-70813ccd3f53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:30:25 crc kubenswrapper[4766]: I1002 12:30:25.047155 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjrqm\" (UniqueName: \"kubernetes.io/projected/972be125-9e9e-4bc0-b89b-70813ccd3f53-kube-api-access-bjrqm\") pod \"nova-cell1-novncproxy-0\" (UID: \"972be125-9e9e-4bc0-b89b-70813ccd3f53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:30:25 crc kubenswrapper[4766]: I1002 12:30:25.047179 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972be125-9e9e-4bc0-b89b-70813ccd3f53-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"972be125-9e9e-4bc0-b89b-70813ccd3f53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:30:25 crc kubenswrapper[4766]: I1002 12:30:25.067374 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/972be125-9e9e-4bc0-b89b-70813ccd3f53-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"972be125-9e9e-4bc0-b89b-70813ccd3f53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:30:25 crc kubenswrapper[4766]: I1002 12:30:25.068930 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/972be125-9e9e-4bc0-b89b-70813ccd3f53-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"972be125-9e9e-4bc0-b89b-70813ccd3f53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:30:25 crc kubenswrapper[4766]: I1002 12:30:25.081214 
4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjrqm\" (UniqueName: \"kubernetes.io/projected/972be125-9e9e-4bc0-b89b-70813ccd3f53-kube-api-access-bjrqm\") pod \"nova-cell1-novncproxy-0\" (UID: \"972be125-9e9e-4bc0-b89b-70813ccd3f53\") " pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:30:25 crc kubenswrapper[4766]: I1002 12:30:25.179265 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 02 12:30:25 crc kubenswrapper[4766]: I1002 12:30:25.709389 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 02 12:30:25 crc kubenswrapper[4766]: I1002 12:30:25.898305 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48da6d4f-236b-49df-a509-388267f8db55" path="/var/lib/kubelet/pods/48da6d4f-236b-49df-a509-388267f8db55/volumes" Oct 02 12:30:26 crc kubenswrapper[4766]: I1002 12:30:26.501120 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"972be125-9e9e-4bc0-b89b-70813ccd3f53","Type":"ContainerStarted","Data":"375a96f38f367c33a73d3474e0826bd5bb136f519c5b2b67624554e6f34a823b"} Oct 02 12:30:26 crc kubenswrapper[4766]: I1002 12:30:26.501179 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"972be125-9e9e-4bc0-b89b-70813ccd3f53","Type":"ContainerStarted","Data":"0b2445293f20233d75d3e8b831033e365cfc647cb9048e72fabc528a1e6578ac"} Oct 02 12:30:26 crc kubenswrapper[4766]: I1002 12:30:26.609939 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.609918947 podStartE2EDuration="2.609918947s" podCreationTimestamp="2025-10-02 12:30:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:30:26.521233687 +0000 UTC m=+5941.464104631" watchObservedRunningTime="2025-10-02 12:30:26.609918947 +0000 UTC m=+5941.552789891" Oct 02 12:30:26 crc kubenswrapper[4766]: I1002 12:30:26.617171 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 02 12:30:26 crc kubenswrapper[4766]: I1002 12:30:26.617392 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="a1f7aa24-c544-420c-89da-e6f907a8860c" containerName="nova-cell1-conductor-conductor" containerID="cri-o://d09bdddf38f003fdc9558842b008b6301a3840a1560659c352fb547814605bc0" gracePeriod=30 Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.113669 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.140478 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.229882 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.243136 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a0fea6f-e88f-4bd5-935a-b55869a434d0-logs\") pod \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\" (UID: \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\") " Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.243203 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gz5hw\" (UniqueName: \"kubernetes.io/projected/fa2f5e53-8f06-4248-a150-6a98286af063-kube-api-access-gz5hw\") pod \"fa2f5e53-8f06-4248-a150-6a98286af063\" (UID: \"fa2f5e53-8f06-4248-a150-6a98286af063\") " Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.243270 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a0fea6f-e88f-4bd5-935a-b55869a434d0-config-data\") pod \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\" (UID: \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\") " Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.243308 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-config-data\") pod \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\" (UID: \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\") " Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.243344 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-logs\") pod \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\" (UID: \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\") " Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.243366 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwtm2\" (UniqueName: \"kubernetes.io/projected/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-kube-api-access-jwtm2\") pod \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\" (UID: \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\") " Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.243381 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-combined-ca-bundle\") pod \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\" (UID: \"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5\") " Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.243400 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2f5e53-8f06-4248-a150-6a98286af063-combined-ca-bundle\") pod \"fa2f5e53-8f06-4248-a150-6a98286af063\" (UID: \"fa2f5e53-8f06-4248-a150-6a98286af063\") " Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.243417 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bsrs\" (UniqueName: \"kubernetes.io/projected/1a0fea6f-e88f-4bd5-935a-b55869a434d0-kube-api-access-7bsrs\") pod \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\" (UID: \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\") " Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.243469 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa2f5e53-8f06-4248-a150-6a98286af063-config-data\") pod \"fa2f5e53-8f06-4248-a150-6a98286af063\" (UID: 
\"fa2f5e53-8f06-4248-a150-6a98286af063\") " Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.243530 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a0fea6f-e88f-4bd5-935a-b55869a434d0-combined-ca-bundle\") pod \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\" (UID: \"1a0fea6f-e88f-4bd5-935a-b55869a434d0\") " Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.244844 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-logs" (OuterVolumeSpecName: "logs") pod "d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" (UID: "d8ed39b6-8460-4fcd-995e-7f65b15fc4f5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.246324 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a0fea6f-e88f-4bd5-935a-b55869a434d0-logs" (OuterVolumeSpecName: "logs") pod "1a0fea6f-e88f-4bd5-935a-b55869a434d0" (UID: "1a0fea6f-e88f-4bd5-935a-b55869a434d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.256207 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-kube-api-access-jwtm2" (OuterVolumeSpecName: "kube-api-access-jwtm2") pod "d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" (UID: "d8ed39b6-8460-4fcd-995e-7f65b15fc4f5"). InnerVolumeSpecName "kube-api-access-jwtm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.256588 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a0fea6f-e88f-4bd5-935a-b55869a434d0-kube-api-access-7bsrs" (OuterVolumeSpecName: "kube-api-access-7bsrs") pod "1a0fea6f-e88f-4bd5-935a-b55869a434d0" (UID: "1a0fea6f-e88f-4bd5-935a-b55869a434d0"). InnerVolumeSpecName "kube-api-access-7bsrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.302756 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa2f5e53-8f06-4248-a150-6a98286af063-kube-api-access-gz5hw" (OuterVolumeSpecName: "kube-api-access-gz5hw") pod "fa2f5e53-8f06-4248-a150-6a98286af063" (UID: "fa2f5e53-8f06-4248-a150-6a98286af063"). InnerVolumeSpecName "kube-api-access-gz5hw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.316469 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" (UID: "d8ed39b6-8460-4fcd-995e-7f65b15fc4f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.323578 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa2f5e53-8f06-4248-a150-6a98286af063-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa2f5e53-8f06-4248-a150-6a98286af063" (UID: "fa2f5e53-8f06-4248-a150-6a98286af063"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.331375 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa2f5e53-8f06-4248-a150-6a98286af063-config-data" (OuterVolumeSpecName: "config-data") pod "fa2f5e53-8f06-4248-a150-6a98286af063" (UID: "fa2f5e53-8f06-4248-a150-6a98286af063"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.335019 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-config-data" (OuterVolumeSpecName: "config-data") pod "d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" (UID: "d8ed39b6-8460-4fcd-995e-7f65b15fc4f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.339473 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a0fea6f-e88f-4bd5-935a-b55869a434d0-config-data" (OuterVolumeSpecName: "config-data") pod "1a0fea6f-e88f-4bd5-935a-b55869a434d0" (UID: "1a0fea6f-e88f-4bd5-935a-b55869a434d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.340988 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a0fea6f-e88f-4bd5-935a-b55869a434d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a0fea6f-e88f-4bd5-935a-b55869a434d0" (UID: "1a0fea6f-e88f-4bd5-935a-b55869a434d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.349648 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a0fea6f-e88f-4bd5-935a-b55869a434d0-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.351141 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.351207 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.351285 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwtm2\" (UniqueName: \"kubernetes.io/projected/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-kube-api-access-jwtm2\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.351395 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.351465 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2f5e53-8f06-4248-a150-6a98286af063-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.351536 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bsrs\" (UniqueName: 
\"kubernetes.io/projected/1a0fea6f-e88f-4bd5-935a-b55869a434d0-kube-api-access-7bsrs\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.351592 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa2f5e53-8f06-4248-a150-6a98286af063-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.351644 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a0fea6f-e88f-4bd5-935a-b55869a434d0-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.351705 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gz5hw\" (UniqueName: \"kubernetes.io/projected/fa2f5e53-8f06-4248-a150-6a98286af063-kube-api-access-gz5hw\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.453174 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a0fea6f-e88f-4bd5-935a-b55869a434d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.516600 4766 generic.go:334] "Generic (PLEG): container finished" podID="1a0fea6f-e88f-4bd5-935a-b55869a434d0" containerID="052f6c1890c8d89aa97184536b47f034997d104512e606207e1b720b6be62a39" exitCode=0 Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.516695 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.516719 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a0fea6f-e88f-4bd5-935a-b55869a434d0","Type":"ContainerDied","Data":"052f6c1890c8d89aa97184536b47f034997d104512e606207e1b720b6be62a39"} Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.516754 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a0fea6f-e88f-4bd5-935a-b55869a434d0","Type":"ContainerDied","Data":"e3188d2d994a0d7c2fe9386304fde0f24460bbc196e1e5e0a6cae480c84a2f65"} Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.516771 4766 scope.go:117] "RemoveContainer" containerID="052f6c1890c8d89aa97184536b47f034997d104512e606207e1b720b6be62a39" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.521611 4766 generic.go:334] "Generic (PLEG): container finished" podID="d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" containerID="8cdaa629e8ce2f86c0bda824a55d1668e40e7731b60f0274cf01b989dd2acbf8" exitCode=0 Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.521696 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.521706 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5","Type":"ContainerDied","Data":"8cdaa629e8ce2f86c0bda824a55d1668e40e7731b60f0274cf01b989dd2acbf8"} Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.521855 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8ed39b6-8460-4fcd-995e-7f65b15fc4f5","Type":"ContainerDied","Data":"0ebda6bb4e136c60917c5d589d1bdd2248ed060426aa8ec1150e8b25da1b6819"} Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.532877 4766 generic.go:334] "Generic (PLEG): container finished" podID="fa2f5e53-8f06-4248-a150-6a98286af063" containerID="a247b068558cdfd3750ed66585d2668ad6a4593006df480bb2d08dd9f7ae1aae" exitCode=0 Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.534044 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.534975 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fa2f5e53-8f06-4248-a150-6a98286af063","Type":"ContainerDied","Data":"a247b068558cdfd3750ed66585d2668ad6a4593006df480bb2d08dd9f7ae1aae"} Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.535076 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fa2f5e53-8f06-4248-a150-6a98286af063","Type":"ContainerDied","Data":"176a5a58778acd88cd46145dc01f6fe8462034af85101f400fd00576702af9e0"} Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.546542 4766 scope.go:117] "RemoveContainer" containerID="e5db32e650fc8416620a5b9ac5433257777a38aef44ecfa4d19022695c9bc169" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.600700 4766 scope.go:117] "RemoveContainer" containerID="052f6c1890c8d89aa97184536b47f034997d104512e606207e1b720b6be62a39" Oct 02 12:30:27 crc kubenswrapper[4766]: E1002 12:30:27.604232 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"052f6c1890c8d89aa97184536b47f034997d104512e606207e1b720b6be62a39\": container with ID starting with 052f6c1890c8d89aa97184536b47f034997d104512e606207e1b720b6be62a39 not found: ID does not exist" containerID="052f6c1890c8d89aa97184536b47f034997d104512e606207e1b720b6be62a39" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.604276 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"052f6c1890c8d89aa97184536b47f034997d104512e606207e1b720b6be62a39"} err="failed to get container status \"052f6c1890c8d89aa97184536b47f034997d104512e606207e1b720b6be62a39\": rpc error: code = NotFound desc = could not find container \"052f6c1890c8d89aa97184536b47f034997d104512e606207e1b720b6be62a39\": container with ID starting with 052f6c1890c8d89aa97184536b47f034997d104512e606207e1b720b6be62a39 not found: ID does not exist" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.604302 4766 scope.go:117] "RemoveContainer" containerID="e5db32e650fc8416620a5b9ac5433257777a38aef44ecfa4d19022695c9bc169" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.606571 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:30:27 crc kubenswrapper[4766]: E1002 12:30:27.607738 4766 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e5db32e650fc8416620a5b9ac5433257777a38aef44ecfa4d19022695c9bc169\": container with ID starting with e5db32e650fc8416620a5b9ac5433257777a38aef44ecfa4d19022695c9bc169 not found: ID does not exist" containerID="e5db32e650fc8416620a5b9ac5433257777a38aef44ecfa4d19022695c9bc169" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.607801 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5db32e650fc8416620a5b9ac5433257777a38aef44ecfa4d19022695c9bc169"} err="failed to get container status \"e5db32e650fc8416620a5b9ac5433257777a38aef44ecfa4d19022695c9bc169\": rpc error: code = NotFound desc = could not find container \"e5db32e650fc8416620a5b9ac5433257777a38aef44ecfa4d19022695c9bc169\": container with ID starting with e5db32e650fc8416620a5b9ac5433257777a38aef44ecfa4d19022695c9bc169 not found: ID does not exist" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.607841 4766 scope.go:117] "RemoveContainer" containerID="8cdaa629e8ce2f86c0bda824a55d1668e40e7731b60f0274cf01b989dd2acbf8" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.653069 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.685033 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:30:27 crc kubenswrapper[4766]: E1002 12:30:27.685855 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a0fea6f-e88f-4bd5-935a-b55869a434d0" containerName="nova-api-log" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.686021 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a0fea6f-e88f-4bd5-935a-b55869a434d0" containerName="nova-api-log" Oct 02 12:30:27 crc kubenswrapper[4766]: E1002 12:30:27.686106 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa2f5e53-8f06-4248-a150-6a98286af063" containerName="nova-scheduler-scheduler" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.686174 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa2f5e53-8f06-4248-a150-6a98286af063" containerName="nova-scheduler-scheduler" Oct 02 12:30:27 crc kubenswrapper[4766]: E1002 12:30:27.686265 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" containerName="nova-metadata-log" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.686342 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" containerName="nova-metadata-log" Oct 02 12:30:27 crc kubenswrapper[4766]: E1002 12:30:27.686425 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a0fea6f-e88f-4bd5-935a-b55869a434d0" containerName="nova-api-api" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.686545 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a0fea6f-e88f-4bd5-935a-b55869a434d0" containerName="nova-api-api" Oct 02 12:30:27 crc kubenswrapper[4766]: E1002 12:30:27.686658 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" containerName="nova-metadata-metadata" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.686731 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" containerName="nova-metadata-metadata" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.687127 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="fa2f5e53-8f06-4248-a150-6a98286af063" containerName="nova-scheduler-scheduler" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.687224 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" containerName="nova-metadata-metadata" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.687330 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a0fea6f-e88f-4bd5-935a-b55869a434d0" containerName="nova-api-api" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.687403 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" containerName="nova-metadata-log" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.687459 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a0fea6f-e88f-4bd5-935a-b55869a434d0" containerName="nova-api-log" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.688794 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.699233 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.714740 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.724275 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.730060 4766 scope.go:117] "RemoveContainer" containerID="96afa1d405dd9285accc70f49b7ece06af972846d482100dfcf7868f75129cfc" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.742487 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.751038 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.753774 4766 scope.go:117] "RemoveContainer" containerID="8cdaa629e8ce2f86c0bda824a55d1668e40e7731b60f0274cf01b989dd2acbf8" Oct 02 12:30:27 crc kubenswrapper[4766]: E1002 12:30:27.754314 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cdaa629e8ce2f86c0bda824a55d1668e40e7731b60f0274cf01b989dd2acbf8\": container with ID starting with 8cdaa629e8ce2f86c0bda824a55d1668e40e7731b60f0274cf01b989dd2acbf8 not found: ID does not exist" containerID="8cdaa629e8ce2f86c0bda824a55d1668e40e7731b60f0274cf01b989dd2acbf8" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.754352 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cdaa629e8ce2f86c0bda824a55d1668e40e7731b60f0274cf01b989dd2acbf8"} err="failed to get container status \"8cdaa629e8ce2f86c0bda824a55d1668e40e7731b60f0274cf01b989dd2acbf8\": rpc error: code = NotFound desc = could not find container \"8cdaa629e8ce2f86c0bda824a55d1668e40e7731b60f0274cf01b989dd2acbf8\": container with ID starting with 8cdaa629e8ce2f86c0bda824a55d1668e40e7731b60f0274cf01b989dd2acbf8 not found: ID does not exist" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.754383 4766 scope.go:117] "RemoveContainer" containerID="96afa1d405dd9285accc70f49b7ece06af972846d482100dfcf7868f75129cfc" Oct 02 12:30:27 crc kubenswrapper[4766]: E1002 12:30:27.754770 4766 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"96afa1d405dd9285accc70f49b7ece06af972846d482100dfcf7868f75129cfc\": container with ID starting with 96afa1d405dd9285accc70f49b7ece06af972846d482100dfcf7868f75129cfc not found: ID does not exist" containerID="96afa1d405dd9285accc70f49b7ece06af972846d482100dfcf7868f75129cfc" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.754798 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96afa1d405dd9285accc70f49b7ece06af972846d482100dfcf7868f75129cfc"} err="failed to get container status \"96afa1d405dd9285accc70f49b7ece06af972846d482100dfcf7868f75129cfc\": rpc error: code = NotFound desc = could not find container \"96afa1d405dd9285accc70f49b7ece06af972846d482100dfcf7868f75129cfc\": container with ID starting with 96afa1d405dd9285accc70f49b7ece06af972846d482100dfcf7868f75129cfc not found: ID does not exist" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.754871 4766 scope.go:117] "RemoveContainer" containerID="a247b068558cdfd3750ed66585d2668ad6a4593006df480bb2d08dd9f7ae1aae" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.760780 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.768622 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.771071 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.774931 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.777932 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.784639 4766 scope.go:117] "RemoveContainer" containerID="a247b068558cdfd3750ed66585d2668ad6a4593006df480bb2d08dd9f7ae1aae" Oct 02 12:30:27 crc kubenswrapper[4766]: E1002 12:30:27.785048 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a247b068558cdfd3750ed66585d2668ad6a4593006df480bb2d08dd9f7ae1aae\": container with ID starting with a247b068558cdfd3750ed66585d2668ad6a4593006df480bb2d08dd9f7ae1aae not found: ID does not exist" containerID="a247b068558cdfd3750ed66585d2668ad6a4593006df480bb2d08dd9f7ae1aae" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.785090 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a247b068558cdfd3750ed66585d2668ad6a4593006df480bb2d08dd9f7ae1aae"} err="failed to get container status \"a247b068558cdfd3750ed66585d2668ad6a4593006df480bb2d08dd9f7ae1aae\": rpc error: code = NotFound desc = could not find container \"a247b068558cdfd3750ed66585d2668ad6a4593006df480bb2d08dd9f7ae1aae\": container with ID starting with a247b068558cdfd3750ed66585d2668ad6a4593006df480bb2d08dd9f7ae1aae not found: ID does not exist" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.789814 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.795951 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.804296 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.810000 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.860041 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58717615-2658-46d5-9945-b726dc965af3-logs\") pod \"nova-metadata-0\" (UID: \"58717615-2658-46d5-9945-b726dc965af3\") " pod="openstack/nova-metadata-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.860435 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58717615-2658-46d5-9945-b726dc965af3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"58717615-2658-46d5-9945-b726dc965af3\") " pod="openstack/nova-metadata-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.862804 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58717615-2658-46d5-9945-b726dc965af3-config-data\") pod \"nova-metadata-0\" (UID: \"58717615-2658-46d5-9945-b726dc965af3\") " pod="openstack/nova-metadata-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.862995 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz4p4\" (UniqueName: \"kubernetes.io/projected/58717615-2658-46d5-9945-b726dc965af3-kube-api-access-lz4p4\") pod \"nova-metadata-0\" (UID: \"58717615-2658-46d5-9945-b726dc965af3\") " pod="openstack/nova-metadata-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.900000 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a0fea6f-e88f-4bd5-935a-b55869a434d0" path="/var/lib/kubelet/pods/1a0fea6f-e88f-4bd5-935a-b55869a434d0/volumes" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.900807 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8ed39b6-8460-4fcd-995e-7f65b15fc4f5" path="/var/lib/kubelet/pods/d8ed39b6-8460-4fcd-995e-7f65b15fc4f5/volumes" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.901388 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa2f5e53-8f06-4248-a150-6a98286af063" path="/var/lib/kubelet/pods/fa2f5e53-8f06-4248-a150-6a98286af063/volumes" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.964314 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4-config-data\") pod \"nova-scheduler-0\" (UID: \"7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4\") " pod="openstack/nova-scheduler-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.964434 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l27w8\" (UniqueName: \"kubernetes.io/projected/7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4-kube-api-access-l27w8\") pod \"nova-scheduler-0\" (UID: \"7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4\") " pod="openstack/nova-scheduler-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.964469 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4\") " pod="openstack/nova-scheduler-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.964501 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cec8d0e7-a5b3-4f81-8690-1060c5802f29-config-data\") pod \"nova-api-0\" (UID: \"cec8d0e7-a5b3-4f81-8690-1060c5802f29\") " pod="openstack/nova-api-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.966077 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58717615-2658-46d5-9945-b726dc965af3-logs\") pod \"nova-metadata-0\" (UID: \"58717615-2658-46d5-9945-b726dc965af3\") " pod="openstack/nova-metadata-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.966169 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58717615-2658-46d5-9945-b726dc965af3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"58717615-2658-46d5-9945-b726dc965af3\") " pod="openstack/nova-metadata-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.966333 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58717615-2658-46d5-9945-b726dc965af3-config-data\") pod \"nova-metadata-0\" (UID: \"58717615-2658-46d5-9945-b726dc965af3\") " pod="openstack/nova-metadata-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.966403 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxlc6\" (UniqueName: \"kubernetes.io/projected/cec8d0e7-a5b3-4f81-8690-1060c5802f29-kube-api-access-wxlc6\") pod \"nova-api-0\" (UID: \"cec8d0e7-a5b3-4f81-8690-1060c5802f29\") " pod="openstack/nova-api-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.966537 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec8d0e7-a5b3-4f81-8690-1060c5802f29-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cec8d0e7-a5b3-4f81-8690-1060c5802f29\") " pod="openstack/nova-api-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.967236 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz4p4\" (UniqueName: \"kubernetes.io/projected/58717615-2658-46d5-9945-b726dc965af3-kube-api-access-lz4p4\") pod \"nova-metadata-0\" (UID: \"58717615-2658-46d5-9945-b726dc965af3\") " pod="openstack/nova-metadata-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.967270 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cec8d0e7-a5b3-4f81-8690-1060c5802f29-logs\") pod \"nova-api-0\" (UID: \"cec8d0e7-a5b3-4f81-8690-1060c5802f29\") " pod="openstack/nova-api-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.968549 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58717615-2658-46d5-9945-b726dc965af3-logs\") pod \"nova-metadata-0\" (UID: \"58717615-2658-46d5-9945-b726dc965af3\") " pod="openstack/nova-metadata-0" Oct 
02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.970462 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58717615-2658-46d5-9945-b726dc965af3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"58717615-2658-46d5-9945-b726dc965af3\") " pod="openstack/nova-metadata-0" Oct 02 12:30:27 crc kubenswrapper[4766]: I1002 12:30:27.973200 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58717615-2658-46d5-9945-b726dc965af3-config-data\") pod \"nova-metadata-0\" (UID: \"58717615-2658-46d5-9945-b726dc965af3\") " pod="openstack/nova-metadata-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.007929 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz4p4\" (UniqueName: \"kubernetes.io/projected/58717615-2658-46d5-9945-b726dc965af3-kube-api-access-lz4p4\") pod \"nova-metadata-0\" (UID: \"58717615-2658-46d5-9945-b726dc965af3\") " pod="openstack/nova-metadata-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.033839 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.069148 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxlc6\" (UniqueName: \"kubernetes.io/projected/cec8d0e7-a5b3-4f81-8690-1060c5802f29-kube-api-access-wxlc6\") pod \"nova-api-0\" (UID: \"cec8d0e7-a5b3-4f81-8690-1060c5802f29\") " pod="openstack/nova-api-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.069221 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec8d0e7-a5b3-4f81-8690-1060c5802f29-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cec8d0e7-a5b3-4f81-8690-1060c5802f29\") " pod="openstack/nova-api-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.069373 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cec8d0e7-a5b3-4f81-8690-1060c5802f29-logs\") pod \"nova-api-0\" (UID: \"cec8d0e7-a5b3-4f81-8690-1060c5802f29\") " pod="openstack/nova-api-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.069452 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4-config-data\") pod \"nova-scheduler-0\" (UID: \"7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4\") " pod="openstack/nova-scheduler-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.069559 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l27w8\" (UniqueName: \"kubernetes.io/projected/7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4-kube-api-access-l27w8\") pod \"nova-scheduler-0\" (UID: \"7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4\") " pod="openstack/nova-scheduler-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.069592 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4\") " pod="openstack/nova-scheduler-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.069652 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/cec8d0e7-a5b3-4f81-8690-1060c5802f29-config-data\") pod \"nova-api-0\" (UID: \"cec8d0e7-a5b3-4f81-8690-1060c5802f29\") " pod="openstack/nova-api-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.070379 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cec8d0e7-a5b3-4f81-8690-1060c5802f29-logs\") pod \"nova-api-0\" (UID: \"cec8d0e7-a5b3-4f81-8690-1060c5802f29\") " pod="openstack/nova-api-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.073910 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cec8d0e7-a5b3-4f81-8690-1060c5802f29-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"cec8d0e7-a5b3-4f81-8690-1060c5802f29\") " pod="openstack/nova-api-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.077148 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4\") " pod="openstack/nova-scheduler-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.084929 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cec8d0e7-a5b3-4f81-8690-1060c5802f29-config-data\") pod \"nova-api-0\" (UID: \"cec8d0e7-a5b3-4f81-8690-1060c5802f29\") " pod="openstack/nova-api-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.093881 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4-config-data\") pod \"nova-scheduler-0\" (UID: \"7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4\") " pod="openstack/nova-scheduler-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.109813 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l27w8\" (UniqueName: \"kubernetes.io/projected/7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4-kube-api-access-l27w8\") pod \"nova-scheduler-0\" (UID: \"7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4\") " pod="openstack/nova-scheduler-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.110947 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxlc6\" (UniqueName: \"kubernetes.io/projected/cec8d0e7-a5b3-4f81-8690-1060c5802f29-kube-api-access-wxlc6\") pod \"nova-api-0\" (UID: \"cec8d0e7-a5b3-4f81-8690-1060c5802f29\") " pod="openstack/nova-api-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.126643 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.410013 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0"
Oct 02 12:30:28 crc kubenswrapper[4766]: E1002 12:30:28.423235 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d09bdddf38f003fdc9558842b008b6301a3840a1560659c352fb547814605bc0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 02 12:30:28 crc kubenswrapper[4766]: E1002 12:30:28.427474 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d09bdddf38f003fdc9558842b008b6301a3840a1560659c352fb547814605bc0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 02 12:30:28 crc kubenswrapper[4766]: E1002 12:30:28.428384 4766 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d09bdddf38f003fdc9558842b008b6301a3840a1560659c352fb547814605bc0 is running failed: container process not found" containerID="d09bdddf38f003fdc9558842b008b6301a3840a1560659c352fb547814605bc0" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Oct 02 12:30:28 crc kubenswrapper[4766]: E1002 12:30:28.428433 4766 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d09bdddf38f003fdc9558842b008b6301a3840a1560659c352fb547814605bc0 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="a1f7aa24-c544-420c-89da-e6f907a8860c" containerName="nova-cell1-conductor-conductor"
Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.429576 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.569816 4766 generic.go:334] "Generic (PLEG): container finished" podID="a1f7aa24-c544-420c-89da-e6f907a8860c" containerID="d09bdddf38f003fdc9558842b008b6301a3840a1560659c352fb547814605bc0" exitCode=0
Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.570248 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a1f7aa24-c544-420c-89da-e6f907a8860c","Type":"ContainerDied","Data":"d09bdddf38f003fdc9558842b008b6301a3840a1560659c352fb547814605bc0"}
Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.624048 4766 generic.go:334] "Generic (PLEG): container finished" podID="ef91c640-4b08-416f-9ff0-d67fca7d5f22" containerID="f873a5f75544bc64020277313c7bd6a384b31cff1da482b92baa87b79a343ced" exitCode=0
Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.624122 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ef91c640-4b08-416f-9ff0-d67fca7d5f22","Type":"ContainerDied","Data":"f873a5f75544bc64020277313c7bd6a384b31cff1da482b92baa87b79a343ced"}
Oct 02 12:30:28 crc kubenswrapper[4766]: W1002 12:30:28.629957 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58717615_2658_46d5_9945_b726dc965af3.slice/crio-4a96d0c148a9b17184b4206867e5f411b32be245dfb6c69eb50dd358ba119170 WatchSource:0}: Error finding container 4a96d0c148a9b17184b4206867e5f411b32be245dfb6c69eb50dd358ba119170: Status 404 returned error can't find the container with id 4a96d0c148a9b17184b4206867e5f411b32be245dfb6c69eb50dd358ba119170
Oct 02 12:30:28 crc kubenswrapper[4766]: I1002 12:30:28.797081 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Oct 02 12:30:28 crc kubenswrapper[4766]: W1002 12:30:28.812734 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bf3ca71_58f7_4730_8ab4_24ff2dbf95c4.slice/crio-8af18348597a8f1513cbc2ec2dbfddb7f952b08ba6a66883807c024a02fef046 WatchSource:0}: Error finding container 8af18348597a8f1513cbc2ec2dbfddb7f952b08ba6a66883807c024a02fef046: Status 404 returned error can't find the container with id 8af18348597a8f1513cbc2ec2dbfddb7f952b08ba6a66883807c024a02fef046
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.029951 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.051708 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.101335 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef91c640-4b08-416f-9ff0-d67fca7d5f22-config-data\") pod \"ef91c640-4b08-416f-9ff0-d67fca7d5f22\" (UID: \"ef91c640-4b08-416f-9ff0-d67fca7d5f22\") "
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.101409 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7pdz\" (UniqueName: \"kubernetes.io/projected/ef91c640-4b08-416f-9ff0-d67fca7d5f22-kube-api-access-z7pdz\") pod \"ef91c640-4b08-416f-9ff0-d67fca7d5f22\" (UID: \"ef91c640-4b08-416f-9ff0-d67fca7d5f22\") "
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.101635 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef91c640-4b08-416f-9ff0-d67fca7d5f22-combined-ca-bundle\") pod \"ef91c640-4b08-416f-9ff0-d67fca7d5f22\" (UID: \"ef91c640-4b08-416f-9ff0-d67fca7d5f22\") "
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.101677 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f7aa24-c544-420c-89da-e6f907a8860c-config-data\") pod \"a1f7aa24-c544-420c-89da-e6f907a8860c\" (UID: \"a1f7aa24-c544-420c-89da-e6f907a8860c\") "
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.101818 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f7aa24-c544-420c-89da-e6f907a8860c-combined-ca-bundle\") pod \"a1f7aa24-c544-420c-89da-e6f907a8860c\" (UID: \"a1f7aa24-c544-420c-89da-e6f907a8860c\") "
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.101848 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfj6m\" (UniqueName: \"kubernetes.io/projected/a1f7aa24-c544-420c-89da-e6f907a8860c-kube-api-access-pfj6m\") pod \"a1f7aa24-c544-420c-89da-e6f907a8860c\" (UID: \"a1f7aa24-c544-420c-89da-e6f907a8860c\") "
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.115477 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1f7aa24-c544-420c-89da-e6f907a8860c-kube-api-access-pfj6m" (OuterVolumeSpecName: "kube-api-access-pfj6m") pod "a1f7aa24-c544-420c-89da-e6f907a8860c" (UID: "a1f7aa24-c544-420c-89da-e6f907a8860c"). InnerVolumeSpecName "kube-api-access-pfj6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.128066 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef91c640-4b08-416f-9ff0-d67fca7d5f22-kube-api-access-z7pdz" (OuterVolumeSpecName: "kube-api-access-z7pdz") pod "ef91c640-4b08-416f-9ff0-d67fca7d5f22" (UID: "ef91c640-4b08-416f-9ff0-d67fca7d5f22"). InnerVolumeSpecName "kube-api-access-z7pdz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.203597 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfj6m\" (UniqueName: \"kubernetes.io/projected/a1f7aa24-c544-420c-89da-e6f907a8860c-kube-api-access-pfj6m\") on node \"crc\" DevicePath \"\""
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.203635 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7pdz\" (UniqueName: \"kubernetes.io/projected/ef91c640-4b08-416f-9ff0-d67fca7d5f22-kube-api-access-z7pdz\") on node \"crc\" DevicePath \"\""
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.226398 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.230397 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f7aa24-c544-420c-89da-e6f907a8860c-config-data" (OuterVolumeSpecName: "config-data") pod "a1f7aa24-c544-420c-89da-e6f907a8860c" (UID: "a1f7aa24-c544-420c-89da-e6f907a8860c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.238737 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1f7aa24-c544-420c-89da-e6f907a8860c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1f7aa24-c544-420c-89da-e6f907a8860c" (UID: "a1f7aa24-c544-420c-89da-e6f907a8860c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.246919 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef91c640-4b08-416f-9ff0-d67fca7d5f22-config-data" (OuterVolumeSpecName: "config-data") pod "ef91c640-4b08-416f-9ff0-d67fca7d5f22" (UID: "ef91c640-4b08-416f-9ff0-d67fca7d5f22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.249321 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef91c640-4b08-416f-9ff0-d67fca7d5f22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ef91c640-4b08-416f-9ff0-d67fca7d5f22" (UID: "ef91c640-4b08-416f-9ff0-d67fca7d5f22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.305381 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ef91c640-4b08-416f-9ff0-d67fca7d5f22-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.305423 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ef91c640-4b08-416f-9ff0-d67fca7d5f22-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.305438 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1f7aa24-c544-420c-89da-e6f907a8860c-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.305449 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1f7aa24-c544-420c-89da-e6f907a8860c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.642542 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58717615-2658-46d5-9945-b726dc965af3","Type":"ContainerStarted","Data":"a7dafd3d17eaf2ada70846e3eb7b445ab7248f2f06950363cde1a60a7ba8194a"}
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.643162 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58717615-2658-46d5-9945-b726dc965af3","Type":"ContainerStarted","Data":"65c588233872560644456aea4ee423ba83804414d515cc01c34305f2174a60c8"}
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.643187 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"58717615-2658-46d5-9945-b726dc965af3","Type":"ContainerStarted","Data":"4a96d0c148a9b17184b4206867e5f411b32be245dfb6c69eb50dd358ba119170"}
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.656920 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cec8d0e7-a5b3-4f81-8690-1060c5802f29","Type":"ContainerStarted","Data":"bb2cca322d0871a96993166a291fc610ad8b0779854f8af96fd0059439b205d4"}
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.656994 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cec8d0e7-a5b3-4f81-8690-1060c5802f29","Type":"ContainerStarted","Data":"b6a4f084dd14eb155f4920a96b8a3cd6e7006de9e2cea9fca1d8353920ac0407"}
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.665245 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4","Type":"ContainerStarted","Data":"164774cdc4ac3bf66d03a4b8d68b89a842c53c38d93f2c2902ecf22ca064ef46"}
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.665308 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4","Type":"ContainerStarted","Data":"8af18348597a8f1513cbc2ec2dbfddb7f952b08ba6a66883807c024a02fef046"}
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.676835 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.676805639 podStartE2EDuration="2.676805639s" podCreationTimestamp="2025-10-02 12:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:30:29.667630595 +0000 UTC m=+5944.610501559" watchObservedRunningTime="2025-10-02 12:30:29.676805639 +0000 UTC m=+5944.619676573"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.679896 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"ef91c640-4b08-416f-9ff0-d67fca7d5f22","Type":"ContainerDied","Data":"597d60ddd2efd285977c9b039df25ec3af9a72bc6a070243c62492088a75f238"}
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.679969 4766 scope.go:117] "RemoveContainer" containerID="f873a5f75544bc64020277313c7bd6a384b31cff1da482b92baa87b79a343ced"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.679908 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.696174 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a1f7aa24-c544-420c-89da-e6f907a8860c","Type":"ContainerDied","Data":"70f27bb4160fcb21db022619abf69bbb979844363c3458288f7980c2b0d03085"}
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.696334 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.696552 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.696525571 podStartE2EDuration="2.696525571s" podCreationTimestamp="2025-10-02 12:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:30:29.691314994 +0000 UTC m=+5944.634185948" watchObservedRunningTime="2025-10-02 12:30:29.696525571 +0000 UTC m=+5944.639396515"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.727519 4766 scope.go:117] "RemoveContainer" containerID="d09bdddf38f003fdc9558842b008b6301a3840a1560659c352fb547814605bc0"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.747115 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.759297 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.777900 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.795746 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.819416 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 02 12:30:29 crc kubenswrapper[4766]: E1002 12:30:29.820013 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef91c640-4b08-416f-9ff0-d67fca7d5f22" containerName="nova-cell0-conductor-conductor"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.820034 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef91c640-4b08-416f-9ff0-d67fca7d5f22" containerName="nova-cell0-conductor-conductor"
Oct 02 12:30:29 crc kubenswrapper[4766]: E1002 12:30:29.820067 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1f7aa24-c544-420c-89da-e6f907a8860c" containerName="nova-cell1-conductor-conductor"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.820075 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1f7aa24-c544-420c-89da-e6f907a8860c" containerName="nova-cell1-conductor-conductor"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.820345 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1f7aa24-c544-420c-89da-e6f907a8860c" containerName="nova-cell1-conductor-conductor"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.820373 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef91c640-4b08-416f-9ff0-d67fca7d5f22" containerName="nova-cell0-conductor-conductor"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.821310 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.824449 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.832206 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.836438 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.839145 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.868575 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.899869 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1f7aa24-c544-420c-89da-e6f907a8860c" path="/var/lib/kubelet/pods/a1f7aa24-c544-420c-89da-e6f907a8860c/volumes"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.900790 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef91c640-4b08-416f-9ff0-d67fca7d5f22" path="/var/lib/kubelet/pods/ef91c640-4b08-416f-9ff0-d67fca7d5f22/volumes"
Oct 02 12:30:29 crc kubenswrapper[4766]: I1002 12:30:29.901384 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.018124 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptjwq\" (UniqueName: \"kubernetes.io/projected/df90d518-ac68-4f3b-97be-d914dbab2a48-kube-api-access-ptjwq\") pod \"nova-cell0-conductor-0\" (UID: \"df90d518-ac68-4f3b-97be-d914dbab2a48\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.018432 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df90d518-ac68-4f3b-97be-d914dbab2a48-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"df90d518-ac68-4f3b-97be-d914dbab2a48\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.018545 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14d108de-2d45-473e-bdcc-cc37740131d0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"14d108de-2d45-473e-bdcc-cc37740131d0\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.018643 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14d108de-2d45-473e-bdcc-cc37740131d0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"14d108de-2d45-473e-bdcc-cc37740131d0\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.018856 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df90d518-ac68-4f3b-97be-d914dbab2a48-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"df90d518-ac68-4f3b-97be-d914dbab2a48\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.019005 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2kkp\" (UniqueName: \"kubernetes.io/projected/14d108de-2d45-473e-bdcc-cc37740131d0-kube-api-access-d2kkp\") pod \"nova-cell1-conductor-0\" (UID: \"14d108de-2d45-473e-bdcc-cc37740131d0\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.120937 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptjwq\" (UniqueName: \"kubernetes.io/projected/df90d518-ac68-4f3b-97be-d914dbab2a48-kube-api-access-ptjwq\") pod \"nova-cell0-conductor-0\" (UID: \"df90d518-ac68-4f3b-97be-d914dbab2a48\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.121230 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df90d518-ac68-4f3b-97be-d914dbab2a48-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"df90d518-ac68-4f3b-97be-d914dbab2a48\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.121346 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14d108de-2d45-473e-bdcc-cc37740131d0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"14d108de-2d45-473e-bdcc-cc37740131d0\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.121429 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14d108de-2d45-473e-bdcc-cc37740131d0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"14d108de-2d45-473e-bdcc-cc37740131d0\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.121588 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df90d518-ac68-4f3b-97be-d914dbab2a48-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"df90d518-ac68-4f3b-97be-d914dbab2a48\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.121724 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2kkp\" (UniqueName: \"kubernetes.io/projected/14d108de-2d45-473e-bdcc-cc37740131d0-kube-api-access-d2kkp\") pod \"nova-cell1-conductor-0\" (UID: \"14d108de-2d45-473e-bdcc-cc37740131d0\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.131024 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df90d518-ac68-4f3b-97be-d914dbab2a48-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"df90d518-ac68-4f3b-97be-d914dbab2a48\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.132986 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df90d518-ac68-4f3b-97be-d914dbab2a48-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"df90d518-ac68-4f3b-97be-d914dbab2a48\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.140466 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14d108de-2d45-473e-bdcc-cc37740131d0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"14d108de-2d45-473e-bdcc-cc37740131d0\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.140879 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14d108de-2d45-473e-bdcc-cc37740131d0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"14d108de-2d45-473e-bdcc-cc37740131d0\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.141643 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptjwq\" (UniqueName: \"kubernetes.io/projected/df90d518-ac68-4f3b-97be-d914dbab2a48-kube-api-access-ptjwq\") pod \"nova-cell0-conductor-0\" (UID: \"df90d518-ac68-4f3b-97be-d914dbab2a48\") " pod="openstack/nova-cell0-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.143570 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2kkp\" (UniqueName: \"kubernetes.io/projected/14d108de-2d45-473e-bdcc-cc37740131d0-kube-api-access-d2kkp\") pod \"nova-cell1-conductor-0\" (UID: \"14d108de-2d45-473e-bdcc-cc37740131d0\") " pod="openstack/nova-cell1-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.164366 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.180830 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.192998 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.684081 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.717565 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"cec8d0e7-a5b3-4f81-8690-1060c5802f29","Type":"ContainerStarted","Data":"62791b697d41434447a89741dd40889fb89e63214c569f0ccb2deb4592cf58d0"}
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.743913 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.743885994 podStartE2EDuration="3.743885994s" podCreationTimestamp="2025-10-02 12:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:30:30.737993085 +0000 UTC m=+5945.680864049" watchObservedRunningTime="2025-10-02 12:30:30.743885994 +0000 UTC m=+5945.686756938"
Oct 02 12:30:30 crc kubenswrapper[4766]: I1002 12:30:30.798991 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Oct 02 12:30:31 crc kubenswrapper[4766]: I1002 12:30:31.743888 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"df90d518-ac68-4f3b-97be-d914dbab2a48","Type":"ContainerStarted","Data":"b368b7ea844ac54a2557f953cc3e62f790b7eb9f02371b469452f1ff93186411"}
Oct 02 12:30:31 crc kubenswrapper[4766]: I1002 12:30:31.744423 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"df90d518-ac68-4f3b-97be-d914dbab2a48","Type":"ContainerStarted","Data":"c4c9da6aa9c40e0f6921ad946794f1739df20f919b2ea1265afe6766d6dbf7c7"}
Oct 02 12:30:31 crc kubenswrapper[4766]: I1002 12:30:31.744552 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Oct 02 12:30:31 crc kubenswrapper[4766]: I1002 12:30:31.750023 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"14d108de-2d45-473e-bdcc-cc37740131d0","Type":"ContainerStarted","Data":"1a3b001a9d5c1564ec0beedcfc4a13a51324c627623e47e87ff004ea3e1b1515"}
Oct 02 12:30:31 crc kubenswrapper[4766]: I1002 12:30:31.750613 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"14d108de-2d45-473e-bdcc-cc37740131d0","Type":"ContainerStarted","Data":"7c45a4bc8eb3f9b3520f2129408624f6ff3e5bc59d3aba92dd2e1088ac4943e1"}
Oct 02 12:30:31 crc kubenswrapper[4766]: I1002 12:30:31.791356 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.79133018 podStartE2EDuration="2.79133018s" podCreationTimestamp="2025-10-02 12:30:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:30:31.77072644 +0000 UTC m=+5946.713597384" watchObservedRunningTime="2025-10-02 12:30:31.79133018 +0000 UTC m=+5946.734201124"
Oct 02 12:30:31 crc kubenswrapper[4766]: I1002 12:30:31.797141 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.797121304 podStartE2EDuration="2.797121304s" podCreationTimestamp="2025-10-02 12:30:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:30:31.788196839 +0000 UTC m=+5946.731067783" watchObservedRunningTime="2025-10-02 12:30:31.797121304 +0000 UTC m=+5946.739992248"
Oct 02 12:30:31 crc kubenswrapper[4766]: I1002 12:30:31.881849 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984"
Oct 02 12:30:31 crc kubenswrapper[4766]: E1002 12:30:31.882170 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 12:30:32 crc kubenswrapper[4766]: I1002 12:30:32.760653 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Oct 02 12:30:33 crc kubenswrapper[4766]: I1002 12:30:33.034516 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 02 12:30:33 crc kubenswrapper[4766]: I1002 12:30:33.035288 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Oct 02 12:30:33 crc kubenswrapper[4766]: I1002 12:30:33.127046 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Oct 02 12:30:35 crc kubenswrapper[4766]: I1002 12:30:35.180052 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:30:35 crc kubenswrapper[4766]: I1002 12:30:35.192760 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:30:35 crc kubenswrapper[4766]: I1002 12:30:35.800404 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Oct 02 12:30:36 crc kubenswrapper[4766]: I1002 12:30:36.196886 4766 scope.go:117] "RemoveContainer" containerID="e2c38d23bfeef4e060c7affdc4da706d2f328cbf308777ca9b947ee92af07a66"
Oct 02 12:30:38 crc kubenswrapper[4766]: I1002 12:30:38.034658 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 02 12:30:38 crc kubenswrapper[4766]: I1002 12:30:38.038300 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Oct 02 12:30:38 crc kubenswrapper[4766]: I1002 12:30:38.126775 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Oct 02 12:30:38 crc kubenswrapper[4766]: I1002 12:30:38.159222 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Oct 02 12:30:38 crc kubenswrapper[4766]: I1002 12:30:38.411251 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 02 12:30:38 crc kubenswrapper[4766]: I1002 12:30:38.411345 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Oct 02 12:30:38 crc kubenswrapper[4766]: I1002 12:30:38.846411 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Oct 02 12:30:39 crc kubenswrapper[4766]: I1002 12:30:39.075824 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="58717615-2658-46d5-9945-b726dc965af3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.81:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 12:30:39 crc kubenswrapper[4766]: I1002 12:30:39.116830 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="58717615-2658-46d5-9945-b726dc965af3" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.81:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 12:30:39 crc kubenswrapper[4766]: I1002 12:30:39.493919 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cec8d0e7-a5b3-4f81-8690-1060c5802f29" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.82:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 12:30:39 crc kubenswrapper[4766]: I1002 12:30:39.493913 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="cec8d0e7-a5b3-4f81-8690-1060c5802f29" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.82:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 02 12:30:40 crc kubenswrapper[4766]: I1002 12:30:40.198485 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Oct 02 12:30:40 crc kubenswrapper[4766]: I1002 12:30:40.230079 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.452837 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.456824 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.459695 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.470198 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.525336 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d578v\" (UniqueName: \"kubernetes.io/projected/90cdb0af-8900-42a6-9218-b7c57ed42f86-kube-api-access-d578v\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.525423 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-config-data\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.525752 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90cdb0af-8900-42a6-9218-b7c57ed42f86-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.525830 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-scripts\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.525907 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.525967 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.641689 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90cdb0af-8900-42a6-9218-b7c57ed42f86-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.641772 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-scripts\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.641827 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.641872 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.642244 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d578v\" (UniqueName: \"kubernetes.io/projected/90cdb0af-8900-42a6-9218-b7c57ed42f86-kube-api-access-d578v\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.642361 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-config-data\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.642352 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90cdb0af-8900-42a6-9218-b7c57ed42f86-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.649337 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-scripts\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.649814 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-config-data\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.652402 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.653157 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.660493 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d578v\" (UniqueName: \"kubernetes.io/projected/90cdb0af-8900-42a6-9218-b7c57ed42f86-kube-api-access-d578v\") pod \"cinder-scheduler-0\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:43 crc kubenswrapper[4766]: I1002 12:30:43.781170 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 02 12:30:44 crc kubenswrapper[4766]: I1002 12:30:44.262707 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 12:30:44 crc kubenswrapper[4766]: W1002 12:30:44.271726 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90cdb0af_8900_42a6_9218_b7c57ed42f86.slice/crio-450d0984e0310d90c0ac7ff12f45b3141ea06824857618ed83aea04a3ec493c4 WatchSource:0}: Error finding container 450d0984e0310d90c0ac7ff12f45b3141ea06824857618ed83aea04a3ec493c4: Status 404 returned error can't find the container with id 450d0984e0310d90c0ac7ff12f45b3141ea06824857618ed83aea04a3ec493c4
Oct 02 12:30:44 crc kubenswrapper[4766]: I1002 12:30:44.881343 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984"
Oct 02 12:30:44 crc kubenswrapper[4766]: E1002 12:30:44.882003 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 12:30:44 crc kubenswrapper[4766]: I1002 12:30:44.897585 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90cdb0af-8900-42a6-9218-b7c57ed42f86","Type":"ContainerStarted","Data":"7b06e89363b0ceff471c9fec0d27001e6e96cd676c14bc69920ae96050055113"}
Oct 02 12:30:44 crc kubenswrapper[4766]: I1002 12:30:44.897638 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90cdb0af-8900-42a6-9218-b7c57ed42f86","Type":"ContainerStarted","Data":"450d0984e0310d90c0ac7ff12f45b3141ea06824857618ed83aea04a3ec493c4"}
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.152096 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.152418 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="06194cbc-5fd7-4288-9046-51a2575fd81b" containerName="cinder-api-log" containerID="cri-o://2e10ca896edeef17259192e71db5885fda6fc3221e0fb147f51d08380d10555a" gracePeriod=30
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.153018 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="06194cbc-5fd7-4288-9046-51a2575fd81b" containerName="cinder-api" containerID="cri-o://f0f308e1eaa8c833c089618b3aa2d7d12798fa4050da5b98e9cb97dcdbcad6ba" gracePeriod=30
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.858457 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"]
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.860896 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.875651 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.894917 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-run\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.895010 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.895082 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-sys\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.895123 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.895186 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.895250 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.895317 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.895361 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-dev\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.895379 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.895434 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.895551 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.895581 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.895628 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.895671 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.895698 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrrsb\" (UniqueName: \"kubernetes.io/projected/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-kube-api-access-vrrsb\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.895724 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.954915 4766 generic.go:334] "Generic (PLEG): container finished" podID="06194cbc-5fd7-4288-9046-51a2575fd81b" containerID="2e10ca896edeef17259192e71db5885fda6fc3221e0fb147f51d08380d10555a" exitCode=143
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.959655 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"]
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.959729 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90cdb0af-8900-42a6-9218-b7c57ed42f86","Type":"ContainerStarted","Data":"b9b757e1765c7d044b40fe63f77fccda8eba03de1b719cff8d6f81bb32fb2952"}
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.959775 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"06194cbc-5fd7-4288-9046-51a2575fd81b","Type":"ContainerDied","Data":"2e10ca896edeef17259192e71db5885fda6fc3221e0fb147f51d08380d10555a"}
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.989487 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.9894518679999997 podStartE2EDuration="2.989451868s" podCreationTimestamp="2025-10-02 12:30:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:30:45.984031904 +0000 UTC m=+5960.926902848" watchObservedRunningTime="2025-10-02 12:30:45.989451868 +0000 UTC m=+5960.932322802"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.998154 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.998250 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.998283 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrrsb\" (UniqueName: \"kubernetes.io/projected/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-kube-api-access-vrrsb\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.998317 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.998367 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-run\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.998394 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.998440 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-sys\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.998490 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.998530 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.998607 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.998675 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.998708 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.998735 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-dev\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.998767 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.998876 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.998920 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.999030 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:45 crc kubenswrapper[4766]: I1002 12:30:45.999075 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:45.999051 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:45.999342 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:45.999331 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:45.999177 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:45.999689 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.000059 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-run\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.001320 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-dev\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.001403 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-sys\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.019997 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.020331 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.020877 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.028144 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.028810 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.029345 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrrsb\" (UniqueName: \"kubernetes.io/projected/49e0dfda-90fe-4595-9d8a-f0ebf15566dd-kube-api-access-vrrsb\") pod \"cinder-volume-volume1-0\" (UID: \"49e0dfda-90fe-4595-9d8a-f0ebf15566dd\") " pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.217973 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.431449 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"]
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.434843 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.444706 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.472083 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"]
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.514710 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-sys\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.514757 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-scripts\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.514795 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-lib-modules\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.514821 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-ceph\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.514837 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.514870 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzl8v\" (UniqueName: \"kubernetes.io/projected/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-kube-api-access-hzl8v\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.514898 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.514940 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0"
Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.515074 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.515112 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-dev\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.515148 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-run\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.515185 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.515230 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.515254 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.515382 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.515404 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-config-data\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.617692 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.617752 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " 
pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.617770 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-dev\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.617801 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-run\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.617821 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.617856 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.617875 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.617896 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.617916 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-config-data\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.617936 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-sys\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.617949 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-scripts\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.617980 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-lib-modules\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " 
pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.618002 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-ceph\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.618020 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.618050 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzl8v\" (UniqueName: \"kubernetes.io/projected/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-kube-api-access-hzl8v\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.618071 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.618188 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-etc-nvme\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.619169 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.619211 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.619238 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-dev\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.619260 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-run\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.619294 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 
crc kubenswrapper[4766]: I1002 12:30:46.619328 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.619666 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-lib-modules\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.621292 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.621486 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-sys\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.625992 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-ceph\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.626326 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-config-data\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.626424 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-scripts\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.633199 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.633915 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-config-data-custom\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.639161 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzl8v\" (UniqueName: \"kubernetes.io/projected/3d28bc87-18c8-41c8-9747-cd0c23d2c98e-kube-api-access-hzl8v\") pod \"cinder-backup-0\" (UID: \"3d28bc87-18c8-41c8-9747-cd0c23d2c98e\") " pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.773236 
4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Oct 02 12:30:46 crc kubenswrapper[4766]: W1002 12:30:46.840887 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49e0dfda_90fe_4595_9d8a_f0ebf15566dd.slice/crio-f435509d66dda7e5f594ce6c0090d273a6d2414d19aff4f9f46710d68922edd4 WatchSource:0}: Error finding container f435509d66dda7e5f594ce6c0090d273a6d2414d19aff4f9f46710d68922edd4: Status 404 returned error can't find the container with id f435509d66dda7e5f594ce6c0090d273a6d2414d19aff4f9f46710d68922edd4 Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.842011 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Oct 02 12:30:46 crc kubenswrapper[4766]: I1002 12:30:46.970677 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"49e0dfda-90fe-4595-9d8a-f0ebf15566dd","Type":"ContainerStarted","Data":"f435509d66dda7e5f594ce6c0090d273a6d2414d19aff4f9f46710d68922edd4"} Oct 02 12:30:47 crc kubenswrapper[4766]: I1002 12:30:47.380438 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 02 12:30:47 crc kubenswrapper[4766]: I1002 12:30:47.988376 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"3d28bc87-18c8-41c8-9747-cd0c23d2c98e","Type":"ContainerStarted","Data":"0216427e7aea7209bbdb15429eb942507acf5616c828b7ea12fa7bea8a99a39b"} Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.038969 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.039048 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.042984 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.044626 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.344802 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="06194cbc-5fd7-4288-9046-51a2575fd81b" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.79:8776/healthcheck\": read tcp 10.217.0.2:60950->10.217.1.79:8776: read: connection reset by peer" Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.416372 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.417189 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.418779 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.434052 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.785123 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.879467 4766 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.993073 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06194cbc-5fd7-4288-9046-51a2575fd81b-logs\") pod \"06194cbc-5fd7-4288-9046-51a2575fd81b\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.993203 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-combined-ca-bundle\") pod \"06194cbc-5fd7-4288-9046-51a2575fd81b\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.993246 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-scripts\") pod \"06194cbc-5fd7-4288-9046-51a2575fd81b\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.993294 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdq59\" (UniqueName: \"kubernetes.io/projected/06194cbc-5fd7-4288-9046-51a2575fd81b-kube-api-access-hdq59\") pod \"06194cbc-5fd7-4288-9046-51a2575fd81b\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.993385 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-config-data\") pod \"06194cbc-5fd7-4288-9046-51a2575fd81b\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.993451 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-config-data-custom\") pod \"06194cbc-5fd7-4288-9046-51a2575fd81b\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.993480 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06194cbc-5fd7-4288-9046-51a2575fd81b-etc-machine-id\") pod \"06194cbc-5fd7-4288-9046-51a2575fd81b\" (UID: \"06194cbc-5fd7-4288-9046-51a2575fd81b\") " Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.993685 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06194cbc-5fd7-4288-9046-51a2575fd81b-logs" (OuterVolumeSpecName: "logs") pod "06194cbc-5fd7-4288-9046-51a2575fd81b" (UID: "06194cbc-5fd7-4288-9046-51a2575fd81b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.993952 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06194cbc-5fd7-4288-9046-51a2575fd81b-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:48 crc kubenswrapper[4766]: I1002 12:30:48.994296 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/06194cbc-5fd7-4288-9046-51a2575fd81b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "06194cbc-5fd7-4288-9046-51a2575fd81b" (UID: "06194cbc-5fd7-4288-9046-51a2575fd81b"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.004869 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-scripts" (OuterVolumeSpecName: "scripts") pod "06194cbc-5fd7-4288-9046-51a2575fd81b" (UID: "06194cbc-5fd7-4288-9046-51a2575fd81b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.005839 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06194cbc-5fd7-4288-9046-51a2575fd81b-kube-api-access-hdq59" (OuterVolumeSpecName: "kube-api-access-hdq59") pod "06194cbc-5fd7-4288-9046-51a2575fd81b" (UID: "06194cbc-5fd7-4288-9046-51a2575fd81b"). InnerVolumeSpecName "kube-api-access-hdq59". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.009341 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "06194cbc-5fd7-4288-9046-51a2575fd81b" (UID: "06194cbc-5fd7-4288-9046-51a2575fd81b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.012566 4766 generic.go:334] "Generic (PLEG): container finished" podID="06194cbc-5fd7-4288-9046-51a2575fd81b" containerID="f0f308e1eaa8c833c089618b3aa2d7d12798fa4050da5b98e9cb97dcdbcad6ba" exitCode=0 Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.012655 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"06194cbc-5fd7-4288-9046-51a2575fd81b","Type":"ContainerDied","Data":"f0f308e1eaa8c833c089618b3aa2d7d12798fa4050da5b98e9cb97dcdbcad6ba"} Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.012697 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"06194cbc-5fd7-4288-9046-51a2575fd81b","Type":"ContainerDied","Data":"4acf3708b897d359ef9a0f2486c1aa20f3ee1f41c2bdbf39a28c1c5954ae3570"} Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.012732 4766 scope.go:117] "RemoveContainer" containerID="f0f308e1eaa8c833c089618b3aa2d7d12798fa4050da5b98e9cb97dcdbcad6ba" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.012842 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.023045 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"3d28bc87-18c8-41c8-9747-cd0c23d2c98e","Type":"ContainerStarted","Data":"9016ce9054769fb8bfa10b52f7d3a0839a409558f4211578916e3192e2ccb060"} Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.027892 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"49e0dfda-90fe-4595-9d8a-f0ebf15566dd","Type":"ContainerStarted","Data":"4c13374d3d7844c112ca378ef571f9e62c3b135be13ee6bdd976da87a2787944"} Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.028752 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.034581 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.039650 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06194cbc-5fd7-4288-9046-51a2575fd81b" (UID: "06194cbc-5fd7-4288-9046-51a2575fd81b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.095747 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.095785 4766 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/06194cbc-5fd7-4288-9046-51a2575fd81b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.095797 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.095808 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.095816 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdq59\" (UniqueName: \"kubernetes.io/projected/06194cbc-5fd7-4288-9046-51a2575fd81b-kube-api-access-hdq59\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.096044 4766 scope.go:117] "RemoveContainer" containerID="2e10ca896edeef17259192e71db5885fda6fc3221e0fb147f51d08380d10555a" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.100790 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-config-data" (OuterVolumeSpecName: "config-data") pod "06194cbc-5fd7-4288-9046-51a2575fd81b" (UID: "06194cbc-5fd7-4288-9046-51a2575fd81b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.132966 4766 scope.go:117] "RemoveContainer" containerID="f0f308e1eaa8c833c089618b3aa2d7d12798fa4050da5b98e9cb97dcdbcad6ba" Oct 02 12:30:49 crc kubenswrapper[4766]: E1002 12:30:49.139015 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0f308e1eaa8c833c089618b3aa2d7d12798fa4050da5b98e9cb97dcdbcad6ba\": container with ID starting with f0f308e1eaa8c833c089618b3aa2d7d12798fa4050da5b98e9cb97dcdbcad6ba not found: ID does not exist" containerID="f0f308e1eaa8c833c089618b3aa2d7d12798fa4050da5b98e9cb97dcdbcad6ba" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.139070 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0f308e1eaa8c833c089618b3aa2d7d12798fa4050da5b98e9cb97dcdbcad6ba"} err="failed to get container status \"f0f308e1eaa8c833c089618b3aa2d7d12798fa4050da5b98e9cb97dcdbcad6ba\": rpc error: code = NotFound desc = could not find container \"f0f308e1eaa8c833c089618b3aa2d7d12798fa4050da5b98e9cb97dcdbcad6ba\": container with ID starting with f0f308e1eaa8c833c089618b3aa2d7d12798fa4050da5b98e9cb97dcdbcad6ba not found: ID does not exist" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.139109 4766 scope.go:117] "RemoveContainer" containerID="2e10ca896edeef17259192e71db5885fda6fc3221e0fb147f51d08380d10555a" Oct 02 12:30:49 crc kubenswrapper[4766]: E1002 12:30:49.141628 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e10ca896edeef17259192e71db5885fda6fc3221e0fb147f51d08380d10555a\": container with ID starting with 2e10ca896edeef17259192e71db5885fda6fc3221e0fb147f51d08380d10555a not found: ID does not exist" containerID="2e10ca896edeef17259192e71db5885fda6fc3221e0fb147f51d08380d10555a" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.141695 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e10ca896edeef17259192e71db5885fda6fc3221e0fb147f51d08380d10555a"} err="failed to get container status \"2e10ca896edeef17259192e71db5885fda6fc3221e0fb147f51d08380d10555a\": rpc error: code = NotFound desc = could not find container \"2e10ca896edeef17259192e71db5885fda6fc3221e0fb147f51d08380d10555a\": container with ID starting with 2e10ca896edeef17259192e71db5885fda6fc3221e0fb147f51d08380d10555a not found: ID does not exist" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.201208 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06194cbc-5fd7-4288-9046-51a2575fd81b-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.368846 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.389169 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.411100 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 02 12:30:49 crc kubenswrapper[4766]: E1002 12:30:49.411593 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06194cbc-5fd7-4288-9046-51a2575fd81b" containerName="cinder-api" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.411608 4766 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="06194cbc-5fd7-4288-9046-51a2575fd81b" containerName="cinder-api" Oct 02 12:30:49 crc kubenswrapper[4766]: E1002 12:30:49.411661 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06194cbc-5fd7-4288-9046-51a2575fd81b" containerName="cinder-api-log" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.411668 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="06194cbc-5fd7-4288-9046-51a2575fd81b" containerName="cinder-api-log" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.412254 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="06194cbc-5fd7-4288-9046-51a2575fd81b" containerName="cinder-api-log" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.412276 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="06194cbc-5fd7-4288-9046-51a2575fd81b" containerName="cinder-api" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.413753 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.417021 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.445803 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.506596 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bttcz\" (UniqueName: \"kubernetes.io/projected/40e44d35-376b-45b5-a4e1-8efd82067224-kube-api-access-bttcz\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.506668 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e44d35-376b-45b5-a4e1-8efd82067224-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.506701 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40e44d35-376b-45b5-a4e1-8efd82067224-logs\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.506723 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40e44d35-376b-45b5-a4e1-8efd82067224-scripts\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.506744 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40e44d35-376b-45b5-a4e1-8efd82067224-config-data-custom\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.506765 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40e44d35-376b-45b5-a4e1-8efd82067224-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.507037 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40e44d35-376b-45b5-a4e1-8efd82067224-config-data\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.609643 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bttcz\" (UniqueName: \"kubernetes.io/projected/40e44d35-376b-45b5-a4e1-8efd82067224-kube-api-access-bttcz\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.609714 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e44d35-376b-45b5-a4e1-8efd82067224-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.609745 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40e44d35-376b-45b5-a4e1-8efd82067224-logs\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.609768 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40e44d35-376b-45b5-a4e1-8efd82067224-scripts\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.609785 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40e44d35-376b-45b5-a4e1-8efd82067224-config-data-custom\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.609801 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40e44d35-376b-45b5-a4e1-8efd82067224-etc-machine-id\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.609832 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40e44d35-376b-45b5-a4e1-8efd82067224-config-data\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.610645 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/40e44d35-376b-45b5-a4e1-8efd82067224-logs\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.612584 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40e44d35-376b-45b5-a4e1-8efd82067224-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.615434 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40e44d35-376b-45b5-a4e1-8efd82067224-config-data\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.616002 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e44d35-376b-45b5-a4e1-8efd82067224-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.627186 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40e44d35-376b-45b5-a4e1-8efd82067224-scripts\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.627474 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40e44d35-376b-45b5-a4e1-8efd82067224-config-data-custom\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.634002 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bttcz\" (UniqueName: \"kubernetes.io/projected/40e44d35-376b-45b5-a4e1-8efd82067224-kube-api-access-bttcz\") pod \"cinder-api-0\" (UID: \"40e44d35-376b-45b5-a4e1-8efd82067224\") " pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.738450 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 02 12:30:49 crc kubenswrapper[4766]: I1002 12:30:49.904873 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06194cbc-5fd7-4288-9046-51a2575fd81b" path="/var/lib/kubelet/pods/06194cbc-5fd7-4288-9046-51a2575fd81b/volumes" Oct 02 12:30:50 crc kubenswrapper[4766]: I1002 12:30:50.042380 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"49e0dfda-90fe-4595-9d8a-f0ebf15566dd","Type":"ContainerStarted","Data":"1be43f8ad45ae8e5cc896633bee49776aeb38f0b94ac32cfa7fa2709b954869d"} Oct 02 12:30:50 crc kubenswrapper[4766]: I1002 12:30:50.052925 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"3d28bc87-18c8-41c8-9747-cd0c23d2c98e","Type":"ContainerStarted","Data":"1ec0fbd7ff9618e3b9696fac4337206f4d43366996f7c2394961bf1cecd5d8b8"} Oct 02 12:30:50 crc kubenswrapper[4766]: I1002 12:30:50.087586 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.819333162 podStartE2EDuration="5.087551056s" podCreationTimestamp="2025-10-02 12:30:45 +0000 UTC" firstStartedPulling="2025-10-02 12:30:46.844241356 +0000 UTC m=+5961.787112300" lastFinishedPulling="2025-10-02 12:30:48.11245925 +0000 UTC m=+5963.055330194" observedRunningTime="2025-10-02 12:30:50.07270311 +0000 UTC m=+5965.015574054" watchObservedRunningTime="2025-10-02 12:30:50.087551056 +0000 UTC m=+5965.030422030" Oct 02 12:30:50 crc kubenswrapper[4766]: I1002 12:30:50.107253 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.367263094 podStartE2EDuration="4.107210075s" podCreationTimestamp="2025-10-02 12:30:46 +0000 UTC" firstStartedPulling="2025-10-02 12:30:47.38378098 +0000 UTC m=+5962.326651924" lastFinishedPulling="2025-10-02 12:30:48.123727961 +0000 UTC m=+5963.066598905" observedRunningTime="2025-10-02 12:30:50.093128514 +0000 UTC m=+5965.035999458" watchObservedRunningTime="2025-10-02 12:30:50.107210075 +0000 UTC m=+5965.050081019" Oct 02 12:30:50 crc kubenswrapper[4766]: I1002 12:30:50.262095 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 02 12:30:51 crc kubenswrapper[4766]: I1002 12:30:51.084479 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"40e44d35-376b-45b5-a4e1-8efd82067224","Type":"ContainerStarted","Data":"927d6f8c818cc00614c0f279144ffeab3b3acf4c0de6d90177b3933a235f04aa"} Oct 02 12:30:51 crc kubenswrapper[4766]: I1002 12:30:51.085196 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"40e44d35-376b-45b5-a4e1-8efd82067224","Type":"ContainerStarted","Data":"5a090333ba0616733cb825916ec7e32dcbad66e731cd8963c5d9d951b1430142"} Oct 02 12:30:51 crc kubenswrapper[4766]: I1002 12:30:51.220217 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Oct 02 12:30:51 crc kubenswrapper[4766]: I1002 12:30:51.774362 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 02 12:30:52 crc kubenswrapper[4766]: I1002 12:30:52.099470 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"40e44d35-376b-45b5-a4e1-8efd82067224","Type":"ContainerStarted","Data":"3802c5c045a2326e46e5dc94ab0e2fdf0e9058f25cbed09463876e7c5c56ffec"} Oct 02 
12:30:52 crc kubenswrapper[4766]: I1002 12:30:52.102382 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 02 12:30:54 crc kubenswrapper[4766]: I1002 12:30:54.071050 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 02 12:30:54 crc kubenswrapper[4766]: I1002 12:30:54.104096 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.104066872 podStartE2EDuration="5.104066872s" podCreationTimestamp="2025-10-02 12:30:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:30:52.129640427 +0000 UTC m=+5967.072511371" watchObservedRunningTime="2025-10-02 12:30:54.104066872 +0000 UTC m=+5969.046937816" Oct 02 12:30:54 crc kubenswrapper[4766]: I1002 12:30:54.135136 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 02 12:30:54 crc kubenswrapper[4766]: I1002 12:30:54.141643 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="90cdb0af-8900-42a6-9218-b7c57ed42f86" containerName="cinder-scheduler" containerID="cri-o://7b06e89363b0ceff471c9fec0d27001e6e96cd676c14bc69920ae96050055113" gracePeriod=30 Oct 02 12:30:54 crc kubenswrapper[4766]: I1002 12:30:54.141747 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="90cdb0af-8900-42a6-9218-b7c57ed42f86" containerName="probe" containerID="cri-o://b9b757e1765c7d044b40fe63f77fccda8eba03de1b719cff8d6f81bb32fb2952" gracePeriod=30 Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.157603 4766 generic.go:334] "Generic (PLEG): container finished" podID="90cdb0af-8900-42a6-9218-b7c57ed42f86" containerID="b9b757e1765c7d044b40fe63f77fccda8eba03de1b719cff8d6f81bb32fb2952" exitCode=0 Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.158262 4766 generic.go:334] "Generic (PLEG): container finished" podID="90cdb0af-8900-42a6-9218-b7c57ed42f86" containerID="7b06e89363b0ceff471c9fec0d27001e6e96cd676c14bc69920ae96050055113" exitCode=0 Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.157703 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90cdb0af-8900-42a6-9218-b7c57ed42f86","Type":"ContainerDied","Data":"b9b757e1765c7d044b40fe63f77fccda8eba03de1b719cff8d6f81bb32fb2952"} Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.158333 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90cdb0af-8900-42a6-9218-b7c57ed42f86","Type":"ContainerDied","Data":"7b06e89363b0ceff471c9fec0d27001e6e96cd676c14bc69920ae96050055113"} Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.449894 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.543088 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-config-data-custom\") pod \"90cdb0af-8900-42a6-9218-b7c57ed42f86\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") "
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.543151 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-config-data\") pod \"90cdb0af-8900-42a6-9218-b7c57ed42f86\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") "
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.543185 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-scripts\") pod \"90cdb0af-8900-42a6-9218-b7c57ed42f86\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") "
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.543214 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d578v\" (UniqueName: \"kubernetes.io/projected/90cdb0af-8900-42a6-9218-b7c57ed42f86-kube-api-access-d578v\") pod \"90cdb0af-8900-42a6-9218-b7c57ed42f86\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") "
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.550225 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90cdb0af-8900-42a6-9218-b7c57ed42f86-kube-api-access-d578v" (OuterVolumeSpecName: "kube-api-access-d578v") pod "90cdb0af-8900-42a6-9218-b7c57ed42f86" (UID: "90cdb0af-8900-42a6-9218-b7c57ed42f86"). InnerVolumeSpecName "kube-api-access-d578v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.550694 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "90cdb0af-8900-42a6-9218-b7c57ed42f86" (UID: "90cdb0af-8900-42a6-9218-b7c57ed42f86"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.562879 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-scripts" (OuterVolumeSpecName: "scripts") pod "90cdb0af-8900-42a6-9218-b7c57ed42f86" (UID: "90cdb0af-8900-42a6-9218-b7c57ed42f86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.644807 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-combined-ca-bundle\") pod \"90cdb0af-8900-42a6-9218-b7c57ed42f86\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") "
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.644896 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90cdb0af-8900-42a6-9218-b7c57ed42f86-etc-machine-id\") pod \"90cdb0af-8900-42a6-9218-b7c57ed42f86\" (UID: \"90cdb0af-8900-42a6-9218-b7c57ed42f86\") "
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.645371 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.645390 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d578v\" (UniqueName: \"kubernetes.io/projected/90cdb0af-8900-42a6-9218-b7c57ed42f86-kube-api-access-d578v\") on node \"crc\" DevicePath \"\""
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.645400 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-config-data-custom\") on node \"crc\" DevicePath \"\""
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.645500 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90cdb0af-8900-42a6-9218-b7c57ed42f86-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "90cdb0af-8900-42a6-9218-b7c57ed42f86" (UID: "90cdb0af-8900-42a6-9218-b7c57ed42f86"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.669266 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-config-data" (OuterVolumeSpecName: "config-data") pod "90cdb0af-8900-42a6-9218-b7c57ed42f86" (UID: "90cdb0af-8900-42a6-9218-b7c57ed42f86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.713961 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90cdb0af-8900-42a6-9218-b7c57ed42f86" (UID: "90cdb0af-8900-42a6-9218-b7c57ed42f86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.747454 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.747494 4766 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90cdb0af-8900-42a6-9218-b7c57ed42f86-etc-machine-id\") on node \"crc\" DevicePath \"\""
Oct 02 12:30:55 crc kubenswrapper[4766]: I1002 12:30:55.747525 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90cdb0af-8900-42a6-9218-b7c57ed42f86-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 12:30:56 crc kubenswrapper[4766]: E1002 12:30:56.024085 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90cdb0af_8900_42a6_9218_b7c57ed42f86.slice\": RecentStats: unable to find data in memory cache]"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.180406 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90cdb0af-8900-42a6-9218-b7c57ed42f86","Type":"ContainerDied","Data":"450d0984e0310d90c0ac7ff12f45b3141ea06824857618ed83aea04a3ec493c4"}
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.181106 4766 scope.go:117] "RemoveContainer" containerID="b9b757e1765c7d044b40fe63f77fccda8eba03de1b719cff8d6f81bb32fb2952"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.181411 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.220215 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.235930 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.237717 4766 scope.go:117] "RemoveContainer" containerID="7b06e89363b0ceff471c9fec0d27001e6e96cd676c14bc69920ae96050055113"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.280537 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 12:30:56 crc kubenswrapper[4766]: E1002 12:30:56.281494 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90cdb0af-8900-42a6-9218-b7c57ed42f86" containerName="cinder-scheduler"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.281535 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cdb0af-8900-42a6-9218-b7c57ed42f86" containerName="cinder-scheduler"
Oct 02 12:30:56 crc kubenswrapper[4766]: E1002 12:30:56.281577 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90cdb0af-8900-42a6-9218-b7c57ed42f86" containerName="probe"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.281588 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="90cdb0af-8900-42a6-9218-b7c57ed42f86" containerName="probe"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.282126 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="90cdb0af-8900-42a6-9218-b7c57ed42f86" containerName="cinder-scheduler"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.282151 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="90cdb0af-8900-42a6-9218-b7c57ed42f86" containerName="probe"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.314519 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.314735 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.320737 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.507111 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd39d44b-ae8b-45fe-b571-6825a7febb30-scripts\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.507247 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnk76\" (UniqueName: \"kubernetes.io/projected/fd39d44b-ae8b-45fe-b571-6825a7febb30-kube-api-access-gnk76\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.507393 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd39d44b-ae8b-45fe-b571-6825a7febb30-config-data\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.507440 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd39d44b-ae8b-45fe-b571-6825a7febb30-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.507558 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd39d44b-ae8b-45fe-b571-6825a7febb30-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.507661 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd39d44b-ae8b-45fe-b571-6825a7febb30-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.524118 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.609973 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd39d44b-ae8b-45fe-b571-6825a7febb30-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.610092 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd39d44b-ae8b-45fe-b571-6825a7febb30-scripts\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.610138 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnk76\" (UniqueName: \"kubernetes.io/projected/fd39d44b-ae8b-45fe-b571-6825a7febb30-kube-api-access-gnk76\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.610238 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd39d44b-ae8b-45fe-b571-6825a7febb30-config-data\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.610278 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd39d44b-ae8b-45fe-b571-6825a7febb30-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.610326 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd39d44b-ae8b-45fe-b571-6825a7febb30-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.610420 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd39d44b-ae8b-45fe-b571-6825a7febb30-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.617289 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd39d44b-ae8b-45fe-b571-6825a7febb30-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.617314 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd39d44b-ae8b-45fe-b571-6825a7febb30-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.618386 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd39d44b-ae8b-45fe-b571-6825a7febb30-config-data\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.618654 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd39d44b-ae8b-45fe-b571-6825a7febb30-scripts\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.634892 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnk76\" (UniqueName: \"kubernetes.io/projected/fd39d44b-ae8b-45fe-b571-6825a7febb30-kube-api-access-gnk76\") pod \"cinder-scheduler-0\" (UID: \"fd39d44b-ae8b-45fe-b571-6825a7febb30\") " pod="openstack/cinder-scheduler-0"
Oct 02 12:30:56 crc kubenswrapper[4766]: I1002 12:30:56.640916 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Oct 02 12:30:57 crc kubenswrapper[4766]: I1002 12:30:57.019707 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0"
Oct 02 12:30:57 crc kubenswrapper[4766]: I1002 12:30:57.150776 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Oct 02 12:30:57 crc kubenswrapper[4766]: I1002 12:30:57.198674 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fd39d44b-ae8b-45fe-b571-6825a7febb30","Type":"ContainerStarted","Data":"d081b794cc9b15a82c1317151e55f109e75d60ad9ccb776febdeca20c5d11775"}
Oct 02 12:30:57 crc kubenswrapper[4766]: I1002 12:30:57.898016 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90cdb0af-8900-42a6-9218-b7c57ed42f86" path="/var/lib/kubelet/pods/90cdb0af-8900-42a6-9218-b7c57ed42f86/volumes"
Oct 02 12:30:58 crc kubenswrapper[4766]: I1002 12:30:58.213852 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fd39d44b-ae8b-45fe-b571-6825a7febb30","Type":"ContainerStarted","Data":"32293b379ac395f35c7321ecaf04b3126e0719fa3ce92076d93afac6219ccfdf"}
Oct 02 12:30:58 crc kubenswrapper[4766]: I1002 12:30:58.882431 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984"
Oct 02 12:30:59 crc kubenswrapper[4766]: I1002 12:30:59.230073 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fd39d44b-ae8b-45fe-b571-6825a7febb30","Type":"ContainerStarted","Data":"1e3942c9805f61905ecf0fb58bd08b10ade27840d5c80fd96dbecbbd178320f2"}
Oct 02 12:30:59 crc kubenswrapper[4766]: I1002 12:30:59.232998 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"d94a111146843f4af084462f95b5d87bd572303698d956b1f0dbe284471be49c"}
Oct 02 12:30:59 crc kubenswrapper[4766]: I1002 12:30:59.257448 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.2574229470000002 podStartE2EDuration="3.257422947s" podCreationTimestamp="2025-10-02 12:30:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:30:59.248802581 +0000 UTC m=+5974.191673525" watchObservedRunningTime="2025-10-02 12:30:59.257422947 +0000 UTC m=+5974.200293891"
Oct 02 12:31:01 crc kubenswrapper[4766]: I1002 12:31:01.641801 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Oct 02 12:31:01 crc kubenswrapper[4766]: I1002 12:31:01.913218 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Oct 02 12:31:06 crc kubenswrapper[4766]: I1002 12:31:06.908520 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Oct 02 12:31:17 crc kubenswrapper[4766]: I1002 12:31:17.099743 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-rsxk9"]
Oct 02 12:31:17 crc kubenswrapper[4766]: I1002 12:31:17.116007 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-rsxk9"]
REMOVE" source="api" pods=["openstack/keystone-db-create-rsxk9"] Oct 02 12:31:17 crc kubenswrapper[4766]: I1002 12:31:17.896652 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b184e35-c4de-43ab-afe1-439ce0de43ab" path="/var/lib/kubelet/pods/7b184e35-c4de-43ab-afe1-439ce0de43ab/volumes" Oct 02 12:31:27 crc kubenswrapper[4766]: I1002 12:31:27.030224 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-a884-account-create-gvlx4"] Oct 02 12:31:27 crc kubenswrapper[4766]: I1002 12:31:27.039727 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-a884-account-create-gvlx4"] Oct 02 12:31:27 crc kubenswrapper[4766]: I1002 12:31:27.898827 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a8ac68a-691d-4b52-9f2b-7343fe201c62" path="/var/lib/kubelet/pods/8a8ac68a-691d-4b52-9f2b-7343fe201c62/volumes" Oct 02 12:31:34 crc kubenswrapper[4766]: I1002 12:31:34.040392 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-th9pl"] Oct 02 12:31:34 crc kubenswrapper[4766]: I1002 12:31:34.050484 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-th9pl"] Oct 02 12:31:35 crc kubenswrapper[4766]: I1002 12:31:35.896417 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85a70b5d-e9b9-4ff9-a10c-33f95a054491" path="/var/lib/kubelet/pods/85a70b5d-e9b9-4ff9-a10c-33f95a054491/volumes" Oct 02 12:31:36 crc kubenswrapper[4766]: I1002 12:31:36.390855 4766 scope.go:117] "RemoveContainer" containerID="f152fe8f4dd1181b805f35d64d5d7a68433133d0624fa6882386389e9b4e4373" Oct 02 12:31:36 crc kubenswrapper[4766]: I1002 12:31:36.418636 4766 scope.go:117] "RemoveContainer" containerID="85f9c4808df8eb59856f69020186fb2783f9e1d86173430f018310f898750ac7" Oct 02 12:31:36 crc kubenswrapper[4766]: I1002 12:31:36.496496 4766 scope.go:117] "RemoveContainer" containerID="711e5bfa63fa16d5c228a12b4a0e9ca42cf3d441d86f57ea0f5694c80bda342e" Oct 02 12:31:48 crc kubenswrapper[4766]: I1002 12:31:48.049391 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-qsk9z"] Oct 02 12:31:48 crc kubenswrapper[4766]: I1002 12:31:48.060202 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-qsk9z"] Oct 02 12:31:49 crc kubenswrapper[4766]: I1002 12:31:49.894679 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ea34ce-4dc9-4338-b967-8b43085a24e5" path="/var/lib/kubelet/pods/20ea34ce-4dc9-4338-b967-8b43085a24e5/volumes" Oct 02 12:32:36 crc kubenswrapper[4766]: I1002 12:32:36.725788 4766 scope.go:117] "RemoveContainer" containerID="459a4ccb91c089449067908175650f9513330b3c6c26139d096e120e22de3a34" Oct 02 12:32:36 crc kubenswrapper[4766]: I1002 12:32:36.760898 4766 scope.go:117] "RemoveContainer" containerID="5ebea97cc994cee1df6417d576a82a2eeaba50ae9b30c0b3f253285fdfbb66a4" Oct 02 12:32:36 crc kubenswrapper[4766]: I1002 12:32:36.786492 4766 scope.go:117] "RemoveContainer" containerID="361cc80e8fc351e0092760f1e57526b19f060f043dc42c4124362a8155e2c2de" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.052730 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-q6w2f"] Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.055464 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.062620 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.062758 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-6ht8z" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.072292 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q6w2f"] Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.119882 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-8px54"] Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.123522 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.131395 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8px54"] Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.203732 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/059b4130-8ca1-4df7-87ca-762fbcf3048e-scripts\") pod \"ovn-controller-q6w2f\" (UID: \"059b4130-8ca1-4df7-87ca-762fbcf3048e\") " pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.203808 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m9zt\" (UniqueName: \"kubernetes.io/projected/059b4130-8ca1-4df7-87ca-762fbcf3048e-kube-api-access-7m9zt\") pod \"ovn-controller-q6w2f\" (UID: \"059b4130-8ca1-4df7-87ca-762fbcf3048e\") " pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.203866 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/059b4130-8ca1-4df7-87ca-762fbcf3048e-var-run\") pod \"ovn-controller-q6w2f\" (UID: \"059b4130-8ca1-4df7-87ca-762fbcf3048e\") " pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.203956 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/059b4130-8ca1-4df7-87ca-762fbcf3048e-var-log-ovn\") pod \"ovn-controller-q6w2f\" (UID: \"059b4130-8ca1-4df7-87ca-762fbcf3048e\") " pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.204006 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b8e19d03-175e-4bdf-8675-cab23ab974ea-etc-ovs\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.204268 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8e19d03-175e-4bdf-8675-cab23ab974ea-scripts\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.204383 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib\" (UniqueName: \"kubernetes.io/host-path/b8e19d03-175e-4bdf-8675-cab23ab974ea-var-lib\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.204781 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/059b4130-8ca1-4df7-87ca-762fbcf3048e-var-run-ovn\") pod \"ovn-controller-q6w2f\" (UID: \"059b4130-8ca1-4df7-87ca-762fbcf3048e\") " pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.205031 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf9cb\" (UniqueName: \"kubernetes.io/projected/b8e19d03-175e-4bdf-8675-cab23ab974ea-kube-api-access-nf9cb\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.205074 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b8e19d03-175e-4bdf-8675-cab23ab974ea-var-run\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.205095 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b8e19d03-175e-4bdf-8675-cab23ab974ea-var-log\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.306952 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf9cb\" (UniqueName: \"kubernetes.io/projected/b8e19d03-175e-4bdf-8675-cab23ab974ea-kube-api-access-nf9cb\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.307013 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b8e19d03-175e-4bdf-8675-cab23ab974ea-var-run\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.307042 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b8e19d03-175e-4bdf-8675-cab23ab974ea-var-log\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.307096 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/059b4130-8ca1-4df7-87ca-762fbcf3048e-scripts\") pod \"ovn-controller-q6w2f\" (UID: \"059b4130-8ca1-4df7-87ca-762fbcf3048e\") " pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.307455 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m9zt\" (UniqueName: \"kubernetes.io/projected/059b4130-8ca1-4df7-87ca-762fbcf3048e-kube-api-access-7m9zt\") 
pod \"ovn-controller-q6w2f\" (UID: \"059b4130-8ca1-4df7-87ca-762fbcf3048e\") " pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.307605 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/059b4130-8ca1-4df7-87ca-762fbcf3048e-var-run\") pod \"ovn-controller-q6w2f\" (UID: \"059b4130-8ca1-4df7-87ca-762fbcf3048e\") " pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.307766 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/059b4130-8ca1-4df7-87ca-762fbcf3048e-var-log-ovn\") pod \"ovn-controller-q6w2f\" (UID: \"059b4130-8ca1-4df7-87ca-762fbcf3048e\") " pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.307890 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b8e19d03-175e-4bdf-8675-cab23ab974ea-etc-ovs\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.308028 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8e19d03-175e-4bdf-8675-cab23ab974ea-scripts\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.307616 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b8e19d03-175e-4bdf-8675-cab23ab974ea-var-run\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.307822 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/059b4130-8ca1-4df7-87ca-762fbcf3048e-var-log-ovn\") pod \"ovn-controller-q6w2f\" (UID: \"059b4130-8ca1-4df7-87ca-762fbcf3048e\") " pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.307497 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b8e19d03-175e-4bdf-8675-cab23ab974ea-var-log\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.307966 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b8e19d03-175e-4bdf-8675-cab23ab974ea-etc-ovs\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.307684 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/059b4130-8ca1-4df7-87ca-762fbcf3048e-var-run\") pod \"ovn-controller-q6w2f\" (UID: \"059b4130-8ca1-4df7-87ca-762fbcf3048e\") " pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.308155 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/b8e19d03-175e-4bdf-8675-cab23ab974ea-var-lib\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.308527 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/059b4130-8ca1-4df7-87ca-762fbcf3048e-var-run-ovn\") pod \"ovn-controller-q6w2f\" (UID: \"059b4130-8ca1-4df7-87ca-762fbcf3048e\") " pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.308641 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b8e19d03-175e-4bdf-8675-cab23ab974ea-var-lib\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.308710 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/059b4130-8ca1-4df7-87ca-762fbcf3048e-var-run-ovn\") pod \"ovn-controller-q6w2f\" (UID: \"059b4130-8ca1-4df7-87ca-762fbcf3048e\") " pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.309658 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/059b4130-8ca1-4df7-87ca-762fbcf3048e-scripts\") pod \"ovn-controller-q6w2f\" (UID: \"059b4130-8ca1-4df7-87ca-762fbcf3048e\") " pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.309831 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8e19d03-175e-4bdf-8675-cab23ab974ea-scripts\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.328578 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf9cb\" (UniqueName: \"kubernetes.io/projected/b8e19d03-175e-4bdf-8675-cab23ab974ea-kube-api-access-nf9cb\") pod \"ovn-controller-ovs-8px54\" (UID: \"b8e19d03-175e-4bdf-8675-cab23ab974ea\") " pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.339115 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m9zt\" (UniqueName: \"kubernetes.io/projected/059b4130-8ca1-4df7-87ca-762fbcf3048e-kube-api-access-7m9zt\") pod \"ovn-controller-q6w2f\" (UID: \"059b4130-8ca1-4df7-87ca-762fbcf3048e\") " pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.383469 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q6w2f" Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.454709 4766 util.go:30] "No sandbox for pod can be found. 
Oct 02 12:32:52 crc kubenswrapper[4766]: I1002 12:32:52.974690 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q6w2f"]
Oct 02 12:32:53 crc kubenswrapper[4766]: I1002 12:32:53.403187 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-8px54"]
Oct 02 12:32:53 crc kubenswrapper[4766]: W1002 12:32:53.413961 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8e19d03_175e_4bdf_8675_cab23ab974ea.slice/crio-63e5fa7b34327c1c8b937b881ebcd4f0f91bc787c263f1dbd939e9aebc8366f9 WatchSource:0}: Error finding container 63e5fa7b34327c1c8b937b881ebcd4f0f91bc787c263f1dbd939e9aebc8366f9: Status 404 returned error can't find the container with id 63e5fa7b34327c1c8b937b881ebcd4f0f91bc787c263f1dbd939e9aebc8366f9
Oct 02 12:32:53 crc kubenswrapper[4766]: I1002 12:32:53.595221 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8px54" event={"ID":"b8e19d03-175e-4bdf-8675-cab23ab974ea","Type":"ContainerStarted","Data":"63e5fa7b34327c1c8b937b881ebcd4f0f91bc787c263f1dbd939e9aebc8366f9"}
Oct 02 12:32:53 crc kubenswrapper[4766]: I1002 12:32:53.600585 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q6w2f" event={"ID":"059b4130-8ca1-4df7-87ca-762fbcf3048e","Type":"ContainerStarted","Data":"cf184f559c296c0a7ff0b08ef9329d39ce4965396ec216a1d280246784d9f3ce"}
Oct 02 12:32:53 crc kubenswrapper[4766]: I1002 12:32:53.600620 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q6w2f" event={"ID":"059b4130-8ca1-4df7-87ca-762fbcf3048e","Type":"ContainerStarted","Data":"bb2531eecb967d2180cb84aa20f964eb39ee6fe4e5e92f2f912a79e5c4b7730e"}
Oct 02 12:32:53 crc kubenswrapper[4766]: I1002 12:32:53.600762 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-q6w2f"
Oct 02 12:32:53 crc kubenswrapper[4766]: I1002 12:32:53.624484 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-q6w2f" podStartSLOduration=1.6244573020000002 podStartE2EDuration="1.624457302s" podCreationTimestamp="2025-10-02 12:32:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:32:53.617763219 +0000 UTC m=+6088.560634163" watchObservedRunningTime="2025-10-02 12:32:53.624457302 +0000 UTC m=+6088.567328246"
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.616117 4766 generic.go:334] "Generic (PLEG): container finished" podID="b8e19d03-175e-4bdf-8675-cab23ab974ea" containerID="f17bcb6d93cf5ddccb62301803aad231fec9653004f474401ae90b76016f7039" exitCode=0
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.616248 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8px54" event={"ID":"b8e19d03-175e-4bdf-8675-cab23ab974ea","Type":"ContainerDied","Data":"f17bcb6d93cf5ddccb62301803aad231fec9653004f474401ae90b76016f7039"}
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.794571 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-z5dll"]
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.796251 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-z5dll"
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.798916 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.805872 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-z5dll"]
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.877853 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25979d18-2dce-4710-9661-e1272a2935ea-config\") pod \"ovn-controller-metrics-z5dll\" (UID: \"25979d18-2dce-4710-9661-e1272a2935ea\") " pod="openstack/ovn-controller-metrics-z5dll"
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.877987 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/25979d18-2dce-4710-9661-e1272a2935ea-ovn-rundir\") pod \"ovn-controller-metrics-z5dll\" (UID: \"25979d18-2dce-4710-9661-e1272a2935ea\") " pod="openstack/ovn-controller-metrics-z5dll"
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.878349 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnl66\" (UniqueName: \"kubernetes.io/projected/25979d18-2dce-4710-9661-e1272a2935ea-kube-api-access-dnl66\") pod \"ovn-controller-metrics-z5dll\" (UID: \"25979d18-2dce-4710-9661-e1272a2935ea\") " pod="openstack/ovn-controller-metrics-z5dll"
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.878385 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/25979d18-2dce-4710-9661-e1272a2935ea-ovs-rundir\") pod \"ovn-controller-metrics-z5dll\" (UID: \"25979d18-2dce-4710-9661-e1272a2935ea\") " pod="openstack/ovn-controller-metrics-z5dll"
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.980392 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnl66\" (UniqueName: \"kubernetes.io/projected/25979d18-2dce-4710-9661-e1272a2935ea-kube-api-access-dnl66\") pod \"ovn-controller-metrics-z5dll\" (UID: \"25979d18-2dce-4710-9661-e1272a2935ea\") " pod="openstack/ovn-controller-metrics-z5dll"
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.980778 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/25979d18-2dce-4710-9661-e1272a2935ea-ovs-rundir\") pod \"ovn-controller-metrics-z5dll\" (UID: \"25979d18-2dce-4710-9661-e1272a2935ea\") " pod="openstack/ovn-controller-metrics-z5dll"
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.980845 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25979d18-2dce-4710-9661-e1272a2935ea-config\") pod \"ovn-controller-metrics-z5dll\" (UID: \"25979d18-2dce-4710-9661-e1272a2935ea\") " pod="openstack/ovn-controller-metrics-z5dll"
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.980926 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/25979d18-2dce-4710-9661-e1272a2935ea-ovn-rundir\") pod \"ovn-controller-metrics-z5dll\" (UID: \"25979d18-2dce-4710-9661-e1272a2935ea\") " pod="openstack/ovn-controller-metrics-z5dll"
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.981154 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/25979d18-2dce-4710-9661-e1272a2935ea-ovn-rundir\") pod \"ovn-controller-metrics-z5dll\" (UID: \"25979d18-2dce-4710-9661-e1272a2935ea\") " pod="openstack/ovn-controller-metrics-z5dll"
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.981177 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/25979d18-2dce-4710-9661-e1272a2935ea-ovs-rundir\") pod \"ovn-controller-metrics-z5dll\" (UID: \"25979d18-2dce-4710-9661-e1272a2935ea\") " pod="openstack/ovn-controller-metrics-z5dll"
Oct 02 12:32:54 crc kubenswrapper[4766]: I1002 12:32:54.981820 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25979d18-2dce-4710-9661-e1272a2935ea-config\") pod \"ovn-controller-metrics-z5dll\" (UID: \"25979d18-2dce-4710-9661-e1272a2935ea\") " pod="openstack/ovn-controller-metrics-z5dll"
Oct 02 12:32:55 crc kubenswrapper[4766]: I1002 12:32:55.010181 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnl66\" (UniqueName: \"kubernetes.io/projected/25979d18-2dce-4710-9661-e1272a2935ea-kube-api-access-dnl66\") pod \"ovn-controller-metrics-z5dll\" (UID: \"25979d18-2dce-4710-9661-e1272a2935ea\") " pod="openstack/ovn-controller-metrics-z5dll"
Oct 02 12:32:55 crc kubenswrapper[4766]: I1002 12:32:55.139175 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-z5dll"
Oct 02 12:32:55 crc kubenswrapper[4766]: I1002 12:32:55.506586 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-rh557"]
Oct 02 12:32:55 crc kubenswrapper[4766]: I1002 12:32:55.508372 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-rh557"
Oct 02 12:32:55 crc kubenswrapper[4766]: I1002 12:32:55.519045 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-rh557"]
Oct 02 12:32:55 crc kubenswrapper[4766]: I1002 12:32:55.596335 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bctll\" (UniqueName: \"kubernetes.io/projected/37995d46-e73a-436f-8eae-da2c72de6a66-kube-api-access-bctll\") pod \"octavia-db-create-rh557\" (UID: \"37995d46-e73a-436f-8eae-da2c72de6a66\") " pod="openstack/octavia-db-create-rh557"
Oct 02 12:32:55 crc kubenswrapper[4766]: I1002 12:32:55.636347 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8px54" event={"ID":"b8e19d03-175e-4bdf-8675-cab23ab974ea","Type":"ContainerStarted","Data":"a0750a2edbeb7f4006098d98b13c2f859051f04c77c6bf6725e7e75e7651224b"}
Oct 02 12:32:55 crc kubenswrapper[4766]: I1002 12:32:55.636417 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-8px54" event={"ID":"b8e19d03-175e-4bdf-8675-cab23ab974ea","Type":"ContainerStarted","Data":"b404f223d1f402662081693f3e099be3d495d926d4eff43b1a84bdfae2c9a255"}
Oct 02 12:32:55 crc kubenswrapper[4766]: I1002 12:32:55.636669 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8px54"
Oct 02 12:32:55 crc kubenswrapper[4766]: I1002 12:32:55.637176 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-8px54"
Oct 02 12:32:55 crc kubenswrapper[4766]: I1002 12:32:55.659230 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-z5dll"]
Oct 02 12:32:55 crc kubenswrapper[4766]: I1002 12:32:55.678749 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-8px54" podStartSLOduration=3.678721524 podStartE2EDuration="3.678721524s" podCreationTimestamp="2025-10-02 12:32:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:32:55.666018448 +0000 UTC m=+6090.608889412" watchObservedRunningTime="2025-10-02 12:32:55.678721524 +0000 UTC m=+6090.621592468"
Oct 02 12:32:55 crc kubenswrapper[4766]: I1002 12:32:55.699759 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bctll\" (UniqueName: \"kubernetes.io/projected/37995d46-e73a-436f-8eae-da2c72de6a66-kube-api-access-bctll\") pod \"octavia-db-create-rh557\" (UID: \"37995d46-e73a-436f-8eae-da2c72de6a66\") " pod="openstack/octavia-db-create-rh557"
Oct 02 12:32:55 crc kubenswrapper[4766]: I1002 12:32:55.724259 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bctll\" (UniqueName: \"kubernetes.io/projected/37995d46-e73a-436f-8eae-da2c72de6a66-kube-api-access-bctll\") pod \"octavia-db-create-rh557\" (UID: \"37995d46-e73a-436f-8eae-da2c72de6a66\") " pod="openstack/octavia-db-create-rh557"
Oct 02 12:32:55 crc kubenswrapper[4766]: I1002 12:32:55.831812 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-rh557"
Oct 02 12:32:56 crc kubenswrapper[4766]: I1002 12:32:56.367012 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-rh557"]
Oct 02 12:32:56 crc kubenswrapper[4766]: W1002 12:32:56.373880 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37995d46_e73a_436f_8eae_da2c72de6a66.slice/crio-80c5cba1369a4a36ed7db5166201cb40598984f52017933d65e16e337d31323f WatchSource:0}: Error finding container 80c5cba1369a4a36ed7db5166201cb40598984f52017933d65e16e337d31323f: Status 404 returned error can't find the container with id 80c5cba1369a4a36ed7db5166201cb40598984f52017933d65e16e337d31323f
Oct 02 12:32:56 crc kubenswrapper[4766]: I1002 12:32:56.648550 4766 generic.go:334] "Generic (PLEG): container finished" podID="37995d46-e73a-436f-8eae-da2c72de6a66" containerID="20a7a021c4471352b2cfdb347a2a41a4cfe27de6b6bfe57c9a1e5ef4a32d2ce7" exitCode=0
Oct 02 12:32:56 crc kubenswrapper[4766]: I1002 12:32:56.649191 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-rh557" event={"ID":"37995d46-e73a-436f-8eae-da2c72de6a66","Type":"ContainerDied","Data":"20a7a021c4471352b2cfdb347a2a41a4cfe27de6b6bfe57c9a1e5ef4a32d2ce7"}
Oct 02 12:32:56 crc kubenswrapper[4766]: I1002 12:32:56.649232 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-rh557" event={"ID":"37995d46-e73a-436f-8eae-da2c72de6a66","Type":"ContainerStarted","Data":"80c5cba1369a4a36ed7db5166201cb40598984f52017933d65e16e337d31323f"}
Oct 02 12:32:56 crc kubenswrapper[4766]: I1002 12:32:56.652779 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-z5dll" event={"ID":"25979d18-2dce-4710-9661-e1272a2935ea","Type":"ContainerStarted","Data":"aefdcdd72ffa5f26d93eb4169fe8d39acbde19e0b7aa6628b0df4fc01bbdfd3c"}
Oct 02 12:32:56 crc kubenswrapper[4766]: I1002 12:32:56.652821 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-z5dll" event={"ID":"25979d18-2dce-4710-9661-e1272a2935ea","Type":"ContainerStarted","Data":"0c55db124ea8f5ff9fd6a807565d4482108a373b30f5d43560cfed919db0530d"}
Oct 02 12:32:56 crc kubenswrapper[4766]: I1002 12:32:56.689777 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-z5dll" podStartSLOduration=2.689742279 podStartE2EDuration="2.689742279s" podCreationTimestamp="2025-10-02 12:32:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:32:56.686176225 +0000 UTC m=+6091.629047189" watchObservedRunningTime="2025-10-02 12:32:56.689742279 +0000 UTC m=+6091.632613223"
Oct 02 12:32:58 crc kubenswrapper[4766]: I1002 12:32:58.072839 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-rh557"
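Interleaved with the OVN and Octavia pod events above, pod_startup_latency_tracker records podStartSLOduration figures for ovn-controller-q6w2f (1.62s), ovn-controller-ovs-8px54 (3.68s), and ovn-controller-metrics-z5dll (2.69s). A small extractor for those observations, again assuming the line format shown in this journal (the helper name is mine, not kubelet's):

```python
import re
import sys

# Picks the pod name and SLO duration out of entries like:
#   ... "Observed pod startup duration" pod="openstack/ovn-controller-q6w2f" podStartSLOduration=1.62... ...
STARTUP = re.compile(
    r'"Observed pod startup duration" pod="(?P<pod>[^"]+)"'
    r'.*?podStartSLOduration=(?P<slo>[\d.]+)')

def startup_durations(lines):
    """Yield (pod, seconds) for every startup-latency observation in the stream."""
    for line in lines:
        m = STARTUP.search(line)
        if m:
            yield m.group('pod'), float(m.group('slo'))

if __name__ == '__main__':
    # e.g. journalctl -u kubelet | python startup_durations.py
    for pod, secs in startup_durations(sys.stdin):
        print(f'{secs:8.3f}s  {pod}')
```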
Need to start a new one" pod="openstack/octavia-db-create-rh557" Oct 02 12:32:58 crc kubenswrapper[4766]: I1002 12:32:58.267640 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bctll\" (UniqueName: \"kubernetes.io/projected/37995d46-e73a-436f-8eae-da2c72de6a66-kube-api-access-bctll\") pod \"37995d46-e73a-436f-8eae-da2c72de6a66\" (UID: \"37995d46-e73a-436f-8eae-da2c72de6a66\") " Oct 02 12:32:58 crc kubenswrapper[4766]: I1002 12:32:58.273910 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37995d46-e73a-436f-8eae-da2c72de6a66-kube-api-access-bctll" (OuterVolumeSpecName: "kube-api-access-bctll") pod "37995d46-e73a-436f-8eae-da2c72de6a66" (UID: "37995d46-e73a-436f-8eae-da2c72de6a66"). InnerVolumeSpecName "kube-api-access-bctll". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:32:58 crc kubenswrapper[4766]: I1002 12:32:58.371577 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bctll\" (UniqueName: \"kubernetes.io/projected/37995d46-e73a-436f-8eae-da2c72de6a66-kube-api-access-bctll\") on node \"crc\" DevicePath \"\"" Oct 02 12:32:58 crc kubenswrapper[4766]: I1002 12:32:58.675258 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-rh557" event={"ID":"37995d46-e73a-436f-8eae-da2c72de6a66","Type":"ContainerDied","Data":"80c5cba1369a4a36ed7db5166201cb40598984f52017933d65e16e337d31323f"} Oct 02 12:32:58 crc kubenswrapper[4766]: I1002 12:32:58.675311 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80c5cba1369a4a36ed7db5166201cb40598984f52017933d65e16e337d31323f" Oct 02 12:32:58 crc kubenswrapper[4766]: I1002 12:32:58.675386 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-rh557" Oct 02 12:33:07 crc kubenswrapper[4766]: I1002 12:33:07.551652 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-97ae-account-create-4bbqr"] Oct 02 12:33:07 crc kubenswrapper[4766]: E1002 12:33:07.552838 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37995d46-e73a-436f-8eae-da2c72de6a66" containerName="mariadb-database-create" Oct 02 12:33:07 crc kubenswrapper[4766]: I1002 12:33:07.552858 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="37995d46-e73a-436f-8eae-da2c72de6a66" containerName="mariadb-database-create" Oct 02 12:33:07 crc kubenswrapper[4766]: I1002 12:33:07.553176 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="37995d46-e73a-436f-8eae-da2c72de6a66" containerName="mariadb-database-create" Oct 02 12:33:07 crc kubenswrapper[4766]: I1002 12:33:07.554000 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-97ae-account-create-4bbqr" Oct 02 12:33:07 crc kubenswrapper[4766]: I1002 12:33:07.557840 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Oct 02 12:33:07 crc kubenswrapper[4766]: I1002 12:33:07.565551 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-97ae-account-create-4bbqr"] Oct 02 12:33:07 crc kubenswrapper[4766]: I1002 12:33:07.590540 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqxp5\" (UniqueName: \"kubernetes.io/projected/8ac43509-36b7-4ec3-b5f4-814dfe7990c3-kube-api-access-rqxp5\") pod \"octavia-97ae-account-create-4bbqr\" (UID: \"8ac43509-36b7-4ec3-b5f4-814dfe7990c3\") " pod="openstack/octavia-97ae-account-create-4bbqr" Oct 02 12:33:07 crc kubenswrapper[4766]: I1002 12:33:07.692763 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqxp5\" (UniqueName: \"kubernetes.io/projected/8ac43509-36b7-4ec3-b5f4-814dfe7990c3-kube-api-access-rqxp5\") pod \"octavia-97ae-account-create-4bbqr\" (UID: \"8ac43509-36b7-4ec3-b5f4-814dfe7990c3\") " pod="openstack/octavia-97ae-account-create-4bbqr" Oct 02 12:33:07 crc kubenswrapper[4766]: I1002 12:33:07.722603 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqxp5\" (UniqueName: \"kubernetes.io/projected/8ac43509-36b7-4ec3-b5f4-814dfe7990c3-kube-api-access-rqxp5\") pod \"octavia-97ae-account-create-4bbqr\" (UID: \"8ac43509-36b7-4ec3-b5f4-814dfe7990c3\") " pod="openstack/octavia-97ae-account-create-4bbqr" Oct 02 12:33:07 crc kubenswrapper[4766]: I1002 12:33:07.914043 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-97ae-account-create-4bbqr" Oct 02 12:33:08 crc kubenswrapper[4766]: I1002 12:33:08.428482 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-97ae-account-create-4bbqr"] Oct 02 12:33:08 crc kubenswrapper[4766]: W1002 12:33:08.442867 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ac43509_36b7_4ec3_b5f4_814dfe7990c3.slice/crio-b69b6bcf007baa737ddcb1ff2766425ebadaa929af144a3b7704fb8469cb1f8d WatchSource:0}: Error finding container b69b6bcf007baa737ddcb1ff2766425ebadaa929af144a3b7704fb8469cb1f8d: Status 404 returned error can't find the container with id b69b6bcf007baa737ddcb1ff2766425ebadaa929af144a3b7704fb8469cb1f8d Oct 02 12:33:08 crc kubenswrapper[4766]: I1002 12:33:08.795686 4766 generic.go:334] "Generic (PLEG): container finished" podID="8ac43509-36b7-4ec3-b5f4-814dfe7990c3" containerID="04bf335cb0a12d07614f8fc9c5970b17ac1c372dfcbaf5429b39e15443313c21" exitCode=0 Oct 02 12:33:08 crc kubenswrapper[4766]: I1002 12:33:08.795934 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-97ae-account-create-4bbqr" event={"ID":"8ac43509-36b7-4ec3-b5f4-814dfe7990c3","Type":"ContainerDied","Data":"04bf335cb0a12d07614f8fc9c5970b17ac1c372dfcbaf5429b39e15443313c21"} Oct 02 12:33:08 crc kubenswrapper[4766]: I1002 12:33:08.796069 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-97ae-account-create-4bbqr" event={"ID":"8ac43509-36b7-4ec3-b5f4-814dfe7990c3","Type":"ContainerStarted","Data":"b69b6bcf007baa737ddcb1ff2766425ebadaa929af144a3b7704fb8469cb1f8d"} Oct 02 12:33:10 crc kubenswrapper[4766]: I1002 12:33:10.196155 4766 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/octavia-97ae-account-create-4bbqr" Oct 02 12:33:10 crc kubenswrapper[4766]: I1002 12:33:10.363236 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqxp5\" (UniqueName: \"kubernetes.io/projected/8ac43509-36b7-4ec3-b5f4-814dfe7990c3-kube-api-access-rqxp5\") pod \"8ac43509-36b7-4ec3-b5f4-814dfe7990c3\" (UID: \"8ac43509-36b7-4ec3-b5f4-814dfe7990c3\") " Oct 02 12:33:10 crc kubenswrapper[4766]: I1002 12:33:10.369692 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ac43509-36b7-4ec3-b5f4-814dfe7990c3-kube-api-access-rqxp5" (OuterVolumeSpecName: "kube-api-access-rqxp5") pod "8ac43509-36b7-4ec3-b5f4-814dfe7990c3" (UID: "8ac43509-36b7-4ec3-b5f4-814dfe7990c3"). InnerVolumeSpecName "kube-api-access-rqxp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:33:10 crc kubenswrapper[4766]: I1002 12:33:10.467177 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqxp5\" (UniqueName: \"kubernetes.io/projected/8ac43509-36b7-4ec3-b5f4-814dfe7990c3-kube-api-access-rqxp5\") on node \"crc\" DevicePath \"\"" Oct 02 12:33:10 crc kubenswrapper[4766]: I1002 12:33:10.815522 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-97ae-account-create-4bbqr" event={"ID":"8ac43509-36b7-4ec3-b5f4-814dfe7990c3","Type":"ContainerDied","Data":"b69b6bcf007baa737ddcb1ff2766425ebadaa929af144a3b7704fb8469cb1f8d"} Oct 02 12:33:10 crc kubenswrapper[4766]: I1002 12:33:10.815576 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b69b6bcf007baa737ddcb1ff2766425ebadaa929af144a3b7704fb8469cb1f8d" Oct 02 12:33:10 crc kubenswrapper[4766]: I1002 12:33:10.815586 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-97ae-account-create-4bbqr" Oct 02 12:33:14 crc kubenswrapper[4766]: I1002 12:33:14.545654 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-4n6rv"] Oct 02 12:33:14 crc kubenswrapper[4766]: E1002 12:33:14.546832 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac43509-36b7-4ec3-b5f4-814dfe7990c3" containerName="mariadb-account-create" Oct 02 12:33:14 crc kubenswrapper[4766]: I1002 12:33:14.546852 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac43509-36b7-4ec3-b5f4-814dfe7990c3" containerName="mariadb-account-create" Oct 02 12:33:14 crc kubenswrapper[4766]: I1002 12:33:14.547117 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ac43509-36b7-4ec3-b5f4-814dfe7990c3" containerName="mariadb-account-create" Oct 02 12:33:14 crc kubenswrapper[4766]: I1002 12:33:14.548286 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-4n6rv" Oct 02 12:33:14 crc kubenswrapper[4766]: I1002 12:33:14.557963 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-4n6rv"] Oct 02 12:33:14 crc kubenswrapper[4766]: I1002 12:33:14.700459 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhhj6\" (UniqueName: \"kubernetes.io/projected/d6e6ee14-5258-4176-8127-49abee572c04-kube-api-access-fhhj6\") pod \"octavia-persistence-db-create-4n6rv\" (UID: \"d6e6ee14-5258-4176-8127-49abee572c04\") " pod="openstack/octavia-persistence-db-create-4n6rv" Oct 02 12:33:14 crc kubenswrapper[4766]: I1002 12:33:14.802714 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhhj6\" (UniqueName: \"kubernetes.io/projected/d6e6ee14-5258-4176-8127-49abee572c04-kube-api-access-fhhj6\") pod \"octavia-persistence-db-create-4n6rv\" (UID: \"d6e6ee14-5258-4176-8127-49abee572c04\") " pod="openstack/octavia-persistence-db-create-4n6rv" Oct 02 12:33:14 crc kubenswrapper[4766]: I1002 12:33:14.839392 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhhj6\" (UniqueName: \"kubernetes.io/projected/d6e6ee14-5258-4176-8127-49abee572c04-kube-api-access-fhhj6\") pod \"octavia-persistence-db-create-4n6rv\" (UID: \"d6e6ee14-5258-4176-8127-49abee572c04\") " pod="openstack/octavia-persistence-db-create-4n6rv" Oct 02 12:33:14 crc kubenswrapper[4766]: I1002 12:33:14.927817 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-4n6rv" Oct 02 12:33:15 crc kubenswrapper[4766]: I1002 12:33:15.394991 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-4n6rv"] Oct 02 12:33:15 crc kubenswrapper[4766]: I1002 12:33:15.876537 4766 generic.go:334] "Generic (PLEG): container finished" podID="d6e6ee14-5258-4176-8127-49abee572c04" containerID="51e66b40b6f8ce6661eda5149f4d3812bcd4a39b4ace1cfd5a0c2cc0bc95789b" exitCode=0 Oct 02 12:33:15 crc kubenswrapper[4766]: I1002 12:33:15.876639 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-4n6rv" event={"ID":"d6e6ee14-5258-4176-8127-49abee572c04","Type":"ContainerDied","Data":"51e66b40b6f8ce6661eda5149f4d3812bcd4a39b4ace1cfd5a0c2cc0bc95789b"} Oct 02 12:33:15 crc kubenswrapper[4766]: I1002 12:33:15.876964 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-4n6rv" event={"ID":"d6e6ee14-5258-4176-8127-49abee572c04","Type":"ContainerStarted","Data":"9e334c9ab71bb6aba0bb4d401d2e163d1b0f5073a291e5a24dc445fc21e5b0f4"} Oct 02 12:33:17 crc kubenswrapper[4766]: I1002 12:33:17.300715 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-4n6rv" Oct 02 12:33:17 crc kubenswrapper[4766]: I1002 12:33:17.374695 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhhj6\" (UniqueName: \"kubernetes.io/projected/d6e6ee14-5258-4176-8127-49abee572c04-kube-api-access-fhhj6\") pod \"d6e6ee14-5258-4176-8127-49abee572c04\" (UID: \"d6e6ee14-5258-4176-8127-49abee572c04\") " Oct 02 12:33:17 crc kubenswrapper[4766]: I1002 12:33:17.382589 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e6ee14-5258-4176-8127-49abee572c04-kube-api-access-fhhj6" (OuterVolumeSpecName: "kube-api-access-fhhj6") pod "d6e6ee14-5258-4176-8127-49abee572c04" (UID: "d6e6ee14-5258-4176-8127-49abee572c04"). InnerVolumeSpecName "kube-api-access-fhhj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:33:17 crc kubenswrapper[4766]: I1002 12:33:17.477463 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhhj6\" (UniqueName: \"kubernetes.io/projected/d6e6ee14-5258-4176-8127-49abee572c04-kube-api-access-fhhj6\") on node \"crc\" DevicePath \"\"" Oct 02 12:33:17 crc kubenswrapper[4766]: I1002 12:33:17.909013 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-4n6rv" event={"ID":"d6e6ee14-5258-4176-8127-49abee572c04","Type":"ContainerDied","Data":"9e334c9ab71bb6aba0bb4d401d2e163d1b0f5073a291e5a24dc445fc21e5b0f4"} Oct 02 12:33:17 crc kubenswrapper[4766]: I1002 12:33:17.909067 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e334c9ab71bb6aba0bb4d401d2e163d1b0f5073a291e5a24dc445fc21e5b0f4" Oct 02 12:33:17 crc kubenswrapper[4766]: I1002 12:33:17.909137 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-4n6rv" Oct 02 12:33:24 crc kubenswrapper[4766]: I1002 12:33:24.432563 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:33:24 crc kubenswrapper[4766]: I1002 12:33:24.433628 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:33:25 crc kubenswrapper[4766]: I1002 12:33:25.731837 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-d15f-account-create-8k9hq"] Oct 02 12:33:25 crc kubenswrapper[4766]: E1002 12:33:25.732321 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e6ee14-5258-4176-8127-49abee572c04" containerName="mariadb-database-create" Oct 02 12:33:25 crc kubenswrapper[4766]: I1002 12:33:25.732336 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e6ee14-5258-4176-8127-49abee572c04" containerName="mariadb-database-create" Oct 02 12:33:25 crc kubenswrapper[4766]: I1002 12:33:25.732587 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e6ee14-5258-4176-8127-49abee572c04" containerName="mariadb-database-create" Oct 02 12:33:25 crc kubenswrapper[4766]: I1002 12:33:25.733332 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-d15f-account-create-8k9hq" Oct 02 12:33:25 crc kubenswrapper[4766]: I1002 12:33:25.735789 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Oct 02 12:33:25 crc kubenswrapper[4766]: I1002 12:33:25.745486 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-d15f-account-create-8k9hq"] Oct 02 12:33:25 crc kubenswrapper[4766]: I1002 12:33:25.880314 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vsp4\" (UniqueName: \"kubernetes.io/projected/a98e8533-c85c-4959-bf17-1165d6b90c8a-kube-api-access-9vsp4\") pod \"octavia-d15f-account-create-8k9hq\" (UID: \"a98e8533-c85c-4959-bf17-1165d6b90c8a\") " pod="openstack/octavia-d15f-account-create-8k9hq" Oct 02 12:33:25 crc kubenswrapper[4766]: I1002 12:33:25.982862 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vsp4\" (UniqueName: \"kubernetes.io/projected/a98e8533-c85c-4959-bf17-1165d6b90c8a-kube-api-access-9vsp4\") pod \"octavia-d15f-account-create-8k9hq\" (UID: \"a98e8533-c85c-4959-bf17-1165d6b90c8a\") " pod="openstack/octavia-d15f-account-create-8k9hq" Oct 02 12:33:26 crc kubenswrapper[4766]: I1002 12:33:26.034513 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vsp4\" (UniqueName: \"kubernetes.io/projected/a98e8533-c85c-4959-bf17-1165d6b90c8a-kube-api-access-9vsp4\") pod \"octavia-d15f-account-create-8k9hq\" (UID: \"a98e8533-c85c-4959-bf17-1165d6b90c8a\") " pod="openstack/octavia-d15f-account-create-8k9hq" Oct 02 12:33:26 crc kubenswrapper[4766]: I1002 12:33:26.060100 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-d15f-account-create-8k9hq" Oct 02 12:33:26 crc kubenswrapper[4766]: I1002 12:33:26.584938 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-d15f-account-create-8k9hq"] Oct 02 12:33:26 crc kubenswrapper[4766]: I1002 12:33:26.593664 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.026260 4766 generic.go:334] "Generic (PLEG): container finished" podID="a98e8533-c85c-4959-bf17-1165d6b90c8a" containerID="d383351a3ea48b912726f76dd6360dedee50ae871d77eefa0844628306f87093" exitCode=0 Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.026325 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-d15f-account-create-8k9hq" event={"ID":"a98e8533-c85c-4959-bf17-1165d6b90c8a","Type":"ContainerDied","Data":"d383351a3ea48b912726f76dd6360dedee50ae871d77eefa0844628306f87093"} Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.026664 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-d15f-account-create-8k9hq" event={"ID":"a98e8533-c85c-4959-bf17-1165d6b90c8a","Type":"ContainerStarted","Data":"c0145fc0b1a521549a868a1033c986a87348883e98acb2675ae87fabd854e949"} Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.435227 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-q6w2f" podUID="059b4130-8ca1-4df7-87ca-762fbcf3048e" containerName="ovn-controller" probeResult="failure" output=< Oct 02 12:33:27 crc kubenswrapper[4766]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 02 12:33:27 crc kubenswrapper[4766]: > Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.510390 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.526273 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-8px54" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.668305 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-q6w2f-config-6ncr5"] Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.669903 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.673289 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.691979 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q6w2f-config-6ncr5"] Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.822942 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecd25446-9c51-4555-a41b-fcb80939353b-scripts\") pod \"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.823050 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmc7f\" (UniqueName: \"kubernetes.io/projected/ecd25446-9c51-4555-a41b-fcb80939353b-kube-api-access-fmc7f\") pod \"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.823122 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-run-ovn\") pod \"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.823220 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-log-ovn\") pod \"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.823262 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-run\") pod \"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.823297 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ecd25446-9c51-4555-a41b-fcb80939353b-additional-scripts\") pod \"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.941741 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-log-ovn\") pod \"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.942160 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-run\") pod 
\"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.942230 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ecd25446-9c51-4555-a41b-fcb80939353b-additional-scripts\") pod \"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.942686 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-log-ovn\") pod \"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.942920 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecd25446-9c51-4555-a41b-fcb80939353b-scripts\") pod \"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.943013 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmc7f\" (UniqueName: \"kubernetes.io/projected/ecd25446-9c51-4555-a41b-fcb80939353b-kube-api-access-fmc7f\") pod \"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.943028 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-run\") pod \"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.943199 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-run-ovn\") pod \"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.945180 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ecd25446-9c51-4555-a41b-fcb80939353b-additional-scripts\") pod \"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.945311 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-run-ovn\") pod \"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.947041 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecd25446-9c51-4555-a41b-fcb80939353b-scripts\") pod 
\"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.967423 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmc7f\" (UniqueName: \"kubernetes.io/projected/ecd25446-9c51-4555-a41b-fcb80939353b-kube-api-access-fmc7f\") pod \"ovn-controller-q6w2f-config-6ncr5\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:27 crc kubenswrapper[4766]: I1002 12:33:27.999357 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:28 crc kubenswrapper[4766]: I1002 12:33:28.508103 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-d15f-account-create-8k9hq" Oct 02 12:33:28 crc kubenswrapper[4766]: I1002 12:33:28.646796 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-q6w2f-config-6ncr5"] Oct 02 12:33:28 crc kubenswrapper[4766]: W1002 12:33:28.647792 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecd25446_9c51_4555_a41b_fcb80939353b.slice/crio-628d991e2905c64722ad1c9b919e46be95c56bf7cbe8f51249ed3a9c373b7b2e WatchSource:0}: Error finding container 628d991e2905c64722ad1c9b919e46be95c56bf7cbe8f51249ed3a9c373b7b2e: Status 404 returned error can't find the container with id 628d991e2905c64722ad1c9b919e46be95c56bf7cbe8f51249ed3a9c373b7b2e Oct 02 12:33:28 crc kubenswrapper[4766]: I1002 12:33:28.656793 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vsp4\" (UniqueName: \"kubernetes.io/projected/a98e8533-c85c-4959-bf17-1165d6b90c8a-kube-api-access-9vsp4\") pod \"a98e8533-c85c-4959-bf17-1165d6b90c8a\" (UID: \"a98e8533-c85c-4959-bf17-1165d6b90c8a\") " Oct 02 12:33:28 crc kubenswrapper[4766]: I1002 12:33:28.664846 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a98e8533-c85c-4959-bf17-1165d6b90c8a-kube-api-access-9vsp4" (OuterVolumeSpecName: "kube-api-access-9vsp4") pod "a98e8533-c85c-4959-bf17-1165d6b90c8a" (UID: "a98e8533-c85c-4959-bf17-1165d6b90c8a"). InnerVolumeSpecName "kube-api-access-9vsp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:33:28 crc kubenswrapper[4766]: I1002 12:33:28.760087 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vsp4\" (UniqueName: \"kubernetes.io/projected/a98e8533-c85c-4959-bf17-1165d6b90c8a-kube-api-access-9vsp4\") on node \"crc\" DevicePath \"\"" Oct 02 12:33:29 crc kubenswrapper[4766]: I1002 12:33:29.050545 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q6w2f-config-6ncr5" event={"ID":"ecd25446-9c51-4555-a41b-fcb80939353b","Type":"ContainerStarted","Data":"68876f42998c9251e9cf4c9513dd01c7de078e952ab8d829ea92b87c1ac568b6"} Oct 02 12:33:29 crc kubenswrapper[4766]: I1002 12:33:29.051040 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q6w2f-config-6ncr5" event={"ID":"ecd25446-9c51-4555-a41b-fcb80939353b","Type":"ContainerStarted","Data":"628d991e2905c64722ad1c9b919e46be95c56bf7cbe8f51249ed3a9c373b7b2e"} Oct 02 12:33:29 crc kubenswrapper[4766]: I1002 12:33:29.053570 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-d15f-account-create-8k9hq" event={"ID":"a98e8533-c85c-4959-bf17-1165d6b90c8a","Type":"ContainerDied","Data":"c0145fc0b1a521549a868a1033c986a87348883e98acb2675ae87fabd854e949"} Oct 02 12:33:29 crc kubenswrapper[4766]: I1002 12:33:29.053625 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-d15f-account-create-8k9hq" Oct 02 12:33:29 crc kubenswrapper[4766]: I1002 12:33:29.053648 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0145fc0b1a521549a868a1033c986a87348883e98acb2675ae87fabd854e949" Oct 02 12:33:29 crc kubenswrapper[4766]: I1002 12:33:29.074642 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-q6w2f-config-6ncr5" podStartSLOduration=2.074610547 podStartE2EDuration="2.074610547s" podCreationTimestamp="2025-10-02 12:33:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:33:29.070862547 +0000 UTC m=+6124.013733491" watchObservedRunningTime="2025-10-02 12:33:29.074610547 +0000 UTC m=+6124.017481491" Oct 02 12:33:30 crc kubenswrapper[4766]: I1002 12:33:30.064964 4766 generic.go:334] "Generic (PLEG): container finished" podID="ecd25446-9c51-4555-a41b-fcb80939353b" containerID="68876f42998c9251e9cf4c9513dd01c7de078e952ab8d829ea92b87c1ac568b6" exitCode=0 Oct 02 12:33:30 crc kubenswrapper[4766]: I1002 12:33:30.065078 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q6w2f-config-6ncr5" event={"ID":"ecd25446-9c51-4555-a41b-fcb80939353b","Type":"ContainerDied","Data":"68876f42998c9251e9cf4c9513dd01c7de078e952ab8d829ea92b87c1ac568b6"} Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.455771 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.624475 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ecd25446-9c51-4555-a41b-fcb80939353b-additional-scripts\") pod \"ecd25446-9c51-4555-a41b-fcb80939353b\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.624583 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmc7f\" (UniqueName: \"kubernetes.io/projected/ecd25446-9c51-4555-a41b-fcb80939353b-kube-api-access-fmc7f\") pod \"ecd25446-9c51-4555-a41b-fcb80939353b\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.624699 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-run-ovn\") pod \"ecd25446-9c51-4555-a41b-fcb80939353b\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.624734 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-log-ovn\") pod \"ecd25446-9c51-4555-a41b-fcb80939353b\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.624776 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-run\") pod \"ecd25446-9c51-4555-a41b-fcb80939353b\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.624916 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ecd25446-9c51-4555-a41b-fcb80939353b" (UID: "ecd25446-9c51-4555-a41b-fcb80939353b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.625008 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-run" (OuterVolumeSpecName: "var-run") pod "ecd25446-9c51-4555-a41b-fcb80939353b" (UID: "ecd25446-9c51-4555-a41b-fcb80939353b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.625030 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ecd25446-9c51-4555-a41b-fcb80939353b" (UID: "ecd25446-9c51-4555-a41b-fcb80939353b"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.625043 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecd25446-9c51-4555-a41b-fcb80939353b-scripts\") pod \"ecd25446-9c51-4555-a41b-fcb80939353b\" (UID: \"ecd25446-9c51-4555-a41b-fcb80939353b\") " Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.625401 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecd25446-9c51-4555-a41b-fcb80939353b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ecd25446-9c51-4555-a41b-fcb80939353b" (UID: "ecd25446-9c51-4555-a41b-fcb80939353b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.626159 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecd25446-9c51-4555-a41b-fcb80939353b-scripts" (OuterVolumeSpecName: "scripts") pod "ecd25446-9c51-4555-a41b-fcb80939353b" (UID: "ecd25446-9c51-4555-a41b-fcb80939353b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.626387 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecd25446-9c51-4555-a41b-fcb80939353b-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.626412 4766 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ecd25446-9c51-4555-a41b-fcb80939353b-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.626429 4766 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.626444 4766 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.626456 4766 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ecd25446-9c51-4555-a41b-fcb80939353b-var-run\") on node \"crc\" DevicePath \"\"" Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.631362 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecd25446-9c51-4555-a41b-fcb80939353b-kube-api-access-fmc7f" (OuterVolumeSpecName: "kube-api-access-fmc7f") pod "ecd25446-9c51-4555-a41b-fcb80939353b" (UID: "ecd25446-9c51-4555-a41b-fcb80939353b"). InnerVolumeSpecName "kube-api-access-fmc7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:33:31 crc kubenswrapper[4766]: I1002 12:33:31.728192 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmc7f\" (UniqueName: \"kubernetes.io/projected/ecd25446-9c51-4555-a41b-fcb80939353b-kube-api-access-fmc7f\") on node \"crc\" DevicePath \"\"" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.090527 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-q6w2f-config-6ncr5" event={"ID":"ecd25446-9c51-4555-a41b-fcb80939353b","Type":"ContainerDied","Data":"628d991e2905c64722ad1c9b919e46be95c56bf7cbe8f51249ed3a9c373b7b2e"} Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.090588 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="628d991e2905c64722ad1c9b919e46be95c56bf7cbe8f51249ed3a9c373b7b2e" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.090590 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-q6w2f-config-6ncr5" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.170445 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-q6w2f-config-6ncr5"] Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.179315 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-q6w2f-config-6ncr5"] Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.429799 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-q6w2f" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.787673 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-846f489cc6-ggdkp"] Oct 02 12:33:32 crc kubenswrapper[4766]: E1002 12:33:32.788790 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a98e8533-c85c-4959-bf17-1165d6b90c8a" containerName="mariadb-account-create" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.788809 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a98e8533-c85c-4959-bf17-1165d6b90c8a" containerName="mariadb-account-create" Oct 02 12:33:32 crc kubenswrapper[4766]: E1002 12:33:32.788851 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd25446-9c51-4555-a41b-fcb80939353b" containerName="ovn-config" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.788857 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd25446-9c51-4555-a41b-fcb80939353b" containerName="ovn-config" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.789056 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a98e8533-c85c-4959-bf17-1165d6b90c8a" containerName="mariadb-account-create" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.789066 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecd25446-9c51-4555-a41b-fcb80939353b" containerName="ovn-config" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.795416 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.799447 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.799839 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.800019 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-nx7jt" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.826230 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-846f489cc6-ggdkp"] Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.962281 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3-config-data\") pod \"octavia-api-846f489cc6-ggdkp\" (UID: \"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3\") " pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.962336 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3-config-data-merged\") pod \"octavia-api-846f489cc6-ggdkp\" (UID: \"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3\") " pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.962383 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3-scripts\") pod \"octavia-api-846f489cc6-ggdkp\" (UID: \"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3\") " pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.962405 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3-combined-ca-bundle\") pod \"octavia-api-846f489cc6-ggdkp\" (UID: \"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3\") " pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:32 crc kubenswrapper[4766]: I1002 12:33:32.962546 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3-octavia-run\") pod \"octavia-api-846f489cc6-ggdkp\" (UID: \"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3\") " pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:33 crc kubenswrapper[4766]: I1002 12:33:33.064636 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3-combined-ca-bundle\") pod \"octavia-api-846f489cc6-ggdkp\" (UID: \"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3\") " pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:33 crc kubenswrapper[4766]: I1002 12:33:33.064740 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3-octavia-run\") pod \"octavia-api-846f489cc6-ggdkp\" (UID: \"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3\") " pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 
12:33:33 crc kubenswrapper[4766]: I1002 12:33:33.064944 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3-config-data\") pod \"octavia-api-846f489cc6-ggdkp\" (UID: \"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3\") " pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:33 crc kubenswrapper[4766]: I1002 12:33:33.064968 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3-config-data-merged\") pod \"octavia-api-846f489cc6-ggdkp\" (UID: \"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3\") " pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:33 crc kubenswrapper[4766]: I1002 12:33:33.065000 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3-scripts\") pod \"octavia-api-846f489cc6-ggdkp\" (UID: \"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3\") " pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:33 crc kubenswrapper[4766]: I1002 12:33:33.066299 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3-octavia-run\") pod \"octavia-api-846f489cc6-ggdkp\" (UID: \"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3\") " pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:33 crc kubenswrapper[4766]: I1002 12:33:33.066681 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3-config-data-merged\") pod \"octavia-api-846f489cc6-ggdkp\" (UID: \"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3\") " pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:33 crc kubenswrapper[4766]: I1002 12:33:33.072911 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3-combined-ca-bundle\") pod \"octavia-api-846f489cc6-ggdkp\" (UID: \"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3\") " pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:33 crc kubenswrapper[4766]: I1002 12:33:33.073709 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3-config-data\") pod \"octavia-api-846f489cc6-ggdkp\" (UID: \"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3\") " pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:33 crc kubenswrapper[4766]: I1002 12:33:33.075355 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3-scripts\") pod \"octavia-api-846f489cc6-ggdkp\" (UID: \"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3\") " pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:33 crc kubenswrapper[4766]: I1002 12:33:33.134161 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:33 crc kubenswrapper[4766]: I1002 12:33:33.449128 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-846f489cc6-ggdkp"] Oct 02 12:33:33 crc kubenswrapper[4766]: W1002 12:33:33.451892 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfb5d33e_3fbd_4bef_9d65_1e0ca95f61f3.slice/crio-36e31ae6865f014939b40a4c0f25d337f2b5cf24162d26deb13e2ec0a52534b5 WatchSource:0}: Error finding container 36e31ae6865f014939b40a4c0f25d337f2b5cf24162d26deb13e2ec0a52534b5: Status 404 returned error can't find the container with id 36e31ae6865f014939b40a4c0f25d337f2b5cf24162d26deb13e2ec0a52534b5 Oct 02 12:33:33 crc kubenswrapper[4766]: I1002 12:33:33.456546 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:33:33 crc kubenswrapper[4766]: I1002 12:33:33.899049 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecd25446-9c51-4555-a41b-fcb80939353b" path="/var/lib/kubelet/pods/ecd25446-9c51-4555-a41b-fcb80939353b/volumes" Oct 02 12:33:34 crc kubenswrapper[4766]: I1002 12:33:34.113955 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-846f489cc6-ggdkp" event={"ID":"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3","Type":"ContainerStarted","Data":"36e31ae6865f014939b40a4c0f25d337f2b5cf24162d26deb13e2ec0a52534b5"} Oct 02 12:33:42 crc kubenswrapper[4766]: I1002 12:33:42.221931 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-846f489cc6-ggdkp" event={"ID":"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3","Type":"ContainerStarted","Data":"fe5a41794b1b76a7cf3dc729b7603feaec77799be51068f19fc317ea3f6d68de"} Oct 02 12:33:43 crc kubenswrapper[4766]: I1002 12:33:43.233587 4766 generic.go:334] "Generic (PLEG): container finished" podID="cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3" containerID="fe5a41794b1b76a7cf3dc729b7603feaec77799be51068f19fc317ea3f6d68de" exitCode=0 Oct 02 12:33:43 crc kubenswrapper[4766]: I1002 12:33:43.233936 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-846f489cc6-ggdkp" event={"ID":"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3","Type":"ContainerDied","Data":"fe5a41794b1b76a7cf3dc729b7603feaec77799be51068f19fc317ea3f6d68de"} Oct 02 12:33:44 crc kubenswrapper[4766]: I1002 12:33:44.250149 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-846f489cc6-ggdkp" event={"ID":"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3","Type":"ContainerStarted","Data":"a568582749b2a01bd37e8b283097d05ca44041da54f3cb700bd66614b18f1e8f"} Oct 02 12:33:44 crc kubenswrapper[4766]: I1002 12:33:44.250711 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-846f489cc6-ggdkp" event={"ID":"cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3","Type":"ContainerStarted","Data":"7cb501762303580320be497b5185dc60bf429079c86fb27bc2c45ecb898f4573"} Oct 02 12:33:44 crc kubenswrapper[4766]: I1002 12:33:44.250937 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:44 crc kubenswrapper[4766]: I1002 12:33:44.250957 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.081738 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-846f489cc6-ggdkp" 
podStartSLOduration=11.551447419 podStartE2EDuration="20.08171234s" podCreationTimestamp="2025-10-02 12:33:32 +0000 UTC" firstStartedPulling="2025-10-02 12:33:33.456233399 +0000 UTC m=+6128.399104333" lastFinishedPulling="2025-10-02 12:33:41.98649831 +0000 UTC m=+6136.929369254" observedRunningTime="2025-10-02 12:33:44.285356108 +0000 UTC m=+6139.228227052" watchObservedRunningTime="2025-10-02 12:33:52.08171234 +0000 UTC m=+6147.024583284" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.093951 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-r2wtj"] Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.097572 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-r2wtj" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.108571 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.108769 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.109034 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.127141 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-r2wtj"] Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.162439 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e436f55e-dc19-4d4b-be98-d024fb589618-config-data\") pod \"octavia-rsyslog-r2wtj\" (UID: \"e436f55e-dc19-4d4b-be98-d024fb589618\") " pod="openstack/octavia-rsyslog-r2wtj" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.163161 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e436f55e-dc19-4d4b-be98-d024fb589618-scripts\") pod \"octavia-rsyslog-r2wtj\" (UID: \"e436f55e-dc19-4d4b-be98-d024fb589618\") " pod="openstack/octavia-rsyslog-r2wtj" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.163242 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e436f55e-dc19-4d4b-be98-d024fb589618-hm-ports\") pod \"octavia-rsyslog-r2wtj\" (UID: \"e436f55e-dc19-4d4b-be98-d024fb589618\") " pod="openstack/octavia-rsyslog-r2wtj" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.163372 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e436f55e-dc19-4d4b-be98-d024fb589618-config-data-merged\") pod \"octavia-rsyslog-r2wtj\" (UID: \"e436f55e-dc19-4d4b-be98-d024fb589618\") " pod="openstack/octavia-rsyslog-r2wtj" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.264722 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e436f55e-dc19-4d4b-be98-d024fb589618-hm-ports\") pod \"octavia-rsyslog-r2wtj\" (UID: \"e436f55e-dc19-4d4b-be98-d024fb589618\") " pod="openstack/octavia-rsyslog-r2wtj" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.264787 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/e436f55e-dc19-4d4b-be98-d024fb589618-config-data-merged\") pod \"octavia-rsyslog-r2wtj\" (UID: \"e436f55e-dc19-4d4b-be98-d024fb589618\") " pod="openstack/octavia-rsyslog-r2wtj" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.264818 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e436f55e-dc19-4d4b-be98-d024fb589618-config-data\") pod \"octavia-rsyslog-r2wtj\" (UID: \"e436f55e-dc19-4d4b-be98-d024fb589618\") " pod="openstack/octavia-rsyslog-r2wtj" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.264982 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e436f55e-dc19-4d4b-be98-d024fb589618-scripts\") pod \"octavia-rsyslog-r2wtj\" (UID: \"e436f55e-dc19-4d4b-be98-d024fb589618\") " pod="openstack/octavia-rsyslog-r2wtj" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.265895 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/e436f55e-dc19-4d4b-be98-d024fb589618-hm-ports\") pod \"octavia-rsyslog-r2wtj\" (UID: \"e436f55e-dc19-4d4b-be98-d024fb589618\") " pod="openstack/octavia-rsyslog-r2wtj" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.266194 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/e436f55e-dc19-4d4b-be98-d024fb589618-config-data-merged\") pod \"octavia-rsyslog-r2wtj\" (UID: \"e436f55e-dc19-4d4b-be98-d024fb589618\") " pod="openstack/octavia-rsyslog-r2wtj" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.272772 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e436f55e-dc19-4d4b-be98-d024fb589618-config-data\") pod \"octavia-rsyslog-r2wtj\" (UID: \"e436f55e-dc19-4d4b-be98-d024fb589618\") " pod="openstack/octavia-rsyslog-r2wtj" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.274496 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e436f55e-dc19-4d4b-be98-d024fb589618-scripts\") pod \"octavia-rsyslog-r2wtj\" (UID: \"e436f55e-dc19-4d4b-be98-d024fb589618\") " pod="openstack/octavia-rsyslog-r2wtj" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.425080 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-rsyslog-r2wtj" Oct 02 12:33:52 crc kubenswrapper[4766]: I1002 12:33:52.939935 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-r2wtj"] Oct 02 12:33:53 crc kubenswrapper[4766]: I1002 12:33:53.081802 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:53 crc kubenswrapper[4766]: I1002 12:33:53.314627 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-846f489cc6-ggdkp" Oct 02 12:33:53 crc kubenswrapper[4766]: I1002 12:33:53.355385 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-r2wtj" event={"ID":"e436f55e-dc19-4d4b-be98-d024fb589618","Type":"ContainerStarted","Data":"6be5f2ff1f0e70e94498c1a53b3eefd388b87e204f323b5d194daefb63e1fbe7"} Oct 02 12:33:53 crc kubenswrapper[4766]: I1002 12:33:53.667611 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-qntgm"] Oct 02 12:33:53 crc kubenswrapper[4766]: I1002 12:33:53.672407 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-qntgm" Oct 02 12:33:53 crc kubenswrapper[4766]: I1002 12:33:53.676534 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Oct 02 12:33:53 crc kubenswrapper[4766]: I1002 12:33:53.688592 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-qntgm"] Oct 02 12:33:53 crc kubenswrapper[4766]: I1002 12:33:53.829353 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/edab4cbb-3e4f-4a92-9045-8f239ecf24bf-amphora-image\") pod \"octavia-image-upload-59f8cff499-qntgm\" (UID: \"edab4cbb-3e4f-4a92-9045-8f239ecf24bf\") " pod="openstack/octavia-image-upload-59f8cff499-qntgm" Oct 02 12:33:53 crc kubenswrapper[4766]: I1002 12:33:53.829452 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/edab4cbb-3e4f-4a92-9045-8f239ecf24bf-httpd-config\") pod \"octavia-image-upload-59f8cff499-qntgm\" (UID: \"edab4cbb-3e4f-4a92-9045-8f239ecf24bf\") " pod="openstack/octavia-image-upload-59f8cff499-qntgm" Oct 02 12:33:53 crc kubenswrapper[4766]: I1002 12:33:53.931380 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/edab4cbb-3e4f-4a92-9045-8f239ecf24bf-amphora-image\") pod \"octavia-image-upload-59f8cff499-qntgm\" (UID: \"edab4cbb-3e4f-4a92-9045-8f239ecf24bf\") " pod="openstack/octavia-image-upload-59f8cff499-qntgm" Oct 02 12:33:53 crc kubenswrapper[4766]: I1002 12:33:53.931483 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/edab4cbb-3e4f-4a92-9045-8f239ecf24bf-httpd-config\") pod \"octavia-image-upload-59f8cff499-qntgm\" (UID: \"edab4cbb-3e4f-4a92-9045-8f239ecf24bf\") " pod="openstack/octavia-image-upload-59f8cff499-qntgm" Oct 02 12:33:53 crc kubenswrapper[4766]: I1002 12:33:53.934228 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/edab4cbb-3e4f-4a92-9045-8f239ecf24bf-amphora-image\") pod \"octavia-image-upload-59f8cff499-qntgm\" (UID: 
\"edab4cbb-3e4f-4a92-9045-8f239ecf24bf\") " pod="openstack/octavia-image-upload-59f8cff499-qntgm" Oct 02 12:33:53 crc kubenswrapper[4766]: I1002 12:33:53.943617 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/edab4cbb-3e4f-4a92-9045-8f239ecf24bf-httpd-config\") pod \"octavia-image-upload-59f8cff499-qntgm\" (UID: \"edab4cbb-3e4f-4a92-9045-8f239ecf24bf\") " pod="openstack/octavia-image-upload-59f8cff499-qntgm" Oct 02 12:33:54 crc kubenswrapper[4766]: I1002 12:33:54.011427 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-qntgm" Oct 02 12:33:54 crc kubenswrapper[4766]: I1002 12:33:54.432842 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:33:54 crc kubenswrapper[4766]: I1002 12:33:54.432946 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:33:54 crc kubenswrapper[4766]: I1002 12:33:54.510178 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-qntgm"] Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.384018 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-qntgm" event={"ID":"edab4cbb-3e4f-4a92-9045-8f239ecf24bf","Type":"ContainerStarted","Data":"63b8a36d31d01d6c9663a8b8b9b1de6e87477e3a5ae76f4adc382251b7986600"} Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.474899 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-8pj2m"] Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.477108 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-8pj2m" Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.486081 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-8pj2m"] Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.487004 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.571171 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-config-data-merged\") pod \"octavia-db-sync-8pj2m\" (UID: \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\") " pod="openstack/octavia-db-sync-8pj2m" Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.571704 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-config-data\") pod \"octavia-db-sync-8pj2m\" (UID: \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\") " pod="openstack/octavia-db-sync-8pj2m" Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.571787 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-combined-ca-bundle\") pod \"octavia-db-sync-8pj2m\" (UID: \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\") " pod="openstack/octavia-db-sync-8pj2m" Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.571856 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-scripts\") pod \"octavia-db-sync-8pj2m\" (UID: \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\") " pod="openstack/octavia-db-sync-8pj2m" Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.674105 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-config-data\") pod \"octavia-db-sync-8pj2m\" (UID: \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\") " pod="openstack/octavia-db-sync-8pj2m" Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.674221 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-combined-ca-bundle\") pod \"octavia-db-sync-8pj2m\" (UID: \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\") " pod="openstack/octavia-db-sync-8pj2m" Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.674265 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-scripts\") pod \"octavia-db-sync-8pj2m\" (UID: \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\") " pod="openstack/octavia-db-sync-8pj2m" Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.674315 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-config-data-merged\") pod \"octavia-db-sync-8pj2m\" (UID: \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\") " pod="openstack/octavia-db-sync-8pj2m" Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.677988 4766 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-config-data-merged\") pod \"octavia-db-sync-8pj2m\" (UID: \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\") " pod="openstack/octavia-db-sync-8pj2m" Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.685269 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-scripts\") pod \"octavia-db-sync-8pj2m\" (UID: \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\") " pod="openstack/octavia-db-sync-8pj2m" Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.690012 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-config-data\") pod \"octavia-db-sync-8pj2m\" (UID: \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\") " pod="openstack/octavia-db-sync-8pj2m" Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.711793 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-combined-ca-bundle\") pod \"octavia-db-sync-8pj2m\" (UID: \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\") " pod="openstack/octavia-db-sync-8pj2m" Oct 02 12:33:55 crc kubenswrapper[4766]: I1002 12:33:55.820092 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-8pj2m" Oct 02 12:33:57 crc kubenswrapper[4766]: I1002 12:33:57.485317 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-8pj2m"] Oct 02 12:33:57 crc kubenswrapper[4766]: W1002 12:33:57.549116 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff13d18f_5c99_46f5_88ba_c6b3ff4ad9c4.slice/crio-52975ce7a90b303be3473ab6355862b70ed6167b38f729a0ae75aca72082085a WatchSource:0}: Error finding container 52975ce7a90b303be3473ab6355862b70ed6167b38f729a0ae75aca72082085a: Status 404 returned error can't find the container with id 52975ce7a90b303be3473ab6355862b70ed6167b38f729a0ae75aca72082085a Oct 02 12:33:58 crc kubenswrapper[4766]: I1002 12:33:58.430361 4766 generic.go:334] "Generic (PLEG): container finished" podID="ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4" containerID="01bcca35808068d85f7c0b71d608222e9ab4e1a7b1cf02d06f450e137eaa3f2d" exitCode=0 Oct 02 12:33:58 crc kubenswrapper[4766]: I1002 12:33:58.430587 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-8pj2m" event={"ID":"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4","Type":"ContainerDied","Data":"01bcca35808068d85f7c0b71d608222e9ab4e1a7b1cf02d06f450e137eaa3f2d"} Oct 02 12:33:58 crc kubenswrapper[4766]: I1002 12:33:58.431078 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-8pj2m" event={"ID":"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4","Type":"ContainerStarted","Data":"52975ce7a90b303be3473ab6355862b70ed6167b38f729a0ae75aca72082085a"} Oct 02 12:33:58 crc kubenswrapper[4766]: I1002 12:33:58.434372 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-r2wtj" event={"ID":"e436f55e-dc19-4d4b-be98-d024fb589618","Type":"ContainerStarted","Data":"55a572ce95696bc75abca8ba354d0ca2c2de01d03c92f0417c4c5bc44ebab785"} Oct 02 12:33:59 crc kubenswrapper[4766]: I1002 12:33:59.453014 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-8pj2m" 
event={"ID":"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4","Type":"ContainerStarted","Data":"a6325c657d10489588d6f282b23902a2dd33c7cba6b68c4663c0b406718e617f"} Oct 02 12:33:59 crc kubenswrapper[4766]: I1002 12:33:59.489049 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-8pj2m" podStartSLOduration=4.48902438 podStartE2EDuration="4.48902438s" podCreationTimestamp="2025-10-02 12:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:33:59.477321296 +0000 UTC m=+6154.420192240" watchObservedRunningTime="2025-10-02 12:33:59.48902438 +0000 UTC m=+6154.431895324" Oct 02 12:34:00 crc kubenswrapper[4766]: I1002 12:34:00.482251 4766 generic.go:334] "Generic (PLEG): container finished" podID="e436f55e-dc19-4d4b-be98-d024fb589618" containerID="55a572ce95696bc75abca8ba354d0ca2c2de01d03c92f0417c4c5bc44ebab785" exitCode=0 Oct 02 12:34:00 crc kubenswrapper[4766]: I1002 12:34:00.482366 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-r2wtj" event={"ID":"e436f55e-dc19-4d4b-be98-d024fb589618","Type":"ContainerDied","Data":"55a572ce95696bc75abca8ba354d0ca2c2de01d03c92f0417c4c5bc44ebab785"} Oct 02 12:34:01 crc kubenswrapper[4766]: I1002 12:34:01.514620 4766 generic.go:334] "Generic (PLEG): container finished" podID="ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4" containerID="a6325c657d10489588d6f282b23902a2dd33c7cba6b68c4663c0b406718e617f" exitCode=0 Oct 02 12:34:01 crc kubenswrapper[4766]: I1002 12:34:01.514708 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-8pj2m" event={"ID":"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4","Type":"ContainerDied","Data":"a6325c657d10489588d6f282b23902a2dd33c7cba6b68c4663c0b406718e617f"} Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.053055 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-t6x4h"] Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.058302 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.063657 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.064061 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.064211 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.082771 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-t6x4h"] Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.181839 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-hm-ports\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.181905 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-amphora-certs\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.182055 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-config-data-merged\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.182171 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-config-data\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.182597 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-combined-ca-bundle\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.182682 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-scripts\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.285000 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-hm-ports\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: 
I1002 12:34:03.285066 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-amphora-certs\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.285109 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-config-data-merged\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.285134 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-config-data\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.285624 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-config-data-merged\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.285752 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-combined-ca-bundle\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.285787 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-scripts\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.288069 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-hm-ports\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.296613 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-combined-ca-bundle\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.296874 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-config-data\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.297218 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: 
\"kubernetes.io/secret/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-amphora-certs\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.298044 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c16aa8e-cc08-497e-8ad3-db18bbc82afd-scripts\") pod \"octavia-housekeeping-t6x4h\" (UID: \"0c16aa8e-cc08-497e-8ad3-db18bbc82afd\") " pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:03 crc kubenswrapper[4766]: I1002 12:34:03.408482 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:04 crc kubenswrapper[4766]: I1002 12:34:04.786307 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-cqtgm"] Oct 02 12:34:04 crc kubenswrapper[4766]: I1002 12:34:04.789831 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:04 crc kubenswrapper[4766]: I1002 12:34:04.794860 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Oct 02 12:34:04 crc kubenswrapper[4766]: I1002 12:34:04.796300 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Oct 02 12:34:04 crc kubenswrapper[4766]: I1002 12:34:04.803929 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-cqtgm"] Oct 02 12:34:04 crc kubenswrapper[4766]: I1002 12:34:04.939289 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/888c9f82-8929-4dda-b89a-cbd917f2026d-config-data\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:04 crc kubenswrapper[4766]: I1002 12:34:04.939450 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/888c9f82-8929-4dda-b89a-cbd917f2026d-hm-ports\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:04 crc kubenswrapper[4766]: I1002 12:34:04.939564 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/888c9f82-8929-4dda-b89a-cbd917f2026d-amphora-certs\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:04 crc kubenswrapper[4766]: I1002 12:34:04.939762 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/888c9f82-8929-4dda-b89a-cbd917f2026d-scripts\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:04 crc kubenswrapper[4766]: I1002 12:34:04.939818 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/888c9f82-8929-4dda-b89a-cbd917f2026d-combined-ca-bundle\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " 
pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:04 crc kubenswrapper[4766]: I1002 12:34:04.939854 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/888c9f82-8929-4dda-b89a-cbd917f2026d-config-data-merged\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:05 crc kubenswrapper[4766]: I1002 12:34:05.042147 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/888c9f82-8929-4dda-b89a-cbd917f2026d-hm-ports\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:05 crc kubenswrapper[4766]: I1002 12:34:05.042263 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/888c9f82-8929-4dda-b89a-cbd917f2026d-amphora-certs\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:05 crc kubenswrapper[4766]: I1002 12:34:05.042297 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/888c9f82-8929-4dda-b89a-cbd917f2026d-scripts\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:05 crc kubenswrapper[4766]: I1002 12:34:05.042376 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/888c9f82-8929-4dda-b89a-cbd917f2026d-combined-ca-bundle\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:05 crc kubenswrapper[4766]: I1002 12:34:05.042403 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/888c9f82-8929-4dda-b89a-cbd917f2026d-config-data-merged\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:05 crc kubenswrapper[4766]: I1002 12:34:05.042464 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/888c9f82-8929-4dda-b89a-cbd917f2026d-config-data\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:05 crc kubenswrapper[4766]: I1002 12:34:05.043557 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/888c9f82-8929-4dda-b89a-cbd917f2026d-config-data-merged\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:05 crc kubenswrapper[4766]: I1002 12:34:05.044202 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/888c9f82-8929-4dda-b89a-cbd917f2026d-hm-ports\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:05 crc kubenswrapper[4766]: I1002 
12:34:05.049954 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/888c9f82-8929-4dda-b89a-cbd917f2026d-scripts\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:05 crc kubenswrapper[4766]: I1002 12:34:05.050865 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/888c9f82-8929-4dda-b89a-cbd917f2026d-amphora-certs\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:05 crc kubenswrapper[4766]: I1002 12:34:05.058697 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/888c9f82-8929-4dda-b89a-cbd917f2026d-combined-ca-bundle\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:05 crc kubenswrapper[4766]: I1002 12:34:05.067096 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/888c9f82-8929-4dda-b89a-cbd917f2026d-config-data\") pod \"octavia-healthmanager-cqtgm\" (UID: \"888c9f82-8929-4dda-b89a-cbd917f2026d\") " pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:05 crc kubenswrapper[4766]: I1002 12:34:05.128944 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:06 crc kubenswrapper[4766]: I1002 12:34:06.012318 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-8pj2m" Oct 02 12:34:06 crc kubenswrapper[4766]: I1002 12:34:06.166237 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-combined-ca-bundle\") pod \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\" (UID: \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\") " Oct 02 12:34:06 crc kubenswrapper[4766]: I1002 12:34:06.166564 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-scripts\") pod \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\" (UID: \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\") " Oct 02 12:34:06 crc kubenswrapper[4766]: I1002 12:34:06.166634 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-config-data-merged\") pod \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\" (UID: \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\") " Oct 02 12:34:06 crc kubenswrapper[4766]: I1002 12:34:06.166691 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-config-data\") pod \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\" (UID: \"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4\") " Oct 02 12:34:06 crc kubenswrapper[4766]: I1002 12:34:06.171651 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-scripts" (OuterVolumeSpecName: "scripts") pod "ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4" (UID: "ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:34:06 crc kubenswrapper[4766]: I1002 12:34:06.171804 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-config-data" (OuterVolumeSpecName: "config-data") pod "ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4" (UID: "ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:34:06 crc kubenswrapper[4766]: I1002 12:34:06.214715 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4" (UID: "ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:34:06 crc kubenswrapper[4766]: I1002 12:34:06.215944 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4" (UID: "ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:34:06 crc kubenswrapper[4766]: I1002 12:34:06.269535 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:34:06 crc kubenswrapper[4766]: I1002 12:34:06.269593 4766 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-config-data-merged\") on node \"crc\" DevicePath \"\"" Oct 02 12:34:06 crc kubenswrapper[4766]: I1002 12:34:06.269611 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:34:06 crc kubenswrapper[4766]: I1002 12:34:06.269623 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:34:06 crc kubenswrapper[4766]: I1002 12:34:06.579322 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-8pj2m" event={"ID":"ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4","Type":"ContainerDied","Data":"52975ce7a90b303be3473ab6355862b70ed6167b38f729a0ae75aca72082085a"} Oct 02 12:34:06 crc kubenswrapper[4766]: I1002 12:34:06.579376 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52975ce7a90b303be3473ab6355862b70ed6167b38f729a0ae75aca72082085a" Oct 02 12:34:06 crc kubenswrapper[4766]: I1002 12:34:06.579451 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-8pj2m" Oct 02 12:34:07 crc kubenswrapper[4766]: I1002 12:34:07.299247 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-cqtgm"] Oct 02 12:34:07 crc kubenswrapper[4766]: I1002 12:34:07.390660 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-t6x4h"] Oct 02 12:34:07 crc kubenswrapper[4766]: I1002 12:34:07.607096 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-t6x4h" event={"ID":"0c16aa8e-cc08-497e-8ad3-db18bbc82afd","Type":"ContainerStarted","Data":"a7f469bb71e71a819be74257c9fab2bb9e9b5cfff237aeac8bc37fb53ed72d44"} Oct 02 12:34:07 crc kubenswrapper[4766]: I1002 12:34:07.613117 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-qntgm" event={"ID":"edab4cbb-3e4f-4a92-9045-8f239ecf24bf","Type":"ContainerStarted","Data":"796bf35ff7adb487f210cba8ffd1ab21efbf9b6e9432bb83bc39acf1ed73fc07"} Oct 02 12:34:07 crc kubenswrapper[4766]: I1002 12:34:07.620965 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-r2wtj" event={"ID":"e436f55e-dc19-4d4b-be98-d024fb589618","Type":"ContainerStarted","Data":"d78f73756299ec4fbca7d4556a4c0e3e2220ed8649926bbc33d2ffdf79a75789"} Oct 02 12:34:07 crc kubenswrapper[4766]: I1002 12:34:07.622015 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-r2wtj" Oct 02 12:34:07 crc kubenswrapper[4766]: I1002 12:34:07.630329 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-cqtgm" event={"ID":"888c9f82-8929-4dda-b89a-cbd917f2026d","Type":"ContainerStarted","Data":"3dc8b97fc27ce344eb1d165c12f305e732d4b687733db24ff631b20f3f48524a"} Oct 02 12:34:07 crc kubenswrapper[4766]: I1002 12:34:07.672582 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-r2wtj" podStartSLOduration=1.988061064 podStartE2EDuration="15.672554914s" podCreationTimestamp="2025-10-02 12:33:52 +0000 UTC" firstStartedPulling="2025-10-02 12:33:52.95583267 +0000 UTC m=+6147.898703614" lastFinishedPulling="2025-10-02 12:34:06.64032652 +0000 UTC m=+6161.583197464" observedRunningTime="2025-10-02 12:34:07.666613654 +0000 UTC m=+6162.609484608" watchObservedRunningTime="2025-10-02 12:34:07.672554914 +0000 UTC m=+6162.615425858" Oct 02 12:34:08 crc kubenswrapper[4766]: I1002 12:34:08.643061 4766 generic.go:334] "Generic (PLEG): container finished" podID="edab4cbb-3e4f-4a92-9045-8f239ecf24bf" containerID="796bf35ff7adb487f210cba8ffd1ab21efbf9b6e9432bb83bc39acf1ed73fc07" exitCode=0 Oct 02 12:34:08 crc kubenswrapper[4766]: I1002 12:34:08.643304 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-qntgm" event={"ID":"edab4cbb-3e4f-4a92-9045-8f239ecf24bf","Type":"ContainerDied","Data":"796bf35ff7adb487f210cba8ffd1ab21efbf9b6e9432bb83bc39acf1ed73fc07"} Oct 02 12:34:08 crc kubenswrapper[4766]: I1002 12:34:08.646211 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-cqtgm" event={"ID":"888c9f82-8929-4dda-b89a-cbd917f2026d","Type":"ContainerStarted","Data":"0b299ba4bf0a05ee6b05e46f7643d39e90f27678875409036ce76213784b2880"} Oct 02 12:34:08 crc kubenswrapper[4766]: I1002 12:34:08.817988 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-7j2sx"] Oct 02 12:34:08 crc kubenswrapper[4766]: E1002 12:34:08.819864 4766 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4" containerName="octavia-db-sync" Oct 02 12:34:08 crc kubenswrapper[4766]: I1002 12:34:08.819893 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4" containerName="octavia-db-sync" Oct 02 12:34:08 crc kubenswrapper[4766]: E1002 12:34:08.819927 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4" containerName="init" Oct 02 12:34:08 crc kubenswrapper[4766]: I1002 12:34:08.819933 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4" containerName="init" Oct 02 12:34:08 crc kubenswrapper[4766]: I1002 12:34:08.820194 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4" containerName="octavia-db-sync" Oct 02 12:34:08 crc kubenswrapper[4766]: I1002 12:34:08.821435 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:08 crc kubenswrapper[4766]: I1002 12:34:08.826411 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Oct 02 12:34:08 crc kubenswrapper[4766]: I1002 12:34:08.827075 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Oct 02 12:34:08 crc kubenswrapper[4766]: I1002 12:34:08.832483 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-7j2sx"] Oct 02 12:34:08 crc kubenswrapper[4766]: I1002 12:34:08.938445 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/958c4797-4e76-4c64-9e99-8059508526c6-scripts\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:08 crc kubenswrapper[4766]: I1002 12:34:08.938663 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/958c4797-4e76-4c64-9e99-8059508526c6-config-data-merged\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:08 crc kubenswrapper[4766]: I1002 12:34:08.938991 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958c4797-4e76-4c64-9e99-8059508526c6-combined-ca-bundle\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:08 crc kubenswrapper[4766]: I1002 12:34:08.939096 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/958c4797-4e76-4c64-9e99-8059508526c6-hm-ports\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:08 crc kubenswrapper[4766]: I1002 12:34:08.939145 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958c4797-4e76-4c64-9e99-8059508526c6-config-data\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:08 crc kubenswrapper[4766]: 
I1002 12:34:08.939859 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/958c4797-4e76-4c64-9e99-8059508526c6-amphora-certs\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:09 crc kubenswrapper[4766]: I1002 12:34:09.042420 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/958c4797-4e76-4c64-9e99-8059508526c6-config-data-merged\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:09 crc kubenswrapper[4766]: I1002 12:34:09.042526 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958c4797-4e76-4c64-9e99-8059508526c6-combined-ca-bundle\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:09 crc kubenswrapper[4766]: I1002 12:34:09.042556 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/958c4797-4e76-4c64-9e99-8059508526c6-hm-ports\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:09 crc kubenswrapper[4766]: I1002 12:34:09.042582 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958c4797-4e76-4c64-9e99-8059508526c6-config-data\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:09 crc kubenswrapper[4766]: I1002 12:34:09.042644 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/958c4797-4e76-4c64-9e99-8059508526c6-amphora-certs\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:09 crc kubenswrapper[4766]: I1002 12:34:09.043717 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/958c4797-4e76-4c64-9e99-8059508526c6-config-data-merged\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:09 crc kubenswrapper[4766]: I1002 12:34:09.044010 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/958c4797-4e76-4c64-9e99-8059508526c6-hm-ports\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:09 crc kubenswrapper[4766]: I1002 12:34:09.044043 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/958c4797-4e76-4c64-9e99-8059508526c6-scripts\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:09 crc kubenswrapper[4766]: I1002 12:34:09.050527 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/958c4797-4e76-4c64-9e99-8059508526c6-amphora-certs\") pod \"octavia-worker-7j2sx\" 
(UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:09 crc kubenswrapper[4766]: I1002 12:34:09.050695 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/958c4797-4e76-4c64-9e99-8059508526c6-config-data\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:09 crc kubenswrapper[4766]: I1002 12:34:09.051206 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/958c4797-4e76-4c64-9e99-8059508526c6-scripts\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:09 crc kubenswrapper[4766]: I1002 12:34:09.061486 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/958c4797-4e76-4c64-9e99-8059508526c6-combined-ca-bundle\") pod \"octavia-worker-7j2sx\" (UID: \"958c4797-4e76-4c64-9e99-8059508526c6\") " pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:09 crc kubenswrapper[4766]: I1002 12:34:09.147145 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:10 crc kubenswrapper[4766]: I1002 12:34:10.036799 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-7j2sx"] Oct 02 12:34:10 crc kubenswrapper[4766]: W1002 12:34:10.037210 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod958c4797_4e76_4c64_9e99_8059508526c6.slice/crio-6cabf0c90e81e281b73b9e7ea8f31303eefa265f02180ee041a10599c6ea4e4f WatchSource:0}: Error finding container 6cabf0c90e81e281b73b9e7ea8f31303eefa265f02180ee041a10599c6ea4e4f: Status 404 returned error can't find the container with id 6cabf0c90e81e281b73b9e7ea8f31303eefa265f02180ee041a10599c6ea4e4f Oct 02 12:34:10 crc kubenswrapper[4766]: I1002 12:34:10.699166 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-7j2sx" event={"ID":"958c4797-4e76-4c64-9e99-8059508526c6","Type":"ContainerStarted","Data":"6cabf0c90e81e281b73b9e7ea8f31303eefa265f02180ee041a10599c6ea4e4f"} Oct 02 12:34:10 crc kubenswrapper[4766]: I1002 12:34:10.702232 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-t6x4h" event={"ID":"0c16aa8e-cc08-497e-8ad3-db18bbc82afd","Type":"ContainerStarted","Data":"ad8dfe83b5d971778ea1ae4afc58807380afcd55f48079c42608a5c0f97b9cac"} Oct 02 12:34:11 crc kubenswrapper[4766]: E1002 12:34:11.391465 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c16aa8e_cc08_497e_8ad3_db18bbc82afd.slice/crio-conmon-ad8dfe83b5d971778ea1ae4afc58807380afcd55f48079c42608a5c0f97b9cac.scope\": RecentStats: unable to find data in memory cache]" Oct 02 12:34:11 crc kubenswrapper[4766]: I1002 12:34:11.723450 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-qntgm" event={"ID":"edab4cbb-3e4f-4a92-9045-8f239ecf24bf","Type":"ContainerStarted","Data":"383004f972090951e5ac23a1d8dc273087c1f0bfd4ca7cf0e9d677e8a3fbf8a6"} Oct 02 12:34:11 crc kubenswrapper[4766]: I1002 12:34:11.727128 4766 generic.go:334] "Generic (PLEG): container finished" 
podID="888c9f82-8929-4dda-b89a-cbd917f2026d" containerID="0b299ba4bf0a05ee6b05e46f7643d39e90f27678875409036ce76213784b2880" exitCode=0 Oct 02 12:34:11 crc kubenswrapper[4766]: I1002 12:34:11.727221 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-cqtgm" event={"ID":"888c9f82-8929-4dda-b89a-cbd917f2026d","Type":"ContainerDied","Data":"0b299ba4bf0a05ee6b05e46f7643d39e90f27678875409036ce76213784b2880"} Oct 02 12:34:11 crc kubenswrapper[4766]: I1002 12:34:11.733050 4766 generic.go:334] "Generic (PLEG): container finished" podID="0c16aa8e-cc08-497e-8ad3-db18bbc82afd" containerID="ad8dfe83b5d971778ea1ae4afc58807380afcd55f48079c42608a5c0f97b9cac" exitCode=0 Oct 02 12:34:11 crc kubenswrapper[4766]: I1002 12:34:11.733124 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-t6x4h" event={"ID":"0c16aa8e-cc08-497e-8ad3-db18bbc82afd","Type":"ContainerDied","Data":"ad8dfe83b5d971778ea1ae4afc58807380afcd55f48079c42608a5c0f97b9cac"} Oct 02 12:34:11 crc kubenswrapper[4766]: I1002 12:34:11.759898 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-qntgm" podStartSLOduration=2.572385923 podStartE2EDuration="18.759867169s" podCreationTimestamp="2025-10-02 12:33:53 +0000 UTC" firstStartedPulling="2025-10-02 12:33:54.515241501 +0000 UTC m=+6149.458112445" lastFinishedPulling="2025-10-02 12:34:10.702722747 +0000 UTC m=+6165.645593691" observedRunningTime="2025-10-02 12:34:11.744562219 +0000 UTC m=+6166.687433173" watchObservedRunningTime="2025-10-02 12:34:11.759867169 +0000 UTC m=+6166.702738113" Oct 02 12:34:12 crc kubenswrapper[4766]: I1002 12:34:12.746527 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-t6x4h" event={"ID":"0c16aa8e-cc08-497e-8ad3-db18bbc82afd","Type":"ContainerStarted","Data":"e4d5aa711f0d3395f0f8e5b1674dfffe89227dbe5ab157c0dacb86cdfbdac61a"} Oct 02 12:34:12 crc kubenswrapper[4766]: I1002 12:34:12.747355 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:12 crc kubenswrapper[4766]: I1002 12:34:12.749830 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-cqtgm" event={"ID":"888c9f82-8929-4dda-b89a-cbd917f2026d","Type":"ContainerStarted","Data":"b0b0bcdc020a9593c4a52c173bb62a8f829d199ad3532e6804216f05832be3c1"} Oct 02 12:34:12 crc kubenswrapper[4766]: I1002 12:34:12.750093 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:12 crc kubenswrapper[4766]: I1002 12:34:12.752613 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-7j2sx" event={"ID":"958c4797-4e76-4c64-9e99-8059508526c6","Type":"ContainerStarted","Data":"9f784efcb5f95ec370b9a2e4fe9cd90be20e9139beb007cff635ed039c841a3e"} Oct 02 12:34:12 crc kubenswrapper[4766]: I1002 12:34:12.792591 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-t6x4h" podStartSLOduration=7.821810191 podStartE2EDuration="9.792557948s" podCreationTimestamp="2025-10-02 12:34:03 +0000 UTC" firstStartedPulling="2025-10-02 12:34:07.410217911 +0000 UTC m=+6162.353088855" lastFinishedPulling="2025-10-02 12:34:09.380965668 +0000 UTC m=+6164.323836612" observedRunningTime="2025-10-02 12:34:12.774312754 +0000 UTC m=+6167.717183728" watchObservedRunningTime="2025-10-02 12:34:12.792557948 +0000 UTC 
m=+6167.735428892" Oct 02 12:34:12 crc kubenswrapper[4766]: I1002 12:34:12.836472 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-cqtgm" podStartSLOduration=8.836444374 podStartE2EDuration="8.836444374s" podCreationTimestamp="2025-10-02 12:34:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:34:12.823717826 +0000 UTC m=+6167.766588770" watchObservedRunningTime="2025-10-02 12:34:12.836444374 +0000 UTC m=+6167.779315308" Oct 02 12:34:13 crc kubenswrapper[4766]: I1002 12:34:13.767746 4766 generic.go:334] "Generic (PLEG): container finished" podID="958c4797-4e76-4c64-9e99-8059508526c6" containerID="9f784efcb5f95ec370b9a2e4fe9cd90be20e9139beb007cff635ed039c841a3e" exitCode=0 Oct 02 12:34:13 crc kubenswrapper[4766]: I1002 12:34:13.769976 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-7j2sx" event={"ID":"958c4797-4e76-4c64-9e99-8059508526c6","Type":"ContainerDied","Data":"9f784efcb5f95ec370b9a2e4fe9cd90be20e9139beb007cff635ed039c841a3e"} Oct 02 12:34:14 crc kubenswrapper[4766]: I1002 12:34:14.783137 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-7j2sx" event={"ID":"958c4797-4e76-4c64-9e99-8059508526c6","Type":"ContainerStarted","Data":"9661bca7f3181ee8bce10faaebd76df9a4fd8eec6a73ab330d4e8b229d1deb4c"} Oct 02 12:34:14 crc kubenswrapper[4766]: I1002 12:34:14.783804 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:14 crc kubenswrapper[4766]: I1002 12:34:14.809785 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-7j2sx" podStartSLOduration=5.343040201 podStartE2EDuration="6.809760603s" podCreationTimestamp="2025-10-02 12:34:08 +0000 UTC" firstStartedPulling="2025-10-02 12:34:10.040704261 +0000 UTC m=+6164.983575205" lastFinishedPulling="2025-10-02 12:34:11.507424663 +0000 UTC m=+6166.450295607" observedRunningTime="2025-10-02 12:34:14.802915593 +0000 UTC m=+6169.745786537" watchObservedRunningTime="2025-10-02 12:34:14.809760603 +0000 UTC m=+6169.752631547" Oct 02 12:34:16 crc kubenswrapper[4766]: I1002 12:34:16.058361 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-9sl5k"] Oct 02 12:34:16 crc kubenswrapper[4766]: I1002 12:34:16.068399 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-9sl5k"] Oct 02 12:34:17 crc kubenswrapper[4766]: I1002 12:34:17.906556 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcf42f52-167c-4156-81a6-931d97a25017" path="/var/lib/kubelet/pods/fcf42f52-167c-4156-81a6-931d97a25017/volumes" Oct 02 12:34:18 crc kubenswrapper[4766]: I1002 12:34:18.446181 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-t6x4h" Oct 02 12:34:20 crc kubenswrapper[4766]: I1002 12:34:20.164903 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-cqtgm" Oct 02 12:34:22 crc kubenswrapper[4766]: I1002 12:34:22.461488 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-r2wtj" Oct 02 12:34:24 crc kubenswrapper[4766]: I1002 12:34:24.181676 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-7j2sx" Oct 02 12:34:24 crc 
kubenswrapper[4766]: I1002 12:34:24.432330 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:34:24 crc kubenswrapper[4766]: I1002 12:34:24.432428 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:34:24 crc kubenswrapper[4766]: I1002 12:34:24.432497 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 12:34:24 crc kubenswrapper[4766]: I1002 12:34:24.433567 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d94a111146843f4af084462f95b5d87bd572303698d956b1f0dbe284471be49c"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:34:24 crc kubenswrapper[4766]: I1002 12:34:24.433638 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://d94a111146843f4af084462f95b5d87bd572303698d956b1f0dbe284471be49c" gracePeriod=600 Oct 02 12:34:24 crc kubenswrapper[4766]: I1002 12:34:24.914283 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="d94a111146843f4af084462f95b5d87bd572303698d956b1f0dbe284471be49c" exitCode=0 Oct 02 12:34:24 crc kubenswrapper[4766]: I1002 12:34:24.914364 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"d94a111146843f4af084462f95b5d87bd572303698d956b1f0dbe284471be49c"} Oct 02 12:34:24 crc kubenswrapper[4766]: I1002 12:34:24.914862 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b"} Oct 02 12:34:24 crc kubenswrapper[4766]: I1002 12:34:24.914889 4766 scope.go:117] "RemoveContainer" containerID="f02842b8c9eb6febfb992c73a60dead7e4a40ff0e104397e272d1050593c0984" Oct 02 12:34:27 crc kubenswrapper[4766]: I1002 12:34:27.042439 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-7a42-account-create-m774k"] Oct 02 12:34:27 crc kubenswrapper[4766]: I1002 12:34:27.056149 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-7a42-account-create-m774k"] Oct 02 12:34:27 crc kubenswrapper[4766]: I1002 12:34:27.896668 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="723650c5-e986-47e1-adc6-bf11bb38b84c" path="/var/lib/kubelet/pods/723650c5-e986-47e1-adc6-bf11bb38b84c/volumes" Oct 02 12:34:31 crc kubenswrapper[4766]: I1002 12:34:31.022242 4766 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-qntgm"] Oct 02 12:34:31 crc kubenswrapper[4766]: I1002 12:34:31.023230 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-qntgm" podUID="edab4cbb-3e4f-4a92-9045-8f239ecf24bf" containerName="octavia-amphora-httpd" containerID="cri-o://383004f972090951e5ac23a1d8dc273087c1f0bfd4ca7cf0e9d677e8a3fbf8a6" gracePeriod=30 Oct 02 12:34:31 crc kubenswrapper[4766]: I1002 12:34:31.667829 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-qntgm" Oct 02 12:34:31 crc kubenswrapper[4766]: I1002 12:34:31.725064 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/edab4cbb-3e4f-4a92-9045-8f239ecf24bf-httpd-config\") pod \"edab4cbb-3e4f-4a92-9045-8f239ecf24bf\" (UID: \"edab4cbb-3e4f-4a92-9045-8f239ecf24bf\") " Oct 02 12:34:31 crc kubenswrapper[4766]: I1002 12:34:31.725753 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/edab4cbb-3e4f-4a92-9045-8f239ecf24bf-amphora-image\") pod \"edab4cbb-3e4f-4a92-9045-8f239ecf24bf\" (UID: \"edab4cbb-3e4f-4a92-9045-8f239ecf24bf\") " Oct 02 12:34:31 crc kubenswrapper[4766]: I1002 12:34:31.765905 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edab4cbb-3e4f-4a92-9045-8f239ecf24bf-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "edab4cbb-3e4f-4a92-9045-8f239ecf24bf" (UID: "edab4cbb-3e4f-4a92-9045-8f239ecf24bf"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:34:31 crc kubenswrapper[4766]: I1002 12:34:31.798763 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edab4cbb-3e4f-4a92-9045-8f239ecf24bf-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "edab4cbb-3e4f-4a92-9045-8f239ecf24bf" (UID: "edab4cbb-3e4f-4a92-9045-8f239ecf24bf"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:34:31 crc kubenswrapper[4766]: I1002 12:34:31.829318 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/edab4cbb-3e4f-4a92-9045-8f239ecf24bf-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:34:31 crc kubenswrapper[4766]: I1002 12:34:31.829374 4766 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/edab4cbb-3e4f-4a92-9045-8f239ecf24bf-amphora-image\") on node \"crc\" DevicePath \"\"" Oct 02 12:34:32 crc kubenswrapper[4766]: E1002 12:34:32.023817 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedab4cbb_3e4f_4a92_9045_8f239ecf24bf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedab4cbb_3e4f_4a92_9045_8f239ecf24bf.slice/crio-63b8a36d31d01d6c9663a8b8b9b1de6e87477e3a5ae76f4adc382251b7986600\": RecentStats: unable to find data in memory cache]" Oct 02 12:34:32 crc kubenswrapper[4766]: I1002 12:34:32.032582 4766 generic.go:334] "Generic (PLEG): container finished" podID="edab4cbb-3e4f-4a92-9045-8f239ecf24bf" containerID="383004f972090951e5ac23a1d8dc273087c1f0bfd4ca7cf0e9d677e8a3fbf8a6" exitCode=0 Oct 02 12:34:32 crc kubenswrapper[4766]: I1002 12:34:32.032634 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-qntgm" event={"ID":"edab4cbb-3e4f-4a92-9045-8f239ecf24bf","Type":"ContainerDied","Data":"383004f972090951e5ac23a1d8dc273087c1f0bfd4ca7cf0e9d677e8a3fbf8a6"} Oct 02 12:34:32 crc kubenswrapper[4766]: I1002 12:34:32.032718 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-qntgm" event={"ID":"edab4cbb-3e4f-4a92-9045-8f239ecf24bf","Type":"ContainerDied","Data":"63b8a36d31d01d6c9663a8b8b9b1de6e87477e3a5ae76f4adc382251b7986600"} Oct 02 12:34:32 crc kubenswrapper[4766]: I1002 12:34:32.032744 4766 scope.go:117] "RemoveContainer" containerID="383004f972090951e5ac23a1d8dc273087c1f0bfd4ca7cf0e9d677e8a3fbf8a6" Oct 02 12:34:32 crc kubenswrapper[4766]: I1002 12:34:32.033051 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-qntgm" Oct 02 12:34:32 crc kubenswrapper[4766]: I1002 12:34:32.050875 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-wq959"] Oct 02 12:34:32 crc kubenswrapper[4766]: I1002 12:34:32.066015 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-wq959"] Oct 02 12:34:32 crc kubenswrapper[4766]: I1002 12:34:32.070076 4766 scope.go:117] "RemoveContainer" containerID="796bf35ff7adb487f210cba8ffd1ab21efbf9b6e9432bb83bc39acf1ed73fc07" Oct 02 12:34:32 crc kubenswrapper[4766]: I1002 12:34:32.085665 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-qntgm"] Oct 02 12:34:32 crc kubenswrapper[4766]: I1002 12:34:32.098313 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-qntgm"] Oct 02 12:34:32 crc kubenswrapper[4766]: I1002 12:34:32.107683 4766 scope.go:117] "RemoveContainer" containerID="383004f972090951e5ac23a1d8dc273087c1f0bfd4ca7cf0e9d677e8a3fbf8a6" Oct 02 12:34:32 crc kubenswrapper[4766]: E1002 12:34:32.108332 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"383004f972090951e5ac23a1d8dc273087c1f0bfd4ca7cf0e9d677e8a3fbf8a6\": container with ID starting with 383004f972090951e5ac23a1d8dc273087c1f0bfd4ca7cf0e9d677e8a3fbf8a6 not found: ID does not exist" containerID="383004f972090951e5ac23a1d8dc273087c1f0bfd4ca7cf0e9d677e8a3fbf8a6" Oct 02 12:34:32 crc kubenswrapper[4766]: I1002 12:34:32.108381 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"383004f972090951e5ac23a1d8dc273087c1f0bfd4ca7cf0e9d677e8a3fbf8a6"} err="failed to get container status \"383004f972090951e5ac23a1d8dc273087c1f0bfd4ca7cf0e9d677e8a3fbf8a6\": rpc error: code = NotFound desc = could not find container \"383004f972090951e5ac23a1d8dc273087c1f0bfd4ca7cf0e9d677e8a3fbf8a6\": container with ID starting with 383004f972090951e5ac23a1d8dc273087c1f0bfd4ca7cf0e9d677e8a3fbf8a6 not found: ID does not exist" Oct 02 12:34:32 crc kubenswrapper[4766]: I1002 12:34:32.108406 4766 scope.go:117] "RemoveContainer" containerID="796bf35ff7adb487f210cba8ffd1ab21efbf9b6e9432bb83bc39acf1ed73fc07" Oct 02 12:34:32 crc kubenswrapper[4766]: E1002 12:34:32.108892 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"796bf35ff7adb487f210cba8ffd1ab21efbf9b6e9432bb83bc39acf1ed73fc07\": container with ID starting with 796bf35ff7adb487f210cba8ffd1ab21efbf9b6e9432bb83bc39acf1ed73fc07 not found: ID does not exist" containerID="796bf35ff7adb487f210cba8ffd1ab21efbf9b6e9432bb83bc39acf1ed73fc07" Oct 02 12:34:32 crc kubenswrapper[4766]: I1002 12:34:32.108911 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"796bf35ff7adb487f210cba8ffd1ab21efbf9b6e9432bb83bc39acf1ed73fc07"} err="failed to get container status \"796bf35ff7adb487f210cba8ffd1ab21efbf9b6e9432bb83bc39acf1ed73fc07\": rpc error: code = NotFound desc = could not find container \"796bf35ff7adb487f210cba8ffd1ab21efbf9b6e9432bb83bc39acf1ed73fc07\": container with ID starting with 796bf35ff7adb487f210cba8ffd1ab21efbf9b6e9432bb83bc39acf1ed73fc07 not found: ID does not exist" Oct 02 12:34:33 crc kubenswrapper[4766]: I1002 12:34:33.898639 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
Oct 02 12:34:33 crc kubenswrapper[4766]: I1002 12:34:33.900595 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edab4cbb-3e4f-4a92-9045-8f239ecf24bf" path="/var/lib/kubelet/pods/edab4cbb-3e4f-4a92-9045-8f239ecf24bf/volumes"
Oct 02 12:34:36 crc kubenswrapper[4766]: I1002 12:34:36.257076 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-2l26f"]
Oct 02 12:34:36 crc kubenswrapper[4766]: E1002 12:34:36.258152 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edab4cbb-3e4f-4a92-9045-8f239ecf24bf" containerName="octavia-amphora-httpd"
Oct 02 12:34:36 crc kubenswrapper[4766]: I1002 12:34:36.258176 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="edab4cbb-3e4f-4a92-9045-8f239ecf24bf" containerName="octavia-amphora-httpd"
Oct 02 12:34:36 crc kubenswrapper[4766]: E1002 12:34:36.258192 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edab4cbb-3e4f-4a92-9045-8f239ecf24bf" containerName="init"
Oct 02 12:34:36 crc kubenswrapper[4766]: I1002 12:34:36.258200 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="edab4cbb-3e4f-4a92-9045-8f239ecf24bf" containerName="init"
Oct 02 12:34:36 crc kubenswrapper[4766]: I1002 12:34:36.258541 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="edab4cbb-3e4f-4a92-9045-8f239ecf24bf" containerName="octavia-amphora-httpd"
Oct 02 12:34:36 crc kubenswrapper[4766]: I1002 12:34:36.260047 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-2l26f"
Oct 02 12:34:36 crc kubenswrapper[4766]: I1002 12:34:36.263202 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data"
Oct 02 12:34:36 crc kubenswrapper[4766]: I1002 12:34:36.280388 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-2l26f"]
Oct 02 12:34:36 crc kubenswrapper[4766]: I1002 12:34:36.438002 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16a82893-bce9-4426-ba47-7da418e9ba66-httpd-config\") pod \"octavia-image-upload-59f8cff499-2l26f\" (UID: \"16a82893-bce9-4426-ba47-7da418e9ba66\") " pod="openstack/octavia-image-upload-59f8cff499-2l26f"
Oct 02 12:34:36 crc kubenswrapper[4766]: I1002 12:34:36.438077 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/16a82893-bce9-4426-ba47-7da418e9ba66-amphora-image\") pod \"octavia-image-upload-59f8cff499-2l26f\" (UID: \"16a82893-bce9-4426-ba47-7da418e9ba66\") " pod="openstack/octavia-image-upload-59f8cff499-2l26f"
Oct 02 12:34:36 crc kubenswrapper[4766]: I1002 12:34:36.540904 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16a82893-bce9-4426-ba47-7da418e9ba66-httpd-config\") pod \"octavia-image-upload-59f8cff499-2l26f\" (UID: \"16a82893-bce9-4426-ba47-7da418e9ba66\") " pod="openstack/octavia-image-upload-59f8cff499-2l26f"
Oct 02 12:34:36 crc kubenswrapper[4766]: I1002 12:34:36.540975 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/16a82893-bce9-4426-ba47-7da418e9ba66-amphora-image\") pod \"octavia-image-upload-59f8cff499-2l26f\" (UID: \"16a82893-bce9-4426-ba47-7da418e9ba66\") " pod="openstack/octavia-image-upload-59f8cff499-2l26f"
Oct 02 12:34:36 crc kubenswrapper[4766]: I1002 12:34:36.541656 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/16a82893-bce9-4426-ba47-7da418e9ba66-amphora-image\") pod \"octavia-image-upload-59f8cff499-2l26f\" (UID: \"16a82893-bce9-4426-ba47-7da418e9ba66\") " pod="openstack/octavia-image-upload-59f8cff499-2l26f"
Oct 02 12:34:36 crc kubenswrapper[4766]: I1002 12:34:36.555722 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/16a82893-bce9-4426-ba47-7da418e9ba66-httpd-config\") pod \"octavia-image-upload-59f8cff499-2l26f\" (UID: \"16a82893-bce9-4426-ba47-7da418e9ba66\") " pod="openstack/octavia-image-upload-59f8cff499-2l26f"
Oct 02 12:34:36 crc kubenswrapper[4766]: I1002 12:34:36.611545 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-2l26f"
Oct 02 12:34:36 crc kubenswrapper[4766]: I1002 12:34:36.943640 4766 scope.go:117] "RemoveContainer" containerID="c62bec45e41c683cb15e79bf201131637cef78af02e507b6817e786cb4106d29"
Oct 02 12:34:36 crc kubenswrapper[4766]: I1002 12:34:36.969355 4766 scope.go:117] "RemoveContainer" containerID="d6a55397fb39c5e4d77dc188c1335d17402b7ee6dc6da06eb032277a50320162"
Oct 02 12:34:37 crc kubenswrapper[4766]: I1002 12:34:37.026708 4766 scope.go:117] "RemoveContainer" containerID="73f38845f3f0c4f3c2a10872d1205ad9ce3d06f4bb05ffeb25129dcc07cc73d7"
Oct 02 12:34:37 crc kubenswrapper[4766]: I1002 12:34:37.129198 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-2l26f"]
Oct 02 12:34:37 crc kubenswrapper[4766]: W1002 12:34:37.141968 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16a82893_bce9_4426_ba47_7da418e9ba66.slice/crio-3be1a3ef38e32444bcdfa37d3c35026a42a91d4fb5cd9ebcfb52a1f41b0d7e2a WatchSource:0}: Error finding container 3be1a3ef38e32444bcdfa37d3c35026a42a91d4fb5cd9ebcfb52a1f41b0d7e2a: Status 404 returned error can't find the container with id 3be1a3ef38e32444bcdfa37d3c35026a42a91d4fb5cd9ebcfb52a1f41b0d7e2a
Oct 02 12:34:38 crc kubenswrapper[4766]: I1002 12:34:38.150398 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-2l26f" event={"ID":"16a82893-bce9-4426-ba47-7da418e9ba66","Type":"ContainerStarted","Data":"ba8bc4e2b8a342323fa44a1cdbb7741c3270b776a91abfad23f529d3b1d894fd"}
Oct 02 12:34:38 crc kubenswrapper[4766]: I1002 12:34:38.150910 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-2l26f" event={"ID":"16a82893-bce9-4426-ba47-7da418e9ba66","Type":"ContainerStarted","Data":"3be1a3ef38e32444bcdfa37d3c35026a42a91d4fb5cd9ebcfb52a1f41b0d7e2a"}
Oct 02 12:34:39 crc kubenswrapper[4766]: I1002 12:34:39.167508 4766 generic.go:334] "Generic (PLEG): container finished" podID="16a82893-bce9-4426-ba47-7da418e9ba66" containerID="ba8bc4e2b8a342323fa44a1cdbb7741c3270b776a91abfad23f529d3b1d894fd" exitCode=0
Oct 02 12:34:39 crc kubenswrapper[4766]: I1002 12:34:39.167804 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-2l26f" event={"ID":"16a82893-bce9-4426-ba47-7da418e9ba66","Type":"ContainerDied","Data":"ba8bc4e2b8a342323fa44a1cdbb7741c3270b776a91abfad23f529d3b1d894fd"}
Oct 02 12:34:42 crc kubenswrapper[4766]: I1002 12:34:42.202840 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-2l26f" event={"ID":"16a82893-bce9-4426-ba47-7da418e9ba66","Type":"ContainerStarted","Data":"007618ce18dd78b6657496539bed92d6bb894f2c9cace7ea8aa92661f6a47070"}
Oct 02 12:34:42 crc kubenswrapper[4766]: I1002 12:34:42.232717 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-2l26f" podStartSLOduration=2.246366162 podStartE2EDuration="6.232686691s" podCreationTimestamp="2025-10-02 12:34:36 +0000 UTC" firstStartedPulling="2025-10-02 12:34:37.15865774 +0000 UTC m=+6192.101528684" lastFinishedPulling="2025-10-02 12:34:41.144978269 +0000 UTC m=+6196.087849213" observedRunningTime="2025-10-02 12:34:42.217841745 +0000 UTC m=+6197.160712689" watchObservedRunningTime="2025-10-02 12:34:42.232686691 +0000 UTC m=+6197.175557635"
Oct 02 12:35:03 crc kubenswrapper[4766]: I1002 12:35:03.056748 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-nbgf2"]
Oct 02 12:35:03 crc kubenswrapper[4766]: I1002 12:35:03.067277 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-nbgf2"]
Oct 02 12:35:03 crc kubenswrapper[4766]: I1002 12:35:03.897758 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7999e98f-03cb-469a-8187-af8698bb975e" path="/var/lib/kubelet/pods/7999e98f-03cb-469a-8187-af8698bb975e/volumes"
Oct 02 12:35:13 crc kubenswrapper[4766]: I1002 12:35:13.052356 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-3461-account-create-lc4f5"]
Oct 02 12:35:13 crc kubenswrapper[4766]: I1002 12:35:13.062937 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-3461-account-create-lc4f5"]
Oct 02 12:35:13 crc kubenswrapper[4766]: I1002 12:35:13.902897 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8" path="/var/lib/kubelet/pods/789f25c6-bb9c-43cd-8ad2-b0a4c1d8b1a8/volumes"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.734870 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-575f5495ff-tkktv"]
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.739720 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-575f5495ff-tkktv"
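The startup-latency entry above for octavia-image-upload-59f8cff499-2l26f carries its own arithmetic: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (6.232686691s), and podStartSLOduration excludes the image-pull window, lastFinishedPulling minus firstStartedPulling (3.986320529s), leaving 2.246366162. A small self-contained Go check of those numbers, with the timestamps copied from the entry and the monotonic "m=+..." suffixes dropped:

    package main

    import (
        "fmt"
        "time"
    )

    // layout matches how klog prints time.Time values in these entries.
    const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

    func parse(s string) time.Time {
        t, err := time.Parse(layout, s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        created := parse("2025-10-02 12:34:36 +0000 UTC")
        firstPull := parse("2025-10-02 12:34:37.15865774 +0000 UTC")
        lastPull := parse("2025-10-02 12:34:41.144978269 +0000 UTC")
        watchRunning := parse("2025-10-02 12:34:42.232686691 +0000 UTC")

        e2e := watchRunning.Sub(created)     // podStartE2EDuration: 6.232686691s
        slo := e2e - lastPull.Sub(firstPull) // minus the pull window: 2.246366162s
        fmt.Println("podStartE2EDuration =", e2e)
        fmt.Println("podStartSLOduration =", slo)
    }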
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.747419 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.748038 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.747714 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-nlxhj"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.748455 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.786402 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-575f5495ff-tkktv"]
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.838792 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.839078 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9daa02cf-4179-422d-bbf1-eb56fecdaa2e" containerName="glance-log" containerID="cri-o://1929510534d2b26bef2c41e543ec440a13c4ba67b16dcaf7168c5dc77e27373f" gracePeriod=30
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.839243 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9daa02cf-4179-422d-bbf1-eb56fecdaa2e" containerName="glance-httpd" containerID="cri-o://5510a5ae64fb788bc5c47e75ccb77eb054581bf32a617929d26866146d7f5cec" gracePeriod=30
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.853209 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc5xp\" (UniqueName: \"kubernetes.io/projected/df4ca968-dcb2-434e-8442-81e871efe544-kube-api-access-wc5xp\") pod \"horizon-575f5495ff-tkktv\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") " pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.853293 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df4ca968-dcb2-434e-8442-81e871efe544-logs\") pod \"horizon-575f5495ff-tkktv\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") " pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.853409 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df4ca968-dcb2-434e-8442-81e871efe544-config-data\") pod \"horizon-575f5495ff-tkktv\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") " pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.853560 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df4ca968-dcb2-434e-8442-81e871efe544-scripts\") pod \"horizon-575f5495ff-tkktv\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") " pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.853668 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/df4ca968-dcb2-434e-8442-81e871efe544-horizon-secret-key\") pod \"horizon-575f5495ff-tkktv\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") " pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.911346 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.911710 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" containerName="glance-log" containerID="cri-o://fbc8d81cae7d71fa8a46628e8d150a14dfb67b400bf9f6724d0672397f9f3127" gracePeriod=30
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.912384 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" containerName="glance-httpd" containerID="cri-o://6625b9274663fec0be311dbcf60adc35c30262d032c6554770c42479e46b47ed" gracePeriod=30
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.956046 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df4ca968-dcb2-434e-8442-81e871efe544-scripts\") pod \"horizon-575f5495ff-tkktv\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") " pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.956227 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/df4ca968-dcb2-434e-8442-81e871efe544-horizon-secret-key\") pod \"horizon-575f5495ff-tkktv\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") " pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.956350 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc5xp\" (UniqueName: \"kubernetes.io/projected/df4ca968-dcb2-434e-8442-81e871efe544-kube-api-access-wc5xp\") pod \"horizon-575f5495ff-tkktv\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") " pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.956395 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df4ca968-dcb2-434e-8442-81e871efe544-logs\") pod \"horizon-575f5495ff-tkktv\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") " pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.956435 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df4ca968-dcb2-434e-8442-81e871efe544-config-data\") pod \"horizon-575f5495ff-tkktv\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") " pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.957151 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5464ff6cd7-twkzt"]
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.958334 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df4ca968-dcb2-434e-8442-81e871efe544-config-data\") pod \"horizon-575f5495ff-tkktv\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") " pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.959310 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.960000 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df4ca968-dcb2-434e-8442-81e871efe544-scripts\") pod \"horizon-575f5495ff-tkktv\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") " pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.961727 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df4ca968-dcb2-434e-8442-81e871efe544-logs\") pod \"horizon-575f5495ff-tkktv\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") " pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.976107 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/df4ca968-dcb2-434e-8442-81e871efe544-horizon-secret-key\") pod \"horizon-575f5495ff-tkktv\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") " pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:35:19 crc kubenswrapper[4766]: I1002 12:35:19.989065 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc5xp\" (UniqueName: \"kubernetes.io/projected/df4ca968-dcb2-434e-8442-81e871efe544-kube-api-access-wc5xp\") pod \"horizon-575f5495ff-tkktv\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") " pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.010223 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5464ff6cd7-twkzt"]
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.058614 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/058e43e2-30b3-46c3-993d-2ae5b1e076fb-horizon-secret-key\") pod \"horizon-5464ff6cd7-twkzt\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.058684 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s498n\" (UniqueName: \"kubernetes.io/projected/058e43e2-30b3-46c3-993d-2ae5b1e076fb-kube-api-access-s498n\") pod \"horizon-5464ff6cd7-twkzt\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.058752 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058e43e2-30b3-46c3-993d-2ae5b1e076fb-logs\") pod \"horizon-5464ff6cd7-twkzt\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.058790 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/058e43e2-30b3-46c3-993d-2ae5b1e076fb-scripts\") pod \"horizon-5464ff6cd7-twkzt\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.058861 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/058e43e2-30b3-46c3-993d-2ae5b1e076fb-config-data\") pod \"horizon-5464ff6cd7-twkzt\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.072999 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.160739 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/058e43e2-30b3-46c3-993d-2ae5b1e076fb-horizon-secret-key\") pod \"horizon-5464ff6cd7-twkzt\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.161396 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s498n\" (UniqueName: \"kubernetes.io/projected/058e43e2-30b3-46c3-993d-2ae5b1e076fb-kube-api-access-s498n\") pod \"horizon-5464ff6cd7-twkzt\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.161478 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058e43e2-30b3-46c3-993d-2ae5b1e076fb-logs\") pod \"horizon-5464ff6cd7-twkzt\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.161577 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/058e43e2-30b3-46c3-993d-2ae5b1e076fb-scripts\") pod \"horizon-5464ff6cd7-twkzt\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.161667 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/058e43e2-30b3-46c3-993d-2ae5b1e076fb-config-data\") pod \"horizon-5464ff6cd7-twkzt\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.163290 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/058e43e2-30b3-46c3-993d-2ae5b1e076fb-config-data\") pod \"horizon-5464ff6cd7-twkzt\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.163854 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/058e43e2-30b3-46c3-993d-2ae5b1e076fb-scripts\") pod \"horizon-5464ff6cd7-twkzt\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.163858 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058e43e2-30b3-46c3-993d-2ae5b1e076fb-logs\") pod \"horizon-5464ff6cd7-twkzt\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.170569 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/058e43e2-30b3-46c3-993d-2ae5b1e076fb-horizon-secret-key\") pod \"horizon-5464ff6cd7-twkzt\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.188462 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s498n\" (UniqueName: \"kubernetes.io/projected/058e43e2-30b3-46c3-993d-2ae5b1e076fb-kube-api-access-s498n\") pod \"horizon-5464ff6cd7-twkzt\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.242832 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5464ff6cd7-twkzt"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.600636 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5464ff6cd7-twkzt"]
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.658837 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7d68765fcc-88cb5"]
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.673391 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.690885 4766 generic.go:334] "Generic (PLEG): container finished" podID="cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" containerID="fbc8d81cae7d71fa8a46628e8d150a14dfb67b400bf9f6724d0672397f9f3127" exitCode=143
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.691844 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f","Type":"ContainerDied","Data":"fbc8d81cae7d71fa8a46628e8d150a14dfb67b400bf9f6724d0672397f9f3127"}
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.713586 4766 generic.go:334] "Generic (PLEG): container finished" podID="9daa02cf-4179-422d-bbf1-eb56fecdaa2e" containerID="1929510534d2b26bef2c41e543ec440a13c4ba67b16dcaf7168c5dc77e27373f" exitCode=143
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.713673 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9daa02cf-4179-422d-bbf1-eb56fecdaa2e","Type":"ContainerDied","Data":"1929510534d2b26bef2c41e543ec440a13c4ba67b16dcaf7168c5dc77e27373f"}
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.729487 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-575f5495ff-tkktv"]
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.754000 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d68765fcc-88cb5"]
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.793362 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-scripts\") pod \"horizon-7d68765fcc-88cb5\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.793452 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfxfz\" (UniqueName: \"kubernetes.io/projected/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-kube-api-access-tfxfz\") pod \"horizon-7d68765fcc-88cb5\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.793518 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-logs\") pod \"horizon-7d68765fcc-88cb5\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.793627 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-config-data\") pod \"horizon-7d68765fcc-88cb5\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.793695 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-horizon-secret-key\") pod \"horizon-7d68765fcc-88cb5\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.855647 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5464ff6cd7-twkzt"]
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.898446 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-scripts\") pod \"horizon-7d68765fcc-88cb5\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.898530 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfxfz\" (UniqueName: \"kubernetes.io/projected/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-kube-api-access-tfxfz\") pod \"horizon-7d68765fcc-88cb5\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.898566 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-logs\") pod \"horizon-7d68765fcc-88cb5\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.898623 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-config-data\") pod \"horizon-7d68765fcc-88cb5\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.898655 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-horizon-secret-key\") pod \"horizon-7d68765fcc-88cb5\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.899453 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-scripts\") pod \"horizon-7d68765fcc-88cb5\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.901111 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-config-data\") pod \"horizon-7d68765fcc-88cb5\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.907904 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-logs\") pod \"horizon-7d68765fcc-88cb5\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.925301 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-horizon-secret-key\") pod \"horizon-7d68765fcc-88cb5\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:20 crc kubenswrapper[4766]: I1002 12:35:20.934716 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfxfz\" (UniqueName: \"kubernetes.io/projected/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-kube-api-access-tfxfz\") pod \"horizon-7d68765fcc-88cb5\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:21 crc kubenswrapper[4766]: I1002 12:35:21.029567 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d68765fcc-88cb5"
Oct 02 12:35:21 crc kubenswrapper[4766]: I1002 12:35:21.563948 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d68765fcc-88cb5"]
Oct 02 12:35:21 crc kubenswrapper[4766]: I1002 12:35:21.729634 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575f5495ff-tkktv" event={"ID":"df4ca968-dcb2-434e-8442-81e871efe544","Type":"ContainerStarted","Data":"76d50d3c1faec2ac3b220795f30441c4554ed99996a53cc8f8b1d5acc879887b"}
Oct 02 12:35:21 crc kubenswrapper[4766]: I1002 12:35:21.731476 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5464ff6cd7-twkzt" event={"ID":"058e43e2-30b3-46c3-993d-2ae5b1e076fb","Type":"ContainerStarted","Data":"825a96bed3d3d60a84f694ab06c7bf1d106708c61fce89acfe0289937d2b841f"}
Oct 02 12:35:21 crc kubenswrapper[4766]: I1002 12:35:21.733369 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d68765fcc-88cb5" event={"ID":"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9","Type":"ContainerStarted","Data":"07139954cd627d00a1dcb3a6700bf0a125f5f7a642506ab3e4611d35a463b95d"}
Oct 02 12:35:22 crc kubenswrapper[4766]: I1002 12:35:22.062723 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-8hdnj"]
Oct 02 12:35:22 crc kubenswrapper[4766]: I1002 12:35:22.082865 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-8hdnj"]
Oct 02 12:35:23 crc kubenswrapper[4766]: E1002 12:35:23.500318 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9daa02cf_4179_422d_bbf1_eb56fecdaa2e.slice/crio-5510a5ae64fb788bc5c47e75ccb77eb054581bf32a617929d26866146d7f5cec.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9daa02cf_4179_422d_bbf1_eb56fecdaa2e.slice/crio-conmon-5510a5ae64fb788bc5c47e75ccb77eb054581bf32a617929d26866146d7f5cec.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf138eee_b0bd_40cf_880a_d8a82ac1cd2f.slice/crio-6625b9274663fec0be311dbcf60adc35c30262d032c6554770c42479e46b47ed.scope\": RecentStats: unable to find data in memory cache]"
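Every volume in these horizon pods walks the same three-phase lifecycle: "VerifyControllerAttachedVolume started" (reconciler_common.go:245), then "MountVolume started" (reconciler_common.go:218), then "MountVolume.SetUp succeeded" (operation_generator.go:637). When a pod hangs in ContainerCreating, grouping the journal by volume name shows which phase never arrived. A rough Go sketch of that grouping; the regular expression and the trimmed sample lines below are illustrative, not kubelet code:

    package main

    import (
        "fmt"
        "regexp"
    )

    // phaseRe captures the operation name and the escaped-quoted volume name
    // exactly as they appear in these kubelet journal lines.
    var phaseRe = regexp.MustCompile(`(VerifyControllerAttachedVolume started|MountVolume started|MountVolume\.SetUp succeeded) for volume \\"([^"\\]+)\\"`)

    func main() {
        // Trimmed sample lines; in practice, feed whole journal lines in here.
        lines := []string{
            `"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: ...`,
            `"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: ...`,
            `"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: ...`,
            `"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: ...`,
        }
        phases := map[string][]string{}
        for _, l := range lines {
            if m := phaseRe.FindStringSubmatch(l); m != nil {
                phases[m[2]] = append(phases[m[2]], m[1])
            }
        }
        for vol, seen := range phases {
            // "logs" reaches SetUp; "config-data" is stuck mid-mount here.
            fmt.Printf("%-12s %v\n", vol, seen)
        }
    }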
Oct 02 12:35:23 crc kubenswrapper[4766]: I1002 12:35:23.761426 4766 generic.go:334] "Generic (PLEG): container finished" podID="9daa02cf-4179-422d-bbf1-eb56fecdaa2e" containerID="5510a5ae64fb788bc5c47e75ccb77eb054581bf32a617929d26866146d7f5cec" exitCode=0
Oct 02 12:35:23 crc kubenswrapper[4766]: I1002 12:35:23.761546 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9daa02cf-4179-422d-bbf1-eb56fecdaa2e","Type":"ContainerDied","Data":"5510a5ae64fb788bc5c47e75ccb77eb054581bf32a617929d26866146d7f5cec"}
Oct 02 12:35:23 crc kubenswrapper[4766]: I1002 12:35:23.764423 4766 generic.go:334] "Generic (PLEG): container finished" podID="cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" containerID="6625b9274663fec0be311dbcf60adc35c30262d032c6554770c42479e46b47ed" exitCode=0
Oct 02 12:35:23 crc kubenswrapper[4766]: I1002 12:35:23.764472 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f","Type":"ContainerDied","Data":"6625b9274663fec0be311dbcf60adc35c30262d032c6554770c42479e46b47ed"}
Oct 02 12:35:23 crc kubenswrapper[4766]: I1002 12:35:23.899404 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afda36fa-07fe-43b6-82a3-5ec9788fec1e" path="/var/lib/kubelet/pods/afda36fa-07fe-43b6-82a3-5ec9788fec1e/volumes"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.529129 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
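The exit codes in the "container finished" entries follow the shell convention: the two glance-log containers reported exitCode=143 earlier (128+15, killed by SIGTERM inside the 30s grace window), while the glance-httpd containers above drained and reported exitCode=0 on their own before the deadline. A tiny Go helper for decoding such codes (a sketch of the convention, not a kubelet API):

    package main

    import "fmt"

    // describeExit interprets a container exit code the way POSIX shells do:
    // codes above 128 encode "killed by signal (code-128)"; 143 = SIGTERM (15).
    func describeExit(code int) string {
        if code > 128 {
            return fmt.Sprintf("killed by signal %d", code-128)
        }
        return fmt.Sprintf("exited with status %d", code)
    }

    func main() {
        for _, c := range []int{143, 0} {
            fmt.Printf("exitCode=%d -> %s\n", c, describeExit(c))
        }
    }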
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.625275 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-config-data\") pod \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") "
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.625441 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-combined-ca-bundle\") pod \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") "
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.625495 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-httpd-run\") pod \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") "
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.625618 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-ceph\") pod \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") "
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.626001 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" (UID: "cf138eee-b0bd-40cf-880a-d8a82ac1cd2f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.626202 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk9bp\" (UniqueName: \"kubernetes.io/projected/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-kube-api-access-kk9bp\") pod \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") "
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.626250 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-logs\") pod \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") "
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.626289 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-scripts\") pod \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\" (UID: \"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f\") "
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.627056 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.627198 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-logs" (OuterVolumeSpecName: "logs") pod "cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" (UID: "cf138eee-b0bd-40cf-880a-d8a82ac1cd2f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.637331 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-ceph" (OuterVolumeSpecName: "ceph") pod "cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" (UID: "cf138eee-b0bd-40cf-880a-d8a82ac1cd2f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.644446 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-kube-api-access-kk9bp" (OuterVolumeSpecName: "kube-api-access-kk9bp") pod "cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" (UID: "cf138eee-b0bd-40cf-880a-d8a82ac1cd2f"). InnerVolumeSpecName "kube-api-access-kk9bp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.644653 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-scripts" (OuterVolumeSpecName: "scripts") pod "cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" (UID: "cf138eee-b0bd-40cf-880a-d8a82ac1cd2f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.661209 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.723879 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" (UID: "cf138eee-b0bd-40cf-880a-d8a82ac1cd2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.728067 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-config-data\") pod \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") "
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.730454 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-ceph\") pod \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") "
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.730653 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-logs\") pod \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") "
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.730776 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-combined-ca-bundle\") pod \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") "
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.730879 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfr4q\" (UniqueName: \"kubernetes.io/projected/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-kube-api-access-bfr4q\") pod \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") "
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.731024 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-httpd-run\") pod \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") "
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.731160 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-scripts\") pod \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\" (UID: \"9daa02cf-4179-422d-bbf1-eb56fecdaa2e\") "
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.731214 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-logs" (OuterVolumeSpecName: "logs") pod "9daa02cf-4179-422d-bbf1-eb56fecdaa2e" (UID: "9daa02cf-4179-422d-bbf1-eb56fecdaa2e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.731479 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9daa02cf-4179-422d-bbf1-eb56fecdaa2e" (UID: "9daa02cf-4179-422d-bbf1-eb56fecdaa2e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.732478 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.732584 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-logs\") on node \"crc\" DevicePath \"\""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.732651 4766 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-httpd-run\") on node \"crc\" DevicePath \"\""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.732709 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.732765 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-ceph\") on node \"crc\" DevicePath \"\""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.732829 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk9bp\" (UniqueName: \"kubernetes.io/projected/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-kube-api-access-kk9bp\") on node \"crc\" DevicePath \"\""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.732885 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-logs\") on node \"crc\" DevicePath \"\""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.740189 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-scripts" (OuterVolumeSpecName: "scripts") pod "9daa02cf-4179-422d-bbf1-eb56fecdaa2e" (UID: "9daa02cf-4179-422d-bbf1-eb56fecdaa2e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.740292 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-kube-api-access-bfr4q" (OuterVolumeSpecName: "kube-api-access-bfr4q") pod "9daa02cf-4179-422d-bbf1-eb56fecdaa2e" (UID: "9daa02cf-4179-422d-bbf1-eb56fecdaa2e"). InnerVolumeSpecName "kube-api-access-bfr4q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.768105 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-ceph" (OuterVolumeSpecName: "ceph") pod "9daa02cf-4179-422d-bbf1-eb56fecdaa2e" (UID: "9daa02cf-4179-422d-bbf1-eb56fecdaa2e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.838302 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-ceph\") on node \"crc\" DevicePath \"\""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.838368 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfr4q\" (UniqueName: \"kubernetes.io/projected/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-kube-api-access-bfr4q\") on node \"crc\" DevicePath \"\""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.838381 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.845334 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5464ff6cd7-twkzt" event={"ID":"058e43e2-30b3-46c3-993d-2ae5b1e076fb","Type":"ContainerStarted","Data":"41750b0e3721602b859c59045ed817b05f202e819f632f4e54a34abdca1ad993"}
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.845426 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5464ff6cd7-twkzt" event={"ID":"058e43e2-30b3-46c3-993d-2ae5b1e076fb","Type":"ContainerStarted","Data":"072afce62d4041ebfdbd354b21ab4b1cf88cfc9b5380806b6e105d47041008df"}
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.845778 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5464ff6cd7-twkzt" podUID="058e43e2-30b3-46c3-993d-2ae5b1e076fb" containerName="horizon-log" containerID="cri-o://072afce62d4041ebfdbd354b21ab4b1cf88cfc9b5380806b6e105d47041008df" gracePeriod=30
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.845809 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5464ff6cd7-twkzt" podUID="058e43e2-30b3-46c3-993d-2ae5b1e076fb" containerName="horizon" containerID="cri-o://41750b0e3721602b859c59045ed817b05f202e819f632f4e54a34abdca1ad993" gracePeriod=30
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.848642 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d68765fcc-88cb5" event={"ID":"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9","Type":"ContainerStarted","Data":"b81881333fddbc48e419a202f15a835cf46d38d3517e10c56e69fb16145956ef"}
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.853298 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9daa02cf-4179-422d-bbf1-eb56fecdaa2e" (UID: "9daa02cf-4179-422d-bbf1-eb56fecdaa2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
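horizon-5464ff6cd7-twkzt is killed seconds after its containers start because its DELETE landed while the pod was still coming up; the kill follows the usual graceful pattern with gracePeriod=30: SIGTERM first, SIGKILL only if the grace window expires. A minimal, Unix-only Go sketch of that send-then-escalate shape (kubelet itself delegates this to CRI-O; nothing below is kubelet code):

    package main

    import (
        "fmt"
        "os/exec"
        "syscall"
        "time"
    )

    // killWithGrace sends SIGTERM, gives the process the grace window to exit
    // on its own, and only then escalates to SIGKILL.
    func killWithGrace(cmd *exec.Cmd, grace time.Duration) error {
        if err := cmd.Process.Signal(syscall.SIGTERM); err != nil {
            return err
        }
        done := make(chan error, 1)
        go func() { done <- cmd.Wait() }()
        select {
        case err := <-done:
            return err // exited within the grace period ("signal: terminated" if SIGTERM killed it)
        case <-time.After(grace):
            return cmd.Process.Kill() // grace elapsed: escalate to SIGKILL
        }
    }

    func main() {
        cmd := exec.Command("sleep", "60")
        if err := cmd.Start(); err != nil {
            panic(err)
        }
        fmt.Println(killWithGrace(cmd, 2*time.Second))
    }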
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.853824 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575f5495ff-tkktv" event={"ID":"df4ca968-dcb2-434e-8442-81e871efe544","Type":"ContainerStarted","Data":"fd12c84becc00d1d29a25462954141a9c3a7f6a2b128641fb158a6a7a3ab9496"}
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.853947 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575f5495ff-tkktv" event={"ID":"df4ca968-dcb2-434e-8442-81e871efe544","Type":"ContainerStarted","Data":"a3360a36909acc808689ca870bd24d2c07667ae85c8d31378e12f5e3e3a06b1b"}
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.860738 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-config-data" (OuterVolumeSpecName: "config-data") pod "cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" (UID: "cf138eee-b0bd-40cf-880a-d8a82ac1cd2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.862148 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.862313 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f","Type":"ContainerDied","Data":"2165bf4633bb084c6fb827aa9d02a8904be487566c0d436b738ac61a919b8caf"}
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.862417 4766 scope.go:117] "RemoveContainer" containerID="6625b9274663fec0be311dbcf60adc35c30262d032c6554770c42479e46b47ed"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.876233 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9daa02cf-4179-422d-bbf1-eb56fecdaa2e","Type":"ContainerDied","Data":"5f598fac833fc516efacda0e6d0d850f9c25ca970b154c245df39196893e2a15"}
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.876351 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.888782 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5464ff6cd7-twkzt" podStartSLOduration=2.62372957 podStartE2EDuration="9.888757973s" podCreationTimestamp="2025-10-02 12:35:19 +0000 UTC" firstStartedPulling="2025-10-02 12:35:20.881805525 +0000 UTC m=+6235.824676469" lastFinishedPulling="2025-10-02 12:35:28.146833908 +0000 UTC m=+6243.089704872" observedRunningTime="2025-10-02 12:35:28.872556854 +0000 UTC m=+6243.815427818" watchObservedRunningTime="2025-10-02 12:35:28.888757973 +0000 UTC m=+6243.831628917"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.902652 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-575f5495ff-tkktv" podStartSLOduration=2.429735785 podStartE2EDuration="9.902626427s" podCreationTimestamp="2025-10-02 12:35:19 +0000 UTC" firstStartedPulling="2025-10-02 12:35:20.675390613 +0000 UTC m=+6235.618261557" lastFinishedPulling="2025-10-02 12:35:28.148281255 +0000 UTC m=+6243.091152199" observedRunningTime="2025-10-02 12:35:28.898063922 +0000 UTC m=+6243.840934866" watchObservedRunningTime="2025-10-02 12:35:28.902626427 +0000 UTC m=+6243.845497361"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.912307 4766 scope.go:117] "RemoveContainer" containerID="fbc8d81cae7d71fa8a46628e8d150a14dfb67b400bf9f6724d0672397f9f3127"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.924768 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-config-data" (OuterVolumeSpecName: "config-data") pod "9daa02cf-4179-422d-bbf1-eb56fecdaa2e" (UID: "9daa02cf-4179-422d-bbf1-eb56fecdaa2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.937578 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.940964 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.941009 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9daa02cf-4179-422d-bbf1-eb56fecdaa2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.941023 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.943180 4766 scope.go:117] "RemoveContainer" containerID="5510a5ae64fb788bc5c47e75ccb77eb054581bf32a617929d26866146d7f5cec"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.946707 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.974421 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 02 12:35:28 crc kubenswrapper[4766]: E1002 12:35:28.975034 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" containerName="glance-log"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.975059 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" containerName="glance-log"
Oct 02 12:35:28 crc kubenswrapper[4766]: E1002 12:35:28.975103 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9daa02cf-4179-422d-bbf1-eb56fecdaa2e" containerName="glance-log"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.975117 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9daa02cf-4179-422d-bbf1-eb56fecdaa2e" containerName="glance-log"
Oct 02 12:35:28 crc kubenswrapper[4766]: E1002 12:35:28.975131 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9daa02cf-4179-422d-bbf1-eb56fecdaa2e" containerName="glance-httpd"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.975138 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9daa02cf-4179-422d-bbf1-eb56fecdaa2e" containerName="glance-httpd"
Oct 02 12:35:28 crc kubenswrapper[4766]: E1002 12:35:28.975179 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" containerName="glance-httpd"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.975185 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" containerName="glance-httpd"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.975432 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" containerName="glance-httpd"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.975471 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9daa02cf-4179-422d-bbf1-eb56fecdaa2e" containerName="glance-log"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.975488 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9daa02cf-4179-422d-bbf1-eb56fecdaa2e" containerName="glance-httpd"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.975520 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" containerName="glance-log"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.977658 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.980053 4766 scope.go:117] "RemoveContainer" containerID="1929510534d2b26bef2c41e543ec440a13c4ba67b16dcaf7168c5dc77e27373f"
Oct 02 12:35:28 crc kubenswrapper[4766]: I1002 12:35:28.981556 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.009817 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.044122 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e00770-3051-4ec4-a44c-364d503cb96c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0"
Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.044212 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqcvl\" (UniqueName: \"kubernetes.io/projected/c8e00770-3051-4ec4-a44c-364d503cb96c-kube-api-access-wqcvl\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0"
Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.044275 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8e00770-3051-4ec4-a44c-364d503cb96c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0"
Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.044296 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8e00770-3051-4ec4-a44c-364d503cb96c-logs\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0"
Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.044775 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8e00770-3051-4ec4-a44c-364d503cb96c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0"
Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.044835 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e00770-3051-4ec4-a44c-364d503cb96c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0"
Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.045119 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c8e00770-3051-4ec4-a44c-364d503cb96c-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0"
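The cpu_manager/memory_manager burst above runs at admission of the replacement glance pod: before admitting it, each resource manager sweeps its checkpointed assignments and drops entries still keyed to the deleted pod UIDs (cf138eee-..., 9daa02cf-...). A toy version of that sweep; the key type and the cpuset values are invented for illustration:

    package main

    import "fmt"

    // key mirrors how the resource managers index assignments: per pod UID
    // and container name.
    type key struct{ podUID, container string }

    // removeStaleState drops any assignment owned by a pod that is no longer
    // active, before the incoming pod is admitted.
    func removeStaleState(assignments map[key]string, active map[string]bool) {
        for k := range assignments {
            if !active[k.podUID] {
                fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n", k.podUID, k.container)
                delete(assignments, k)
            }
        }
    }

    func main() {
        assignments := map[key]string{
            {"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f", "glance-log"}:   "cpuset 0-1",
            {"cf138eee-b0bd-40cf-880a-d8a82ac1cd2f", "glance-httpd"}: "cpuset 2-3",
        }
        // Only the replacement pod's UID is active now.
        active := map[string]bool{"c8e00770-3051-4ec4-a44c-364d503cb96c": true}
        removeStaleState(assignments, active)
        fmt.Println("remaining assignments:", len(assignments))
    }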
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c8e00770-3051-4ec4-a44c-364d503cb96c-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.147865 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8e00770-3051-4ec4-a44c-364d503cb96c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.149701 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8e00770-3051-4ec4-a44c-364d503cb96c-logs\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.149827 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c8e00770-3051-4ec4-a44c-364d503cb96c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.149856 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e00770-3051-4ec4-a44c-364d503cb96c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.149918 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c8e00770-3051-4ec4-a44c-364d503cb96c-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.149978 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e00770-3051-4ec4-a44c-364d503cb96c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.150028 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqcvl\" (UniqueName: \"kubernetes.io/projected/c8e00770-3051-4ec4-a44c-364d503cb96c-kube-api-access-wqcvl\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.150992 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8e00770-3051-4ec4-a44c-364d503cb96c-logs\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.151243 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c8e00770-3051-4ec4-a44c-364d503cb96c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.153440 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8e00770-3051-4ec4-a44c-364d503cb96c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.155076 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8e00770-3051-4ec4-a44c-364d503cb96c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.157739 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8e00770-3051-4ec4-a44c-364d503cb96c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.161778 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c8e00770-3051-4ec4-a44c-364d503cb96c-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.181313 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqcvl\" (UniqueName: \"kubernetes.io/projected/c8e00770-3051-4ec4-a44c-364d503cb96c-kube-api-access-wqcvl\") pod \"glance-default-internal-api-0\" (UID: \"c8e00770-3051-4ec4-a44c-364d503cb96c\") " pod="openstack/glance-default-internal-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.307796 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.311032 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.324148 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.336437 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.339018 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.341064 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.359331 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.458621 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e248d4b1-ecec-4d44-96cb-25f552b28709-config-data\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.463675 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e248d4b1-ecec-4d44-96cb-25f552b28709-logs\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.463767 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e248d4b1-ecec-4d44-96cb-25f552b28709-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.463907 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e248d4b1-ecec-4d44-96cb-25f552b28709-ceph\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.464194 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxbf5\" (UniqueName: \"kubernetes.io/projected/e248d4b1-ecec-4d44-96cb-25f552b28709-kube-api-access-hxbf5\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.464228 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e248d4b1-ecec-4d44-96cb-25f552b28709-scripts\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.464382 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e248d4b1-ecec-4d44-96cb-25f552b28709-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.566233 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxbf5\" (UniqueName: \"kubernetes.io/projected/e248d4b1-ecec-4d44-96cb-25f552b28709-kube-api-access-hxbf5\") pod \"glance-default-external-api-0\" (UID: 
\"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.566298 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e248d4b1-ecec-4d44-96cb-25f552b28709-scripts\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.566383 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e248d4b1-ecec-4d44-96cb-25f552b28709-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.566438 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e248d4b1-ecec-4d44-96cb-25f552b28709-config-data\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.566469 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e248d4b1-ecec-4d44-96cb-25f552b28709-logs\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.566534 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e248d4b1-ecec-4d44-96cb-25f552b28709-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.566592 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e248d4b1-ecec-4d44-96cb-25f552b28709-ceph\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.568115 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e248d4b1-ecec-4d44-96cb-25f552b28709-logs\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.570199 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e248d4b1-ecec-4d44-96cb-25f552b28709-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.580111 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e248d4b1-ecec-4d44-96cb-25f552b28709-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.594233 
4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e248d4b1-ecec-4d44-96cb-25f552b28709-scripts\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.594664 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e248d4b1-ecec-4d44-96cb-25f552b28709-ceph\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.595728 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e248d4b1-ecec-4d44-96cb-25f552b28709-config-data\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.595943 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxbf5\" (UniqueName: \"kubernetes.io/projected/e248d4b1-ecec-4d44-96cb-25f552b28709-kube-api-access-hxbf5\") pod \"glance-default-external-api-0\" (UID: \"e248d4b1-ecec-4d44-96cb-25f552b28709\") " pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.769585 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.912969 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9daa02cf-4179-422d-bbf1-eb56fecdaa2e" path="/var/lib/kubelet/pods/9daa02cf-4179-422d-bbf1-eb56fecdaa2e/volumes" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.914613 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf138eee-b0bd-40cf-880a-d8a82ac1cd2f" path="/var/lib/kubelet/pods/cf138eee-b0bd-40cf-880a-d8a82ac1cd2f/volumes" Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.915713 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d68765fcc-88cb5" event={"ID":"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9","Type":"ContainerStarted","Data":"c64bf4995641dab91c56fda7ae96d849253e4ef453c3f5da7ca6e68cb9aa52be"} Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.957680 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 02 12:35:29 crc kubenswrapper[4766]: I1002 12:35:29.958742 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7d68765fcc-88cb5" podStartSLOduration=3.325481191 podStartE2EDuration="9.958715506s" podCreationTimestamp="2025-10-02 12:35:20 +0000 UTC" firstStartedPulling="2025-10-02 12:35:21.558049447 +0000 UTC m=+6236.500920391" lastFinishedPulling="2025-10-02 12:35:28.191283762 +0000 UTC m=+6243.134154706" observedRunningTime="2025-10-02 12:35:29.946736672 +0000 UTC m=+6244.889607636" watchObservedRunningTime="2025-10-02 12:35:29.958715506 +0000 UTC m=+6244.901586450" Oct 02 12:35:29 crc kubenswrapper[4766]: W1002 12:35:29.963212 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8e00770_3051_4ec4_a44c_364d503cb96c.slice/crio-4d4e5842e310c0e7064b6e0e9ea6c6562415190685679f2dee07be994798869b WatchSource:0}: Error 
finding container 4d4e5842e310c0e7064b6e0e9ea6c6562415190685679f2dee07be994798869b: Status 404 returned error can't find the container with id 4d4e5842e310c0e7064b6e0e9ea6c6562415190685679f2dee07be994798869b Oct 02 12:35:30 crc kubenswrapper[4766]: I1002 12:35:30.073869 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-575f5495ff-tkktv" Oct 02 12:35:30 crc kubenswrapper[4766]: I1002 12:35:30.073950 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-575f5495ff-tkktv" Oct 02 12:35:30 crc kubenswrapper[4766]: I1002 12:35:30.243746 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5464ff6cd7-twkzt" Oct 02 12:35:30 crc kubenswrapper[4766]: I1002 12:35:30.397750 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 02 12:35:30 crc kubenswrapper[4766]: I1002 12:35:30.968684 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e248d4b1-ecec-4d44-96cb-25f552b28709","Type":"ContainerStarted","Data":"a48284541b8ed9e9424cf486fe15867534f66aba550f756590ef0992fc3897ee"} Oct 02 12:35:30 crc kubenswrapper[4766]: I1002 12:35:30.972956 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8e00770-3051-4ec4-a44c-364d503cb96c","Type":"ContainerStarted","Data":"e8ffd3c60cf149ed33a244b61e67ca81cccfba0db1992a685c0f664959a0a76b"} Oct 02 12:35:30 crc kubenswrapper[4766]: I1002 12:35:30.973008 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8e00770-3051-4ec4-a44c-364d503cb96c","Type":"ContainerStarted","Data":"4d4e5842e310c0e7064b6e0e9ea6c6562415190685679f2dee07be994798869b"} Oct 02 12:35:31 crc kubenswrapper[4766]: I1002 12:35:31.030311 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7d68765fcc-88cb5" Oct 02 12:35:31 crc kubenswrapper[4766]: I1002 12:35:31.031013 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7d68765fcc-88cb5" Oct 02 12:35:31 crc kubenswrapper[4766]: I1002 12:35:31.985384 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c8e00770-3051-4ec4-a44c-364d503cb96c","Type":"ContainerStarted","Data":"67e7043059306af01eaa73d4ce58088968a79e3d75dfa393f551d13da8982582"} Oct 02 12:35:31 crc kubenswrapper[4766]: I1002 12:35:31.989303 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e248d4b1-ecec-4d44-96cb-25f552b28709","Type":"ContainerStarted","Data":"fa63fa1274119d3b8f5bf2d06e783f07f0f31be13952323f947f2db3fa493896"} Oct 02 12:35:31 crc kubenswrapper[4766]: I1002 12:35:31.989349 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e248d4b1-ecec-4d44-96cb-25f552b28709","Type":"ContainerStarted","Data":"cea4ca76e9d9c6b4e7efe36c8b57e4614a69363f590e9ddef73641b5f83134d6"} Oct 02 12:35:32 crc kubenswrapper[4766]: I1002 12:35:32.015124 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.015097406 podStartE2EDuration="4.015097406s" podCreationTimestamp="2025-10-02 12:35:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-02 12:35:32.010575811 +0000 UTC m=+6246.953446755" watchObservedRunningTime="2025-10-02 12:35:32.015097406 +0000 UTC m=+6246.957968370" Oct 02 12:35:32 crc kubenswrapper[4766]: I1002 12:35:32.050858 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.050833721 podStartE2EDuration="3.050833721s" podCreationTimestamp="2025-10-02 12:35:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:35:32.045224591 +0000 UTC m=+6246.988095525" watchObservedRunningTime="2025-10-02 12:35:32.050833721 +0000 UTC m=+6246.993704655" Oct 02 12:35:37 crc kubenswrapper[4766]: I1002 12:35:37.171544 4766 scope.go:117] "RemoveContainer" containerID="cb7fda821febdc3b8dcd6397e43cbb18f177d4be100c11f44700f5869791dac3" Oct 02 12:35:37 crc kubenswrapper[4766]: I1002 12:35:37.228828 4766 scope.go:117] "RemoveContainer" containerID="419f3956430be7c0fac39d4d7d5a0f16f372f51caec32beca986d71cbd8e40b7" Oct 02 12:35:37 crc kubenswrapper[4766]: I1002 12:35:37.269733 4766 scope.go:117] "RemoveContainer" containerID="3c9e772d95b10749e2def625cd13c4517be08a6f113711d143aeb81cb0286e99" Oct 02 12:35:37 crc kubenswrapper[4766]: I1002 12:35:37.315600 4766 scope.go:117] "RemoveContainer" containerID="d828fdc69c70e7629f0f14a4c915322009929fa63a8c06a71174a5a618f1fb9f" Oct 02 12:35:37 crc kubenswrapper[4766]: I1002 12:35:37.373596 4766 scope.go:117] "RemoveContainer" containerID="4c4127f2a45399f5cdee38356dc3c43bb6e30e98c139aeaaaff76896baed8b62" Oct 02 12:35:39 crc kubenswrapper[4766]: I1002 12:35:39.312355 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 12:35:39 crc kubenswrapper[4766]: I1002 12:35:39.312844 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 02 12:35:39 crc kubenswrapper[4766]: I1002 12:35:39.353753 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 12:35:39 crc kubenswrapper[4766]: I1002 12:35:39.371752 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 02 12:35:39 crc kubenswrapper[4766]: I1002 12:35:39.769980 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 12:35:39 crc kubenswrapper[4766]: I1002 12:35:39.770725 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 02 12:35:39 crc kubenswrapper[4766]: I1002 12:35:39.816490 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 12:35:39 crc kubenswrapper[4766]: I1002 12:35:39.819725 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 02 12:35:40 crc kubenswrapper[4766]: I1002 12:35:40.076386 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-575f5495ff-tkktv" podUID="df4ca968-dcb2-434e-8442-81e871efe544" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.107:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.107:8080: connect: connection refused" Oct 02 12:35:40 crc kubenswrapper[4766]: I1002 12:35:40.095803 
4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 12:35:40 crc kubenswrapper[4766]: I1002 12:35:40.095887 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 02 12:35:40 crc kubenswrapper[4766]: I1002 12:35:40.096367 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 12:35:40 crc kubenswrapper[4766]: I1002 12:35:40.096611 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 02 12:35:41 crc kubenswrapper[4766]: I1002 12:35:41.031797 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7d68765fcc-88cb5" podUID="3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Oct 02 12:35:41 crc kubenswrapper[4766]: I1002 12:35:41.453047 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l5n6p"] Oct 02 12:35:41 crc kubenswrapper[4766]: I1002 12:35:41.455720 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:35:41 crc kubenswrapper[4766]: I1002 12:35:41.480862 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l5n6p"] Oct 02 12:35:41 crc kubenswrapper[4766]: I1002 12:35:41.493429 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61ddb04a-ed6e-4873-8679-49bd1cd30937-utilities\") pod \"certified-operators-l5n6p\" (UID: \"61ddb04a-ed6e-4873-8679-49bd1cd30937\") " pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:35:41 crc kubenswrapper[4766]: I1002 12:35:41.493702 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61ddb04a-ed6e-4873-8679-49bd1cd30937-catalog-content\") pod \"certified-operators-l5n6p\" (UID: \"61ddb04a-ed6e-4873-8679-49bd1cd30937\") " pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:35:41 crc kubenswrapper[4766]: I1002 12:35:41.493917 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76c4g\" (UniqueName: \"kubernetes.io/projected/61ddb04a-ed6e-4873-8679-49bd1cd30937-kube-api-access-76c4g\") pod \"certified-operators-l5n6p\" (UID: \"61ddb04a-ed6e-4873-8679-49bd1cd30937\") " pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:35:41 crc kubenswrapper[4766]: I1002 12:35:41.596352 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61ddb04a-ed6e-4873-8679-49bd1cd30937-utilities\") pod \"certified-operators-l5n6p\" (UID: \"61ddb04a-ed6e-4873-8679-49bd1cd30937\") " pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:35:41 crc kubenswrapper[4766]: I1002 12:35:41.596675 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61ddb04a-ed6e-4873-8679-49bd1cd30937-catalog-content\") pod \"certified-operators-l5n6p\" (UID: \"61ddb04a-ed6e-4873-8679-49bd1cd30937\") " 
pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:35:41 crc kubenswrapper[4766]: I1002 12:35:41.596817 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76c4g\" (UniqueName: \"kubernetes.io/projected/61ddb04a-ed6e-4873-8679-49bd1cd30937-kube-api-access-76c4g\") pod \"certified-operators-l5n6p\" (UID: \"61ddb04a-ed6e-4873-8679-49bd1cd30937\") " pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:35:41 crc kubenswrapper[4766]: I1002 12:35:41.598111 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61ddb04a-ed6e-4873-8679-49bd1cd30937-utilities\") pod \"certified-operators-l5n6p\" (UID: \"61ddb04a-ed6e-4873-8679-49bd1cd30937\") " pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:35:41 crc kubenswrapper[4766]: I1002 12:35:41.598329 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61ddb04a-ed6e-4873-8679-49bd1cd30937-catalog-content\") pod \"certified-operators-l5n6p\" (UID: \"61ddb04a-ed6e-4873-8679-49bd1cd30937\") " pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:35:41 crc kubenswrapper[4766]: I1002 12:35:41.620507 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76c4g\" (UniqueName: \"kubernetes.io/projected/61ddb04a-ed6e-4873-8679-49bd1cd30937-kube-api-access-76c4g\") pod \"certified-operators-l5n6p\" (UID: \"61ddb04a-ed6e-4873-8679-49bd1cd30937\") " pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:35:41 crc kubenswrapper[4766]: I1002 12:35:41.784750 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:35:42 crc kubenswrapper[4766]: I1002 12:35:42.120133 4766 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 12:35:42 crc kubenswrapper[4766]: I1002 12:35:42.120526 4766 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 12:35:42 crc kubenswrapper[4766]: I1002 12:35:42.490916 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l5n6p"] Oct 02 12:35:43 crc kubenswrapper[4766]: I1002 12:35:43.143461 4766 generic.go:334] "Generic (PLEG): container finished" podID="61ddb04a-ed6e-4873-8679-49bd1cd30937" containerID="7f9c22d624b94d060dd87b07bb760895dae8acb7fb3d4254ebe6a52c340942d8" exitCode=0 Oct 02 12:35:43 crc kubenswrapper[4766]: I1002 12:35:43.145182 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5n6p" event={"ID":"61ddb04a-ed6e-4873-8679-49bd1cd30937","Type":"ContainerDied","Data":"7f9c22d624b94d060dd87b07bb760895dae8acb7fb3d4254ebe6a52c340942d8"} Oct 02 12:35:43 crc kubenswrapper[4766]: I1002 12:35:43.145228 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5n6p" event={"ID":"61ddb04a-ed6e-4873-8679-49bd1cd30937","Type":"ContainerStarted","Data":"3b9959103d38003508b71affd94b50d5c4b0db6607e6555dba2a722518e30479"} Oct 02 12:35:43 crc kubenswrapper[4766]: I1002 12:35:43.335677 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 12:35:43 crc kubenswrapper[4766]: I1002 12:35:43.335888 4766 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 12:35:43 crc kubenswrapper[4766]: I1002 
12:35:43.377006 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 12:35:43 crc kubenswrapper[4766]: I1002 12:35:43.377556 4766 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 02 12:35:43 crc kubenswrapper[4766]: I1002 12:35:43.439259 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 02 12:35:43 crc kubenswrapper[4766]: I1002 12:35:43.541695 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 02 12:35:45 crc kubenswrapper[4766]: I1002 12:35:45.177381 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5n6p" event={"ID":"61ddb04a-ed6e-4873-8679-49bd1cd30937","Type":"ContainerStarted","Data":"41a0ba6adbb67a50afa5f7b17666eced0c99ca2271c411dcac3e10e57d8f87ac"} Oct 02 12:35:46 crc kubenswrapper[4766]: I1002 12:35:46.189935 4766 generic.go:334] "Generic (PLEG): container finished" podID="61ddb04a-ed6e-4873-8679-49bd1cd30937" containerID="41a0ba6adbb67a50afa5f7b17666eced0c99ca2271c411dcac3e10e57d8f87ac" exitCode=0 Oct 02 12:35:46 crc kubenswrapper[4766]: I1002 12:35:46.190204 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5n6p" event={"ID":"61ddb04a-ed6e-4873-8679-49bd1cd30937","Type":"ContainerDied","Data":"41a0ba6adbb67a50afa5f7b17666eced0c99ca2271c411dcac3e10e57d8f87ac"} Oct 02 12:35:47 crc kubenswrapper[4766]: I1002 12:35:47.204184 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5n6p" event={"ID":"61ddb04a-ed6e-4873-8679-49bd1cd30937","Type":"ContainerStarted","Data":"5c1652293854e25504e078ffddfe492a46ea5a58cb73153466c74e4616ebd145"} Oct 02 12:35:47 crc kubenswrapper[4766]: I1002 12:35:47.235832 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l5n6p" podStartSLOduration=2.7478544879999998 podStartE2EDuration="6.235804084s" podCreationTimestamp="2025-10-02 12:35:41 +0000 UTC" firstStartedPulling="2025-10-02 12:35:43.147074764 +0000 UTC m=+6258.089945708" lastFinishedPulling="2025-10-02 12:35:46.63502436 +0000 UTC m=+6261.577895304" observedRunningTime="2025-10-02 12:35:47.223382166 +0000 UTC m=+6262.166253120" watchObservedRunningTime="2025-10-02 12:35:47.235804084 +0000 UTC m=+6262.178675018" Oct 02 12:35:50 crc kubenswrapper[4766]: I1002 12:35:50.074651 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-575f5495ff-tkktv" podUID="df4ca968-dcb2-434e-8442-81e871efe544" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.107:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.107:8080: connect: connection refused" Oct 02 12:35:51 crc kubenswrapper[4766]: I1002 12:35:51.786670 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:35:51 crc kubenswrapper[4766]: I1002 12:35:51.787183 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:35:52 crc kubenswrapper[4766]: I1002 12:35:52.851904 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-l5n6p" podUID="61ddb04a-ed6e-4873-8679-49bd1cd30937" containerName="registry-server" probeResult="failure" 
output=< Oct 02 12:35:52 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Oct 02 12:35:52 crc kubenswrapper[4766]: > Oct 02 12:35:53 crc kubenswrapper[4766]: I1002 12:35:53.407417 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7d68765fcc-88cb5" Oct 02 12:35:55 crc kubenswrapper[4766]: I1002 12:35:55.519805 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7d68765fcc-88cb5" Oct 02 12:35:55 crc kubenswrapper[4766]: I1002 12:35:55.612430 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-575f5495ff-tkktv"] Oct 02 12:35:55 crc kubenswrapper[4766]: I1002 12:35:55.612813 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-575f5495ff-tkktv" podUID="df4ca968-dcb2-434e-8442-81e871efe544" containerName="horizon-log" containerID="cri-o://a3360a36909acc808689ca870bd24d2c07667ae85c8d31378e12f5e3e3a06b1b" gracePeriod=30 Oct 02 12:35:55 crc kubenswrapper[4766]: I1002 12:35:55.613245 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-575f5495ff-tkktv" podUID="df4ca968-dcb2-434e-8442-81e871efe544" containerName="horizon" containerID="cri-o://fd12c84becc00d1d29a25462954141a9c3a7f6a2b128641fb158a6a7a3ab9496" gracePeriod=30 Oct 02 12:35:56 crc kubenswrapper[4766]: I1002 12:35:56.305703 4766 generic.go:334] "Generic (PLEG): container finished" podID="df4ca968-dcb2-434e-8442-81e871efe544" containerID="fd12c84becc00d1d29a25462954141a9c3a7f6a2b128641fb158a6a7a3ab9496" exitCode=0 Oct 02 12:35:56 crc kubenswrapper[4766]: I1002 12:35:56.305804 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575f5495ff-tkktv" event={"ID":"df4ca968-dcb2-434e-8442-81e871efe544","Type":"ContainerDied","Data":"fd12c84becc00d1d29a25462954141a9c3a7f6a2b128641fb158a6a7a3ab9496"} Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.360463 4766 generic.go:334] "Generic (PLEG): container finished" podID="058e43e2-30b3-46c3-993d-2ae5b1e076fb" containerID="41750b0e3721602b859c59045ed817b05f202e819f632f4e54a34abdca1ad993" exitCode=137 Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.361301 4766 generic.go:334] "Generic (PLEG): container finished" podID="058e43e2-30b3-46c3-993d-2ae5b1e076fb" containerID="072afce62d4041ebfdbd354b21ab4b1cf88cfc9b5380806b6e105d47041008df" exitCode=137 Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.360635 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5464ff6cd7-twkzt" event={"ID":"058e43e2-30b3-46c3-993d-2ae5b1e076fb","Type":"ContainerDied","Data":"41750b0e3721602b859c59045ed817b05f202e819f632f4e54a34abdca1ad993"} Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.361369 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5464ff6cd7-twkzt" event={"ID":"058e43e2-30b3-46c3-993d-2ae5b1e076fb","Type":"ContainerDied","Data":"072afce62d4041ebfdbd354b21ab4b1cf88cfc9b5380806b6e105d47041008df"} Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.591365 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5464ff6cd7-twkzt" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.733336 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/058e43e2-30b3-46c3-993d-2ae5b1e076fb-config-data\") pod \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.733389 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s498n\" (UniqueName: \"kubernetes.io/projected/058e43e2-30b3-46c3-993d-2ae5b1e076fb-kube-api-access-s498n\") pod \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.733536 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/058e43e2-30b3-46c3-993d-2ae5b1e076fb-horizon-secret-key\") pod \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.733739 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058e43e2-30b3-46c3-993d-2ae5b1e076fb-logs\") pod \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.733811 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/058e43e2-30b3-46c3-993d-2ae5b1e076fb-scripts\") pod \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\" (UID: \"058e43e2-30b3-46c3-993d-2ae5b1e076fb\") " Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.734203 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/058e43e2-30b3-46c3-993d-2ae5b1e076fb-logs" (OuterVolumeSpecName: "logs") pod "058e43e2-30b3-46c3-993d-2ae5b1e076fb" (UID: "058e43e2-30b3-46c3-993d-2ae5b1e076fb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.734604 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/058e43e2-30b3-46c3-993d-2ae5b1e076fb-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.740864 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/058e43e2-30b3-46c3-993d-2ae5b1e076fb-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "058e43e2-30b3-46c3-993d-2ae5b1e076fb" (UID: "058e43e2-30b3-46c3-993d-2ae5b1e076fb"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.755870 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/058e43e2-30b3-46c3-993d-2ae5b1e076fb-kube-api-access-s498n" (OuterVolumeSpecName: "kube-api-access-s498n") pod "058e43e2-30b3-46c3-993d-2ae5b1e076fb" (UID: "058e43e2-30b3-46c3-993d-2ae5b1e076fb"). InnerVolumeSpecName "kube-api-access-s498n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.767011 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058e43e2-30b3-46c3-993d-2ae5b1e076fb-config-data" (OuterVolumeSpecName: "config-data") pod "058e43e2-30b3-46c3-993d-2ae5b1e076fb" (UID: "058e43e2-30b3-46c3-993d-2ae5b1e076fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.784712 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/058e43e2-30b3-46c3-993d-2ae5b1e076fb-scripts" (OuterVolumeSpecName: "scripts") pod "058e43e2-30b3-46c3-993d-2ae5b1e076fb" (UID: "058e43e2-30b3-46c3-993d-2ae5b1e076fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.837402 4766 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/058e43e2-30b3-46c3-993d-2ae5b1e076fb-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.837660 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/058e43e2-30b3-46c3-993d-2ae5b1e076fb-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.837672 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/058e43e2-30b3-46c3-993d-2ae5b1e076fb-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:35:59.837686 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s498n\" (UniqueName: \"kubernetes.io/projected/058e43e2-30b3-46c3-993d-2ae5b1e076fb-kube-api-access-s498n\") on node \"crc\" DevicePath \"\"" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:36:00.375346 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5464ff6cd7-twkzt" event={"ID":"058e43e2-30b3-46c3-993d-2ae5b1e076fb","Type":"ContainerDied","Data":"825a96bed3d3d60a84f694ab06c7bf1d106708c61fce89acfe0289937d2b841f"} Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:36:00.375408 4766 scope.go:117] "RemoveContainer" containerID="41750b0e3721602b859c59045ed817b05f202e819f632f4e54a34abdca1ad993" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:36:00.375440 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5464ff6cd7-twkzt" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:36:00.409758 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5464ff6cd7-twkzt"] Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:36:00.418415 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5464ff6cd7-twkzt"] Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:36:00.591594 4766 scope.go:117] "RemoveContainer" containerID="072afce62d4041ebfdbd354b21ab4b1cf88cfc9b5380806b6e105d47041008df" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:36:01.842055 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:36:01.901054 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="058e43e2-30b3-46c3-993d-2ae5b1e076fb" path="/var/lib/kubelet/pods/058e43e2-30b3-46c3-993d-2ae5b1e076fb/volumes" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:36:01.901961 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:36:02 crc kubenswrapper[4766]: I1002 12:36:02.089431 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l5n6p"] Oct 02 12:36:03 crc kubenswrapper[4766]: I1002 12:36:03.413833 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l5n6p" podUID="61ddb04a-ed6e-4873-8679-49bd1cd30937" containerName="registry-server" containerID="cri-o://5c1652293854e25504e078ffddfe492a46ea5a58cb73153466c74e4616ebd145" gracePeriod=2 Oct 02 12:36:04 crc kubenswrapper[4766]: I1002 12:36:04.438865 4766 generic.go:334] "Generic (PLEG): container finished" podID="61ddb04a-ed6e-4873-8679-49bd1cd30937" containerID="5c1652293854e25504e078ffddfe492a46ea5a58cb73153466c74e4616ebd145" exitCode=0 Oct 02 12:36:04 crc kubenswrapper[4766]: I1002 12:36:04.438942 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5n6p" event={"ID":"61ddb04a-ed6e-4873-8679-49bd1cd30937","Type":"ContainerDied","Data":"5c1652293854e25504e078ffddfe492a46ea5a58cb73153466c74e4616ebd145"} Oct 02 12:36:04 crc kubenswrapper[4766]: I1002 12:36:04.543827 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:36:04 crc kubenswrapper[4766]: I1002 12:36:04.665405 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76c4g\" (UniqueName: \"kubernetes.io/projected/61ddb04a-ed6e-4873-8679-49bd1cd30937-kube-api-access-76c4g\") pod \"61ddb04a-ed6e-4873-8679-49bd1cd30937\" (UID: \"61ddb04a-ed6e-4873-8679-49bd1cd30937\") " Oct 02 12:36:04 crc kubenswrapper[4766]: I1002 12:36:04.665551 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61ddb04a-ed6e-4873-8679-49bd1cd30937-utilities\") pod \"61ddb04a-ed6e-4873-8679-49bd1cd30937\" (UID: \"61ddb04a-ed6e-4873-8679-49bd1cd30937\") " Oct 02 12:36:04 crc kubenswrapper[4766]: I1002 12:36:04.665740 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61ddb04a-ed6e-4873-8679-49bd1cd30937-catalog-content\") pod \"61ddb04a-ed6e-4873-8679-49bd1cd30937\" (UID: \"61ddb04a-ed6e-4873-8679-49bd1cd30937\") " Oct 02 12:36:04 crc kubenswrapper[4766]: I1002 12:36:04.667515 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61ddb04a-ed6e-4873-8679-49bd1cd30937-utilities" (OuterVolumeSpecName: "utilities") pod "61ddb04a-ed6e-4873-8679-49bd1cd30937" (UID: "61ddb04a-ed6e-4873-8679-49bd1cd30937"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:36:04 crc kubenswrapper[4766]: I1002 12:36:04.674455 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61ddb04a-ed6e-4873-8679-49bd1cd30937-kube-api-access-76c4g" (OuterVolumeSpecName: "kube-api-access-76c4g") pod "61ddb04a-ed6e-4873-8679-49bd1cd30937" (UID: "61ddb04a-ed6e-4873-8679-49bd1cd30937"). InnerVolumeSpecName "kube-api-access-76c4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:36:04 crc kubenswrapper[4766]: I1002 12:36:04.709356 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61ddb04a-ed6e-4873-8679-49bd1cd30937-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61ddb04a-ed6e-4873-8679-49bd1cd30937" (UID: "61ddb04a-ed6e-4873-8679-49bd1cd30937"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:36:04 crc kubenswrapper[4766]: I1002 12:36:04.767945 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76c4g\" (UniqueName: \"kubernetes.io/projected/61ddb04a-ed6e-4873-8679-49bd1cd30937-kube-api-access-76c4g\") on node \"crc\" DevicePath \"\"" Oct 02 12:36:04 crc kubenswrapper[4766]: I1002 12:36:04.768450 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61ddb04a-ed6e-4873-8679-49bd1cd30937-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:36:04 crc kubenswrapper[4766]: I1002 12:36:04.768461 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61ddb04a-ed6e-4873-8679-49bd1cd30937-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.456233 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l5n6p" event={"ID":"61ddb04a-ed6e-4873-8679-49bd1cd30937","Type":"ContainerDied","Data":"3b9959103d38003508b71affd94b50d5c4b0db6607e6555dba2a722518e30479"} Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.456314 4766 scope.go:117] "RemoveContainer" containerID="5c1652293854e25504e078ffddfe492a46ea5a58cb73153466c74e4616ebd145" Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.456380 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l5n6p" Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.506900 4766 scope.go:117] "RemoveContainer" containerID="41a0ba6adbb67a50afa5f7b17666eced0c99ca2271c411dcac3e10e57d8f87ac" Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.511764 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s8p49"] Oct 02 12:36:05 crc kubenswrapper[4766]: E1002 12:36:05.512649 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61ddb04a-ed6e-4873-8679-49bd1cd30937" containerName="extract-utilities" Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.512681 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="61ddb04a-ed6e-4873-8679-49bd1cd30937" containerName="extract-utilities" Oct 02 12:36:05 crc kubenswrapper[4766]: E1002 12:36:05.512737 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058e43e2-30b3-46c3-993d-2ae5b1e076fb" containerName="horizon" Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.512756 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="058e43e2-30b3-46c3-993d-2ae5b1e076fb" containerName="horizon" Oct 02 12:36:05 crc kubenswrapper[4766]: E1002 12:36:05.512780 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="058e43e2-30b3-46c3-993d-2ae5b1e076fb" containerName="horizon-log" Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.512790 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="058e43e2-30b3-46c3-993d-2ae5b1e076fb" containerName="horizon-log" Oct 02 12:36:05 crc kubenswrapper[4766]: E1002 12:36:05.512831 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61ddb04a-ed6e-4873-8679-49bd1cd30937" containerName="registry-server" Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.512842 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="61ddb04a-ed6e-4873-8679-49bd1cd30937" containerName="registry-server" Oct 02 12:36:05 crc kubenswrapper[4766]: E1002 12:36:05.512870 4766 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61ddb04a-ed6e-4873-8679-49bd1cd30937" containerName="extract-content"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.512884 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="61ddb04a-ed6e-4873-8679-49bd1cd30937" containerName="extract-content"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.513269 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="058e43e2-30b3-46c3-993d-2ae5b1e076fb" containerName="horizon"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.513319 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="61ddb04a-ed6e-4873-8679-49bd1cd30937" containerName="registry-server"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.513378 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="058e43e2-30b3-46c3-993d-2ae5b1e076fb" containerName="horizon-log"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.516191 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.537609 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l5n6p"]
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.567753 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l5n6p"]
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.571287 4766 scope.go:117] "RemoveContainer" containerID="7f9c22d624b94d060dd87b07bb760895dae8acb7fb3d4254ebe6a52c340942d8"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.579679 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8p49"]
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.595229 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b24c4ba-d542-4483-b5e3-829d5cff486a-catalog-content\") pod \"redhat-marketplace-s8p49\" (UID: \"1b24c4ba-d542-4483-b5e3-829d5cff486a\") " pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.595406 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpfxr\" (UniqueName: \"kubernetes.io/projected/1b24c4ba-d542-4483-b5e3-829d5cff486a-kube-api-access-jpfxr\") pod \"redhat-marketplace-s8p49\" (UID: \"1b24c4ba-d542-4483-b5e3-829d5cff486a\") " pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.596142 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b24c4ba-d542-4483-b5e3-829d5cff486a-utilities\") pod \"redhat-marketplace-s8p49\" (UID: \"1b24c4ba-d542-4483-b5e3-829d5cff486a\") " pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.698241 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b24c4ba-d542-4483-b5e3-829d5cff486a-catalog-content\") pod \"redhat-marketplace-s8p49\" (UID: \"1b24c4ba-d542-4483-b5e3-829d5cff486a\") " pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.698316 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpfxr\" (UniqueName: \"kubernetes.io/projected/1b24c4ba-d542-4483-b5e3-829d5cff486a-kube-api-access-jpfxr\") pod \"redhat-marketplace-s8p49\" (UID: \"1b24c4ba-d542-4483-b5e3-829d5cff486a\") " pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.698444 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b24c4ba-d542-4483-b5e3-829d5cff486a-utilities\") pod \"redhat-marketplace-s8p49\" (UID: \"1b24c4ba-d542-4483-b5e3-829d5cff486a\") " pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.698957 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b24c4ba-d542-4483-b5e3-829d5cff486a-catalog-content\") pod \"redhat-marketplace-s8p49\" (UID: \"1b24c4ba-d542-4483-b5e3-829d5cff486a\") " pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.698987 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b24c4ba-d542-4483-b5e3-829d5cff486a-utilities\") pod \"redhat-marketplace-s8p49\" (UID: \"1b24c4ba-d542-4483-b5e3-829d5cff486a\") " pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.721054 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpfxr\" (UniqueName: \"kubernetes.io/projected/1b24c4ba-d542-4483-b5e3-829d5cff486a-kube-api-access-jpfxr\") pod \"redhat-marketplace-s8p49\" (UID: \"1b24c4ba-d542-4483-b5e3-829d5cff486a\") " pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.894254 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61ddb04a-ed6e-4873-8679-49bd1cd30937" path="/var/lib/kubelet/pods/61ddb04a-ed6e-4873-8679-49bd1cd30937/volumes"
Oct 02 12:36:05 crc kubenswrapper[4766]: I1002 12:36:05.955590 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:06 crc kubenswrapper[4766]: I1002 12:36:06.442087 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8p49"]
Oct 02 12:36:06 crc kubenswrapper[4766]: I1002 12:36:06.471900 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8p49" event={"ID":"1b24c4ba-d542-4483-b5e3-829d5cff486a","Type":"ContainerStarted","Data":"c125a7a187a3109c0934db521c9b2576315877cf3d10137cbd65b5be67b9e00c"}
Oct 02 12:36:07 crc kubenswrapper[4766]: I1002 12:36:07.054114 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-96qjq"]
Oct 02 12:36:07 crc kubenswrapper[4766]: I1002 12:36:07.076466 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-96qjq"]
Oct 02 12:36:07 crc kubenswrapper[4766]: I1002 12:36:07.489979 4766 generic.go:334] "Generic (PLEG): container finished" podID="1b24c4ba-d542-4483-b5e3-829d5cff486a" containerID="918a061cbb7ef30327faa8c6bc8b80e8fd5c1e46cba15046e747cb57a2551945" exitCode=0
Oct 02 12:36:07 crc kubenswrapper[4766]: I1002 12:36:07.490034 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8p49" event={"ID":"1b24c4ba-d542-4483-b5e3-829d5cff486a","Type":"ContainerDied","Data":"918a061cbb7ef30327faa8c6bc8b80e8fd5c1e46cba15046e747cb57a2551945"}
Oct 02 12:36:07 crc kubenswrapper[4766]: I1002 12:36:07.895638 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="690f940e-9bfd-4423-8f76-3e3d5c55347d" path="/var/lib/kubelet/pods/690f940e-9bfd-4423-8f76-3e3d5c55347d/volumes"
Oct 02 12:36:09 crc kubenswrapper[4766]: I1002 12:36:09.514199 4766 generic.go:334] "Generic (PLEG): container finished" podID="1b24c4ba-d542-4483-b5e3-829d5cff486a" containerID="199849a411b232fa852f3cffb4d7fab30a674025c344df73c02f86ed8cc6e10f" exitCode=0
Oct 02 12:36:09 crc kubenswrapper[4766]: I1002 12:36:09.514266 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8p49" event={"ID":"1b24c4ba-d542-4483-b5e3-829d5cff486a","Type":"ContainerDied","Data":"199849a411b232fa852f3cffb4d7fab30a674025c344df73c02f86ed8cc6e10f"}
Oct 02 12:36:10 crc kubenswrapper[4766]: I1002 12:36:10.526777 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8p49" event={"ID":"1b24c4ba-d542-4483-b5e3-829d5cff486a","Type":"ContainerStarted","Data":"d66e0ce46cfa1548a16ae98b92d08a730c54e991c75283fac224d11851b82879"}
Oct 02 12:36:10 crc kubenswrapper[4766]: I1002 12:36:10.555440 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s8p49" podStartSLOduration=2.828117814 podStartE2EDuration="5.555410315s" podCreationTimestamp="2025-10-02 12:36:05 +0000 UTC" firstStartedPulling="2025-10-02 12:36:07.493968421 +0000 UTC m=+6282.436839365" lastFinishedPulling="2025-10-02 12:36:10.221260922 +0000 UTC m=+6285.164131866" observedRunningTime="2025-10-02 12:36:10.544907739 +0000 UTC m=+6285.487778693" watchObservedRunningTime="2025-10-02 12:36:10.555410315 +0000 UTC m=+6285.498281259"
Oct 02 12:36:15 crc kubenswrapper[4766]: I1002 12:36:15.956120 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:15 crc kubenswrapper[4766]: I1002 12:36:15.957268 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:16 crc kubenswrapper[4766]: I1002 12:36:16.014060 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:16 crc kubenswrapper[4766]: I1002 12:36:16.655981 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:16 crc kubenswrapper[4766]: I1002 12:36:16.712028 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8p49"]
Oct 02 12:36:17 crc kubenswrapper[4766]: I1002 12:36:17.050386 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-e3d4-account-create-fknc7"]
Oct 02 12:36:17 crc kubenswrapper[4766]: I1002 12:36:17.062455 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-e3d4-account-create-fknc7"]
Oct 02 12:36:17 crc kubenswrapper[4766]: I1002 12:36:17.899425 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15e20b4c-4af1-42f7-8391-8b187672bb16" path="/var/lib/kubelet/pods/15e20b4c-4af1-42f7-8391-8b187672bb16/volumes"
Oct 02 12:36:18 crc kubenswrapper[4766]: I1002 12:36:18.621643 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s8p49" podUID="1b24c4ba-d542-4483-b5e3-829d5cff486a" containerName="registry-server" containerID="cri-o://d66e0ce46cfa1548a16ae98b92d08a730c54e991c75283fac224d11851b82879" gracePeriod=2
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.140408 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.238377 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b24c4ba-d542-4483-b5e3-829d5cff486a-utilities\") pod \"1b24c4ba-d542-4483-b5e3-829d5cff486a\" (UID: \"1b24c4ba-d542-4483-b5e3-829d5cff486a\") "
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.238521 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b24c4ba-d542-4483-b5e3-829d5cff486a-catalog-content\") pod \"1b24c4ba-d542-4483-b5e3-829d5cff486a\" (UID: \"1b24c4ba-d542-4483-b5e3-829d5cff486a\") "
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.238662 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpfxr\" (UniqueName: \"kubernetes.io/projected/1b24c4ba-d542-4483-b5e3-829d5cff486a-kube-api-access-jpfxr\") pod \"1b24c4ba-d542-4483-b5e3-829d5cff486a\" (UID: \"1b24c4ba-d542-4483-b5e3-829d5cff486a\") "
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.239685 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b24c4ba-d542-4483-b5e3-829d5cff486a-utilities" (OuterVolumeSpecName: "utilities") pod "1b24c4ba-d542-4483-b5e3-829d5cff486a" (UID: "1b24c4ba-d542-4483-b5e3-829d5cff486a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.248029 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b24c4ba-d542-4483-b5e3-829d5cff486a-kube-api-access-jpfxr" (OuterVolumeSpecName: "kube-api-access-jpfxr") pod "1b24c4ba-d542-4483-b5e3-829d5cff486a" (UID: "1b24c4ba-d542-4483-b5e3-829d5cff486a"). InnerVolumeSpecName "kube-api-access-jpfxr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.254386 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b24c4ba-d542-4483-b5e3-829d5cff486a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b24c4ba-d542-4483-b5e3-829d5cff486a" (UID: "1b24c4ba-d542-4483-b5e3-829d5cff486a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.341807 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b24c4ba-d542-4483-b5e3-829d5cff486a-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.341847 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b24c4ba-d542-4483-b5e3-829d5cff486a-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.341859 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpfxr\" (UniqueName: \"kubernetes.io/projected/1b24c4ba-d542-4483-b5e3-829d5cff486a-kube-api-access-jpfxr\") on node \"crc\" DevicePath \"\""
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.634313 4766 generic.go:334] "Generic (PLEG): container finished" podID="1b24c4ba-d542-4483-b5e3-829d5cff486a" containerID="d66e0ce46cfa1548a16ae98b92d08a730c54e991c75283fac224d11851b82879" exitCode=0
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.634375 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8p49" event={"ID":"1b24c4ba-d542-4483-b5e3-829d5cff486a","Type":"ContainerDied","Data":"d66e0ce46cfa1548a16ae98b92d08a730c54e991c75283fac224d11851b82879"}
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.634430 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s8p49" event={"ID":"1b24c4ba-d542-4483-b5e3-829d5cff486a","Type":"ContainerDied","Data":"c125a7a187a3109c0934db521c9b2576315877cf3d10137cbd65b5be67b9e00c"}
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.634446 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s8p49"
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.634457 4766 scope.go:117] "RemoveContainer" containerID="d66e0ce46cfa1548a16ae98b92d08a730c54e991c75283fac224d11851b82879"
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.685053 4766 scope.go:117] "RemoveContainer" containerID="199849a411b232fa852f3cffb4d7fab30a674025c344df73c02f86ed8cc6e10f"
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.690287 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8p49"]
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.702075 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s8p49"]
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.712086 4766 scope.go:117] "RemoveContainer" containerID="918a061cbb7ef30327faa8c6bc8b80e8fd5c1e46cba15046e747cb57a2551945"
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.756446 4766 scope.go:117] "RemoveContainer" containerID="d66e0ce46cfa1548a16ae98b92d08a730c54e991c75283fac224d11851b82879"
Oct 02 12:36:19 crc kubenswrapper[4766]: E1002 12:36:19.757119 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d66e0ce46cfa1548a16ae98b92d08a730c54e991c75283fac224d11851b82879\": container with ID starting with d66e0ce46cfa1548a16ae98b92d08a730c54e991c75283fac224d11851b82879 not found: ID does not exist" containerID="d66e0ce46cfa1548a16ae98b92d08a730c54e991c75283fac224d11851b82879"
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.757185 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d66e0ce46cfa1548a16ae98b92d08a730c54e991c75283fac224d11851b82879"} err="failed to get container status \"d66e0ce46cfa1548a16ae98b92d08a730c54e991c75283fac224d11851b82879\": rpc error: code = NotFound desc = could not find container \"d66e0ce46cfa1548a16ae98b92d08a730c54e991c75283fac224d11851b82879\": container with ID starting with d66e0ce46cfa1548a16ae98b92d08a730c54e991c75283fac224d11851b82879 not found: ID does not exist"
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.757232 4766 scope.go:117] "RemoveContainer" containerID="199849a411b232fa852f3cffb4d7fab30a674025c344df73c02f86ed8cc6e10f"
Oct 02 12:36:19 crc kubenswrapper[4766]: E1002 12:36:19.758007 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"199849a411b232fa852f3cffb4d7fab30a674025c344df73c02f86ed8cc6e10f\": container with ID starting with 199849a411b232fa852f3cffb4d7fab30a674025c344df73c02f86ed8cc6e10f not found: ID does not exist" containerID="199849a411b232fa852f3cffb4d7fab30a674025c344df73c02f86ed8cc6e10f"
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.758047 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"199849a411b232fa852f3cffb4d7fab30a674025c344df73c02f86ed8cc6e10f"} err="failed to get container status \"199849a411b232fa852f3cffb4d7fab30a674025c344df73c02f86ed8cc6e10f\": rpc error: code = NotFound desc = could not find container \"199849a411b232fa852f3cffb4d7fab30a674025c344df73c02f86ed8cc6e10f\": container with ID starting with 199849a411b232fa852f3cffb4d7fab30a674025c344df73c02f86ed8cc6e10f not found: ID does not exist"
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.758087 4766 scope.go:117] "RemoveContainer" containerID="918a061cbb7ef30327faa8c6bc8b80e8fd5c1e46cba15046e747cb57a2551945"
Oct 02 12:36:19 crc kubenswrapper[4766]: E1002 12:36:19.758694 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"918a061cbb7ef30327faa8c6bc8b80e8fd5c1e46cba15046e747cb57a2551945\": container with ID starting with 918a061cbb7ef30327faa8c6bc8b80e8fd5c1e46cba15046e747cb57a2551945 not found: ID does not exist" containerID="918a061cbb7ef30327faa8c6bc8b80e8fd5c1e46cba15046e747cb57a2551945"
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.758732 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"918a061cbb7ef30327faa8c6bc8b80e8fd5c1e46cba15046e747cb57a2551945"} err="failed to get container status \"918a061cbb7ef30327faa8c6bc8b80e8fd5c1e46cba15046e747cb57a2551945\": rpc error: code = NotFound desc = could not find container \"918a061cbb7ef30327faa8c6bc8b80e8fd5c1e46cba15046e747cb57a2551945\": container with ID starting with 918a061cbb7ef30327faa8c6bc8b80e8fd5c1e46cba15046e747cb57a2551945 not found: ID does not exist"
Oct 02 12:36:19 crc kubenswrapper[4766]: I1002 12:36:19.894673 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b24c4ba-d542-4483-b5e3-829d5cff486a" path="/var/lib/kubelet/pods/1b24c4ba-d542-4483-b5e3-829d5cff486a/volumes"
Oct 02 12:36:24 crc kubenswrapper[4766]: I1002 12:36:24.432550 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:36:24 crc kubenswrapper[4766]: I1002 12:36:24.433797 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:36:25 crc kubenswrapper[4766]: I1002 12:36:25.037548 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-zhtb7"]
Oct 02 12:36:25 crc kubenswrapper[4766]: I1002 12:36:25.052016 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-zhtb7"]
Oct 02 12:36:25 crc kubenswrapper[4766]: I1002 12:36:25.713312 4766 generic.go:334] "Generic (PLEG): container finished" podID="df4ca968-dcb2-434e-8442-81e871efe544" containerID="a3360a36909acc808689ca870bd24d2c07667ae85c8d31378e12f5e3e3a06b1b" exitCode=137
Oct 02 12:36:25 crc kubenswrapper[4766]: I1002 12:36:25.713493 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575f5495ff-tkktv" event={"ID":"df4ca968-dcb2-434e-8442-81e871efe544","Type":"ContainerDied","Data":"a3360a36909acc808689ca870bd24d2c07667ae85c8d31378e12f5e3e3a06b1b"}
Oct 02 12:36:25 crc kubenswrapper[4766]: I1002 12:36:25.900605 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd100b70-ea16-47b3-ac1b-6ec049ff4ee7" path="/var/lib/kubelet/pods/dd100b70-ea16-47b3-ac1b-6ec049ff4ee7/volumes"
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.063597 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.112894 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df4ca968-dcb2-434e-8442-81e871efe544-config-data\") pod \"df4ca968-dcb2-434e-8442-81e871efe544\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") "
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.112996 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wc5xp\" (UniqueName: \"kubernetes.io/projected/df4ca968-dcb2-434e-8442-81e871efe544-kube-api-access-wc5xp\") pod \"df4ca968-dcb2-434e-8442-81e871efe544\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") "
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.113139 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/df4ca968-dcb2-434e-8442-81e871efe544-horizon-secret-key\") pod \"df4ca968-dcb2-434e-8442-81e871efe544\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") "
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.113247 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df4ca968-dcb2-434e-8442-81e871efe544-logs\") pod \"df4ca968-dcb2-434e-8442-81e871efe544\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") "
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.113317 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df4ca968-dcb2-434e-8442-81e871efe544-scripts\") pod \"df4ca968-dcb2-434e-8442-81e871efe544\" (UID: \"df4ca968-dcb2-434e-8442-81e871efe544\") "
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.114245 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df4ca968-dcb2-434e-8442-81e871efe544-logs" (OuterVolumeSpecName: "logs") pod "df4ca968-dcb2-434e-8442-81e871efe544" (UID: "df4ca968-dcb2-434e-8442-81e871efe544"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.121747 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df4ca968-dcb2-434e-8442-81e871efe544-kube-api-access-wc5xp" (OuterVolumeSpecName: "kube-api-access-wc5xp") pod "df4ca968-dcb2-434e-8442-81e871efe544" (UID: "df4ca968-dcb2-434e-8442-81e871efe544"). InnerVolumeSpecName "kube-api-access-wc5xp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.122370 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df4ca968-dcb2-434e-8442-81e871efe544-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "df4ca968-dcb2-434e-8442-81e871efe544" (UID: "df4ca968-dcb2-434e-8442-81e871efe544"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.144742 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df4ca968-dcb2-434e-8442-81e871efe544-scripts" (OuterVolumeSpecName: "scripts") pod "df4ca968-dcb2-434e-8442-81e871efe544" (UID: "df4ca968-dcb2-434e-8442-81e871efe544"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.150474 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df4ca968-dcb2-434e-8442-81e871efe544-config-data" (OuterVolumeSpecName: "config-data") pod "df4ca968-dcb2-434e-8442-81e871efe544" (UID: "df4ca968-dcb2-434e-8442-81e871efe544"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.215369 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/df4ca968-dcb2-434e-8442-81e871efe544-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.215424 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wc5xp\" (UniqueName: \"kubernetes.io/projected/df4ca968-dcb2-434e-8442-81e871efe544-kube-api-access-wc5xp\") on node \"crc\" DevicePath \"\""
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.215442 4766 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/df4ca968-dcb2-434e-8442-81e871efe544-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.215456 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df4ca968-dcb2-434e-8442-81e871efe544-logs\") on node \"crc\" DevicePath \"\""
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.215469 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df4ca968-dcb2-434e-8442-81e871efe544-scripts\") on node \"crc\" DevicePath \"\""
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.730386 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-575f5495ff-tkktv" event={"ID":"df4ca968-dcb2-434e-8442-81e871efe544","Type":"ContainerDied","Data":"76d50d3c1faec2ac3b220795f30441c4554ed99996a53cc8f8b1d5acc879887b"}
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.730458 4766 scope.go:117] "RemoveContainer" containerID="fd12c84becc00d1d29a25462954141a9c3a7f6a2b128641fb158a6a7a3ab9496"
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.730571 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-575f5495ff-tkktv"
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.794309 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-575f5495ff-tkktv"]
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.806302 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-575f5495ff-tkktv"]
Oct 02 12:36:26 crc kubenswrapper[4766]: I1002 12:36:26.963951 4766 scope.go:117] "RemoveContainer" containerID="a3360a36909acc808689ca870bd24d2c07667ae85c8d31378e12f5e3e3a06b1b"
Oct 02 12:36:27 crc kubenswrapper[4766]: I1002 12:36:27.895882 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df4ca968-dcb2-434e-8442-81e871efe544" path="/var/lib/kubelet/pods/df4ca968-dcb2-434e-8442-81e871efe544/volumes"
Oct 02 12:36:37 crc kubenswrapper[4766]: I1002 12:36:37.577464 4766 scope.go:117] "RemoveContainer" containerID="bcce8611cd6371522e4b9a78afe114f325e07d3d293640d426de5d0afe64e439"
Oct 02 12:36:37 crc kubenswrapper[4766]: I1002 12:36:37.630592 4766 scope.go:117] "RemoveContainer" containerID="042375723cc3113382d4f430b4bc84948ad966137380389ed9a04aecba9152f4"
Oct 02 12:36:37 crc kubenswrapper[4766]: I1002 12:36:37.660324 4766 scope.go:117] "RemoveContainer" containerID="d5d5aff3b8057332d11063cbf80d0d045f4a8f2111b3026cf7cefdd510f528ac"
Oct 02 12:36:54 crc kubenswrapper[4766]: I1002 12:36:54.432360 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:36:54 crc kubenswrapper[4766]: I1002 12:36:54.433088 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:36:57 crc kubenswrapper[4766]: I1002 12:36:57.046586 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-7tgcb"]
Oct 02 12:36:57 crc kubenswrapper[4766]: I1002 12:36:57.059302 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-7tgcb"]
Oct 02 12:36:57 crc kubenswrapper[4766]: I1002 12:36:57.892129 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="067d7606-7cf3-433d-8908-8d4e5fcae88c" path="/var/lib/kubelet/pods/067d7606-7cf3-433d-8908-8d4e5fcae88c/volumes"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.526589 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5d8dc9db9c-bxckd"]
Oct 02 12:37:00 crc kubenswrapper[4766]: E1002 12:37:00.527658 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4ca968-dcb2-434e-8442-81e871efe544" containerName="horizon-log"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.527671 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4ca968-dcb2-434e-8442-81e871efe544" containerName="horizon-log"
Oct 02 12:37:00 crc kubenswrapper[4766]: E1002 12:37:00.527685 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b24c4ba-d542-4483-b5e3-829d5cff486a" containerName="extract-content"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.527691 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b24c4ba-d542-4483-b5e3-829d5cff486a" containerName="extract-content"
Oct 02 12:37:00 crc kubenswrapper[4766]: E1002 12:37:00.527708 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b24c4ba-d542-4483-b5e3-829d5cff486a" containerName="extract-utilities"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.527717 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b24c4ba-d542-4483-b5e3-829d5cff486a" containerName="extract-utilities"
Oct 02 12:37:00 crc kubenswrapper[4766]: E1002 12:37:00.527740 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b24c4ba-d542-4483-b5e3-829d5cff486a" containerName="registry-server"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.527746 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b24c4ba-d542-4483-b5e3-829d5cff486a" containerName="registry-server"
Oct 02 12:37:00 crc kubenswrapper[4766]: E1002 12:37:00.527771 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df4ca968-dcb2-434e-8442-81e871efe544" containerName="horizon"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.527777 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="df4ca968-dcb2-434e-8442-81e871efe544" containerName="horizon"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.527948 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4ca968-dcb2-434e-8442-81e871efe544" containerName="horizon-log"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.527969 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="df4ca968-dcb2-434e-8442-81e871efe544" containerName="horizon"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.527989 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b24c4ba-d542-4483-b5e3-829d5cff486a" containerName="registry-server"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.529183 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.576955 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d8dc9db9c-bxckd"]
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.688597 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adfb66d8-7e20-477f-adce-87cacf4382d5-horizon-secret-key\") pod \"horizon-5d8dc9db9c-bxckd\" (UID: \"adfb66d8-7e20-477f-adce-87cacf4382d5\") " pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.689014 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adfb66d8-7e20-477f-adce-87cacf4382d5-config-data\") pod \"horizon-5d8dc9db9c-bxckd\" (UID: \"adfb66d8-7e20-477f-adce-87cacf4382d5\") " pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.689135 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjjzb\" (UniqueName: \"kubernetes.io/projected/adfb66d8-7e20-477f-adce-87cacf4382d5-kube-api-access-gjjzb\") pod \"horizon-5d8dc9db9c-bxckd\" (UID: \"adfb66d8-7e20-477f-adce-87cacf4382d5\") " pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.689459 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adfb66d8-7e20-477f-adce-87cacf4382d5-scripts\") pod \"horizon-5d8dc9db9c-bxckd\" (UID: \"adfb66d8-7e20-477f-adce-87cacf4382d5\") " pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.689635 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adfb66d8-7e20-477f-adce-87cacf4382d5-logs\") pod \"horizon-5d8dc9db9c-bxckd\" (UID: \"adfb66d8-7e20-477f-adce-87cacf4382d5\") " pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.792172 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adfb66d8-7e20-477f-adce-87cacf4382d5-horizon-secret-key\") pod \"horizon-5d8dc9db9c-bxckd\" (UID: \"adfb66d8-7e20-477f-adce-87cacf4382d5\") " pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.792250 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adfb66d8-7e20-477f-adce-87cacf4382d5-config-data\") pod \"horizon-5d8dc9db9c-bxckd\" (UID: \"adfb66d8-7e20-477f-adce-87cacf4382d5\") " pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.792320 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjjzb\" (UniqueName: \"kubernetes.io/projected/adfb66d8-7e20-477f-adce-87cacf4382d5-kube-api-access-gjjzb\") pod \"horizon-5d8dc9db9c-bxckd\" (UID: \"adfb66d8-7e20-477f-adce-87cacf4382d5\") " pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.792374 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adfb66d8-7e20-477f-adce-87cacf4382d5-scripts\") pod \"horizon-5d8dc9db9c-bxckd\" (UID: \"adfb66d8-7e20-477f-adce-87cacf4382d5\") " pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.792544 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adfb66d8-7e20-477f-adce-87cacf4382d5-logs\") pod \"horizon-5d8dc9db9c-bxckd\" (UID: \"adfb66d8-7e20-477f-adce-87cacf4382d5\") " pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.793052 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adfb66d8-7e20-477f-adce-87cacf4382d5-logs\") pod \"horizon-5d8dc9db9c-bxckd\" (UID: \"adfb66d8-7e20-477f-adce-87cacf4382d5\") " pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.793337 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adfb66d8-7e20-477f-adce-87cacf4382d5-scripts\") pod \"horizon-5d8dc9db9c-bxckd\" (UID: \"adfb66d8-7e20-477f-adce-87cacf4382d5\") " pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.794554 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adfb66d8-7e20-477f-adce-87cacf4382d5-config-data\") pod \"horizon-5d8dc9db9c-bxckd\" (UID: \"adfb66d8-7e20-477f-adce-87cacf4382d5\") " pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.800245 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adfb66d8-7e20-477f-adce-87cacf4382d5-horizon-secret-key\") pod \"horizon-5d8dc9db9c-bxckd\" (UID: \"adfb66d8-7e20-477f-adce-87cacf4382d5\") " pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.812604 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjjzb\" (UniqueName: \"kubernetes.io/projected/adfb66d8-7e20-477f-adce-87cacf4382d5-kube-api-access-gjjzb\") pod \"horizon-5d8dc9db9c-bxckd\" (UID: \"adfb66d8-7e20-477f-adce-87cacf4382d5\") " pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:00 crc kubenswrapper[4766]: I1002 12:37:00.904316 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:01 crc kubenswrapper[4766]: I1002 12:37:01.446463 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5d8dc9db9c-bxckd"]
Oct 02 12:37:02 crc kubenswrapper[4766]: I1002 12:37:02.204774 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d8dc9db9c-bxckd" event={"ID":"adfb66d8-7e20-477f-adce-87cacf4382d5","Type":"ContainerStarted","Data":"aed5267d845cf6ce71fa3512883dc20dfb0aa39ae4b0c4c2245bf8e00dc52700"}
Oct 02 12:37:02 crc kubenswrapper[4766]: I1002 12:37:02.205180 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d8dc9db9c-bxckd" event={"ID":"adfb66d8-7e20-477f-adce-87cacf4382d5","Type":"ContainerStarted","Data":"6d2d47896b49b1f8f4accc3a2fa5ae36d16ef048ab6f88db6419aeac28549f87"}
Oct 02 12:37:02 crc kubenswrapper[4766]: I1002 12:37:02.205194 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5d8dc9db9c-bxckd" event={"ID":"adfb66d8-7e20-477f-adce-87cacf4382d5","Type":"ContainerStarted","Data":"081ef96abdfe4352fea0c011f17507ded0806f36d60b406353e2f328ef5a920d"}
Oct 02 12:37:02 crc kubenswrapper[4766]: I1002 12:37:02.349164 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5d8dc9db9c-bxckd" podStartSLOduration=2.349134856 podStartE2EDuration="2.349134856s" podCreationTimestamp="2025-10-02 12:37:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:37:02.233127951 +0000 UTC m=+6337.175998935" watchObservedRunningTime="2025-10-02 12:37:02.349134856 +0000 UTC m=+6337.292005790"
Oct 02 12:37:02 crc kubenswrapper[4766]: I1002 12:37:02.356653 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-zff98"]
Oct 02 12:37:02 crc kubenswrapper[4766]: I1002 12:37:02.358286 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-zff98"
Oct 02 12:37:02 crc kubenswrapper[4766]: I1002 12:37:02.370089 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-zff98"]
Oct 02 12:37:02 crc kubenswrapper[4766]: I1002 12:37:02.433989 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cc5s\" (UniqueName: \"kubernetes.io/projected/98c2cafd-3943-4964-a407-348c81b0b416-kube-api-access-8cc5s\") pod \"heat-db-create-zff98\" (UID: \"98c2cafd-3943-4964-a407-348c81b0b416\") " pod="openstack/heat-db-create-zff98"
Oct 02 12:37:02 crc kubenswrapper[4766]: I1002 12:37:02.536838 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cc5s\" (UniqueName: \"kubernetes.io/projected/98c2cafd-3943-4964-a407-348c81b0b416-kube-api-access-8cc5s\") pod \"heat-db-create-zff98\" (UID: \"98c2cafd-3943-4964-a407-348c81b0b416\") " pod="openstack/heat-db-create-zff98"
Oct 02 12:37:02 crc kubenswrapper[4766]: I1002 12:37:02.564233 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cc5s\" (UniqueName: \"kubernetes.io/projected/98c2cafd-3943-4964-a407-348c81b0b416-kube-api-access-8cc5s\") pod \"heat-db-create-zff98\" (UID: \"98c2cafd-3943-4964-a407-348c81b0b416\") " pod="openstack/heat-db-create-zff98"
Oct 02 12:37:02 crc kubenswrapper[4766]: I1002 12:37:02.678078 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-zff98"
Oct 02 12:37:03 crc kubenswrapper[4766]: I1002 12:37:03.253296 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-zff98"]
Oct 02 12:37:04 crc kubenswrapper[4766]: I1002 12:37:04.229820 4766 generic.go:334] "Generic (PLEG): container finished" podID="98c2cafd-3943-4964-a407-348c81b0b416" containerID="3c6912673af50e25395766b8fd03ba85852936101dd2472604fdbf19b2c441ea" exitCode=0
Oct 02 12:37:04 crc kubenswrapper[4766]: I1002 12:37:04.230246 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-zff98" event={"ID":"98c2cafd-3943-4964-a407-348c81b0b416","Type":"ContainerDied","Data":"3c6912673af50e25395766b8fd03ba85852936101dd2472604fdbf19b2c441ea"}
Oct 02 12:37:04 crc kubenswrapper[4766]: I1002 12:37:04.230283 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-zff98" event={"ID":"98c2cafd-3943-4964-a407-348c81b0b416","Type":"ContainerStarted","Data":"059bcc07122a63d3ffae77d623e511be384dba0237ef64d4d2a67d7acd78e494"}
Oct 02 12:37:05 crc kubenswrapper[4766]: I1002 12:37:05.684658 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-zff98"
Oct 02 12:37:05 crc kubenswrapper[4766]: I1002 12:37:05.818846 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cc5s\" (UniqueName: \"kubernetes.io/projected/98c2cafd-3943-4964-a407-348c81b0b416-kube-api-access-8cc5s\") pod \"98c2cafd-3943-4964-a407-348c81b0b416\" (UID: \"98c2cafd-3943-4964-a407-348c81b0b416\") "
Oct 02 12:37:05 crc kubenswrapper[4766]: I1002 12:37:05.827450 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c2cafd-3943-4964-a407-348c81b0b416-kube-api-access-8cc5s" (OuterVolumeSpecName: "kube-api-access-8cc5s") pod "98c2cafd-3943-4964-a407-348c81b0b416" (UID: "98c2cafd-3943-4964-a407-348c81b0b416"). InnerVolumeSpecName "kube-api-access-8cc5s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:37:05 crc kubenswrapper[4766]: I1002 12:37:05.922238 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cc5s\" (UniqueName: \"kubernetes.io/projected/98c2cafd-3943-4964-a407-348c81b0b416-kube-api-access-8cc5s\") on node \"crc\" DevicePath \"\""
Oct 02 12:37:06 crc kubenswrapper[4766]: I1002 12:37:06.251597 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-zff98" event={"ID":"98c2cafd-3943-4964-a407-348c81b0b416","Type":"ContainerDied","Data":"059bcc07122a63d3ffae77d623e511be384dba0237ef64d4d2a67d7acd78e494"}
Oct 02 12:37:06 crc kubenswrapper[4766]: I1002 12:37:06.251654 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="059bcc07122a63d3ffae77d623e511be384dba0237ef64d4d2a67d7acd78e494"
Oct 02 12:37:06 crc kubenswrapper[4766]: I1002 12:37:06.251705 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-zff98"
Oct 02 12:37:07 crc kubenswrapper[4766]: I1002 12:37:07.032911 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1150-account-create-cct6z"]
Oct 02 12:37:07 crc kubenswrapper[4766]: I1002 12:37:07.041823 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-1150-account-create-cct6z"]
Oct 02 12:37:07 crc kubenswrapper[4766]: I1002 12:37:07.895927 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13e3fa1c-1096-4c60-9191-b11de2440178" path="/var/lib/kubelet/pods/13e3fa1c-1096-4c60-9191-b11de2440178/volumes"
Oct 02 12:37:10 crc kubenswrapper[4766]: I1002 12:37:10.905437 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:10 crc kubenswrapper[4766]: I1002 12:37:10.905977 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5d8dc9db9c-bxckd"
Oct 02 12:37:12 crc kubenswrapper[4766]: I1002 12:37:12.476048 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-082b-account-create-nshkm"]
Oct 02 12:37:12 crc kubenswrapper[4766]: E1002 12:37:12.478579 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c2cafd-3943-4964-a407-348c81b0b416" containerName="mariadb-database-create"
Oct 02 12:37:12 crc kubenswrapper[4766]: I1002 12:37:12.478671 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c2cafd-3943-4964-a407-348c81b0b416" containerName="mariadb-database-create"
Oct 02 12:37:12 crc kubenswrapper[4766]: I1002 12:37:12.479078 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c2cafd-3943-4964-a407-348c81b0b416" containerName="mariadb-database-create"
Oct 02 12:37:12 crc kubenswrapper[4766]: I1002 12:37:12.480356 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-082b-account-create-nshkm"
Oct 02 12:37:12 crc kubenswrapper[4766]: I1002 12:37:12.484773 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Oct 02 12:37:12 crc kubenswrapper[4766]: I1002 12:37:12.491829 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-082b-account-create-nshkm"]
Oct 02 12:37:12 crc kubenswrapper[4766]: I1002 12:37:12.608515 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qclm\" (UniqueName: \"kubernetes.io/projected/86e9a00c-0b12-4517-8018-8164a05fac41-kube-api-access-8qclm\") pod \"heat-082b-account-create-nshkm\" (UID: \"86e9a00c-0b12-4517-8018-8164a05fac41\") " pod="openstack/heat-082b-account-create-nshkm"
Oct 02 12:37:12 crc kubenswrapper[4766]: I1002 12:37:12.710593 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qclm\" (UniqueName: \"kubernetes.io/projected/86e9a00c-0b12-4517-8018-8164a05fac41-kube-api-access-8qclm\") pod \"heat-082b-account-create-nshkm\" (UID: \"86e9a00c-0b12-4517-8018-8164a05fac41\") " pod="openstack/heat-082b-account-create-nshkm"
Oct 02 12:37:12 crc kubenswrapper[4766]: I1002 12:37:12.737085 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qclm\" (UniqueName: \"kubernetes.io/projected/86e9a00c-0b12-4517-8018-8164a05fac41-kube-api-access-8qclm\") pod \"heat-082b-account-create-nshkm\" (UID: \"86e9a00c-0b12-4517-8018-8164a05fac41\") " pod="openstack/heat-082b-account-create-nshkm"
Oct 02 12:37:12 crc kubenswrapper[4766]: I1002 12:37:12.810110 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-082b-account-create-nshkm"
Oct 02 12:37:13 crc kubenswrapper[4766]: I1002 12:37:13.048557 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-fgnx6"]
Oct 02 12:37:13 crc kubenswrapper[4766]: I1002 12:37:13.063883 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-fgnx6"]
Oct 02 12:37:13 crc kubenswrapper[4766]: I1002 12:37:13.372876 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-082b-account-create-nshkm" event={"ID":"86e9a00c-0b12-4517-8018-8164a05fac41","Type":"ContainerStarted","Data":"d1bb85f8393a61a096fb22ec1d97e8e4cfdb405104ba83f20243347a6ca16953"}
Oct 02 12:37:13 crc kubenswrapper[4766]: I1002 12:37:13.395358 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-082b-account-create-nshkm"]
Oct 02 12:37:13 crc kubenswrapper[4766]: I1002 12:37:13.893926 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15fe436b-0401-43b0-910f-529ae2ed73d1" path="/var/lib/kubelet/pods/15fe436b-0401-43b0-910f-529ae2ed73d1/volumes"
Oct 02 12:37:14 crc kubenswrapper[4766]: I1002 12:37:14.384655 4766 generic.go:334] "Generic (PLEG): container finished" podID="86e9a00c-0b12-4517-8018-8164a05fac41" containerID="ec51ccd2e26d2c1a0aafcb13a06f150dbd55dfa6c7812bacfbe1c4a372a51dd2" exitCode=0
Oct 02 12:37:14 crc kubenswrapper[4766]: I1002 12:37:14.384723 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-082b-account-create-nshkm" event={"ID":"86e9a00c-0b12-4517-8018-8164a05fac41","Type":"ContainerDied","Data":"ec51ccd2e26d2c1a0aafcb13a06f150dbd55dfa6c7812bacfbe1c4a372a51dd2"}
Oct 02 12:37:15 crc kubenswrapper[4766]: I1002 12:37:15.793803 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-082b-account-create-nshkm"
Oct 02 12:37:15 crc kubenswrapper[4766]: I1002 12:37:15.906803 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qclm\" (UniqueName: \"kubernetes.io/projected/86e9a00c-0b12-4517-8018-8164a05fac41-kube-api-access-8qclm\") pod \"86e9a00c-0b12-4517-8018-8164a05fac41\" (UID: \"86e9a00c-0b12-4517-8018-8164a05fac41\") "
Oct 02 12:37:15 crc kubenswrapper[4766]: I1002 12:37:15.925754 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86e9a00c-0b12-4517-8018-8164a05fac41-kube-api-access-8qclm" (OuterVolumeSpecName: "kube-api-access-8qclm") pod "86e9a00c-0b12-4517-8018-8164a05fac41" (UID: "86e9a00c-0b12-4517-8018-8164a05fac41"). InnerVolumeSpecName "kube-api-access-8qclm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:37:16 crc kubenswrapper[4766]: I1002 12:37:16.011079 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qclm\" (UniqueName: \"kubernetes.io/projected/86e9a00c-0b12-4517-8018-8164a05fac41-kube-api-access-8qclm\") on node \"crc\" DevicePath \"\""
Oct 02 12:37:16 crc kubenswrapper[4766]: I1002 12:37:16.409256 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-082b-account-create-nshkm" event={"ID":"86e9a00c-0b12-4517-8018-8164a05fac41","Type":"ContainerDied","Data":"d1bb85f8393a61a096fb22ec1d97e8e4cfdb405104ba83f20243347a6ca16953"}
Oct 02 12:37:16 crc kubenswrapper[4766]: I1002 12:37:16.409312 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1bb85f8393a61a096fb22ec1d97e8e4cfdb405104ba83f20243347a6ca16953"
Oct 02 12:37:16 crc kubenswrapper[4766]: I1002 12:37:16.409379 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-082b-account-create-nshkm"
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.554630 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-h4jks"]
Oct 02 12:37:17 crc kubenswrapper[4766]: E1002 12:37:17.555809 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86e9a00c-0b12-4517-8018-8164a05fac41" containerName="mariadb-account-create"
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.555831 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="86e9a00c-0b12-4517-8018-8164a05fac41" containerName="mariadb-account-create"
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.556105 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="86e9a00c-0b12-4517-8018-8164a05fac41" containerName="mariadb-account-create"
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.557290 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-h4jks"
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.560098 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data"
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.560274 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-qf9gs"
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.575495 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-h4jks"]
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.660574 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46aae5f7-366f-4467-bc67-e662384d164b-config-data\") pod \"heat-db-sync-h4jks\" (UID: \"46aae5f7-366f-4467-bc67-e662384d164b\") " pod="openstack/heat-db-sync-h4jks"
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.660665 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46aae5f7-366f-4467-bc67-e662384d164b-combined-ca-bundle\") pod \"heat-db-sync-h4jks\" (UID: \"46aae5f7-366f-4467-bc67-e662384d164b\") " pod="openstack/heat-db-sync-h4jks"
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.661187 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxvmf\" (UniqueName: \"kubernetes.io/projected/46aae5f7-366f-4467-bc67-e662384d164b-kube-api-access-gxvmf\") pod \"heat-db-sync-h4jks\" (UID: \"46aae5f7-366f-4467-bc67-e662384d164b\") " pod="openstack/heat-db-sync-h4jks"
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.763073 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46aae5f7-366f-4467-bc67-e662384d164b-config-data\") pod \"heat-db-sync-h4jks\" (UID: \"46aae5f7-366f-4467-bc67-e662384d164b\") " pod="openstack/heat-db-sync-h4jks"
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.763168 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46aae5f7-366f-4467-bc67-e662384d164b-combined-ca-bundle\") pod \"heat-db-sync-h4jks\" (UID: \"46aae5f7-366f-4467-bc67-e662384d164b\") " pod="openstack/heat-db-sync-h4jks"
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.763250 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxvmf\" (UniqueName: \"kubernetes.io/projected/46aae5f7-366f-4467-bc67-e662384d164b-kube-api-access-gxvmf\") pod \"heat-db-sync-h4jks\" (UID: \"46aae5f7-366f-4467-bc67-e662384d164b\") " pod="openstack/heat-db-sync-h4jks"
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.774497 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46aae5f7-366f-4467-bc67-e662384d164b-combined-ca-bundle\") pod \"heat-db-sync-h4jks\" (UID: \"46aae5f7-366f-4467-bc67-e662384d164b\") " pod="openstack/heat-db-sync-h4jks"
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.783421 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46aae5f7-366f-4467-bc67-e662384d164b-config-data\") pod \"heat-db-sync-h4jks\" (UID: \"46aae5f7-366f-4467-bc67-e662384d164b\") " pod="openstack/heat-db-sync-h4jks"
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.790980 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxvmf\" (UniqueName: \"kubernetes.io/projected/46aae5f7-366f-4467-bc67-e662384d164b-kube-api-access-gxvmf\") pod \"heat-db-sync-h4jks\" (UID: \"46aae5f7-366f-4467-bc67-e662384d164b\") " pod="openstack/heat-db-sync-h4jks"
Oct 02 12:37:17 crc kubenswrapper[4766]: I1002 12:37:17.891724 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-h4jks"
Oct 02 12:37:18 crc kubenswrapper[4766]: I1002 12:37:18.468161 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-h4jks"]
Oct 02 12:37:18 crc kubenswrapper[4766]: W1002 12:37:18.483956 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46aae5f7_366f_4467_bc67_e662384d164b.slice/crio-6459631caf45b12052c496a9bd0e71e1158c0296b878ee4941bd0df1e4ffc42d WatchSource:0}: Error finding container 6459631caf45b12052c496a9bd0e71e1158c0296b878ee4941bd0df1e4ffc42d: Status 404 returned error can't find the container with id 6459631caf45b12052c496a9bd0e71e1158c0296b878ee4941bd0df1e4ffc42d
Oct 02 12:37:19 crc kubenswrapper[4766]: I1002 12:37:19.448567 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-h4jks" event={"ID":"46aae5f7-366f-4467-bc67-e662384d164b","Type":"ContainerStarted","Data":"6459631caf45b12052c496a9bd0e71e1158c0296b878ee4941bd0df1e4ffc42d"}
Oct 02 12:37:20 crc kubenswrapper[4766]: I1002 12:37:20.912914 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5d8dc9db9c-bxckd" podUID="adfb66d8-7e20-477f-adce-87cacf4382d5" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.114:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.114:8080: connect: connection refused"
Oct 02 12:37:24 crc kubenswrapper[4766]: I1002 12:37:24.432062 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:37:24 crc kubenswrapper[4766]: I1002 12:37:24.432634 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:37:24 crc kubenswrapper[4766]: I1002 12:37:24.432686 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx"
Oct 02 12:37:24 crc kubenswrapper[4766]: I1002 12:37:24.433639 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 12:37:24 crc kubenswrapper[4766]: I1002 12:37:24.433704 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" gracePeriod=600
Oct 02 12:37:25 crc kubenswrapper[4766]: E1002 12:37:25.301430 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 12:37:25 crc kubenswrapper[4766]: I1002 12:37:25.517704 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" exitCode=0
Oct 02 12:37:25 crc kubenswrapper[4766]: I1002 12:37:25.517780 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b"}
Oct 02 12:37:25 crc kubenswrapper[4766]: I1002 12:37:25.517826 4766 scope.go:117] "RemoveContainer" containerID="d94a111146843f4af084462f95b5d87bd572303698d956b1f0dbe284471be49c"
Oct 02 12:37:25 crc kubenswrapper[4766]: I1002 12:37:25.519109 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b"
Oct 02 12:37:25 crc kubenswrapper[4766]: E1002 12:37:25.519531 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 12:37:26 crc kubenswrapper[4766]: I1002 12:37:26.531402 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-h4jks" event={"ID":"46aae5f7-366f-4467-bc67-e662384d164b","Type":"ContainerStarted","Data":"14845fb6abdb72bc7cf94cb0758b8f225bb4fb3175f2ae8aec5c870ee4adf61e"}
Oct 02 12:37:26 crc kubenswrapper[4766]: I1002 12:37:26.556681 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-h4jks" podStartSLOduration=2.679200431 podStartE2EDuration="9.55665757s" podCreationTimestamp="2025-10-02 12:37:17 +0000 UTC" firstStartedPulling="2025-10-02 12:37:18.488554343 +0000 UTC m=+6353.431425287" lastFinishedPulling="2025-10-02 12:37:25.366011482 +0000 UTC m=+6360.308882426" observedRunningTime="2025-10-02 12:37:26.545282426 +0000 UTC m=+6361.488153390" watchObservedRunningTime="2025-10-02 12:37:26.55665757 +0000 UTC m=+6361.499528514"
Oct 02 12:37:28 crc kubenswrapper[4766]: I1002 12:37:28.557371 4766 generic.go:334] "Generic (PLEG): container finished" podID="46aae5f7-366f-4467-bc67-e662384d164b" containerID="14845fb6abdb72bc7cf94cb0758b8f225bb4fb3175f2ae8aec5c870ee4adf61e" exitCode=0
Oct 02 12:37:28 crc kubenswrapper[4766]: I1002 12:37:28.557477 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-h4jks" event={"ID":"46aae5f7-366f-4467-bc67-e662384d164b","Type":"ContainerDied","Data":"14845fb6abdb72bc7cf94cb0758b8f225bb4fb3175f2ae8aec5c870ee4adf61e"}
Oct 02 12:37:29 crc kubenswrapper[4766]: I1002 12:37:29.986658 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-h4jks"
Oct 02 12:37:30 crc kubenswrapper[4766]: I1002 12:37:30.082286 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxvmf\" (UniqueName: \"kubernetes.io/projected/46aae5f7-366f-4467-bc67-e662384d164b-kube-api-access-gxvmf\") pod \"46aae5f7-366f-4467-bc67-e662384d164b\" (UID: \"46aae5f7-366f-4467-bc67-e662384d164b\") "
Oct 02 12:37:30 crc kubenswrapper[4766]: I1002 12:37:30.082549 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46aae5f7-366f-4467-bc67-e662384d164b-config-data\") pod \"46aae5f7-366f-4467-bc67-e662384d164b\" (UID: \"46aae5f7-366f-4467-bc67-e662384d164b\") "
Oct 02 12:37:30 crc kubenswrapper[4766]: I1002 12:37:30.082668 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46aae5f7-366f-4467-bc67-e662384d164b-combined-ca-bundle\") pod \"46aae5f7-366f-4467-bc67-e662384d164b\" (UID: \"46aae5f7-366f-4467-bc67-e662384d164b\") "
Oct 02 12:37:30 crc kubenswrapper[4766]: I1002 12:37:30.090569 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46aae5f7-366f-4467-bc67-e662384d164b-kube-api-access-gxvmf" (OuterVolumeSpecName: "kube-api-access-gxvmf") pod "46aae5f7-366f-4467-bc67-e662384d164b" (UID: "46aae5f7-366f-4467-bc67-e662384d164b"). InnerVolumeSpecName "kube-api-access-gxvmf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:37:30 crc kubenswrapper[4766]: I1002 12:37:30.116069 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46aae5f7-366f-4467-bc67-e662384d164b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46aae5f7-366f-4467-bc67-e662384d164b" (UID: "46aae5f7-366f-4467-bc67-e662384d164b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:37:30 crc kubenswrapper[4766]: I1002 12:37:30.186748 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46aae5f7-366f-4467-bc67-e662384d164b-config-data" (OuterVolumeSpecName: "config-data") pod "46aae5f7-366f-4467-bc67-e662384d164b" (UID: "46aae5f7-366f-4467-bc67-e662384d164b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:37:30 crc kubenswrapper[4766]: I1002 12:37:30.187369 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46aae5f7-366f-4467-bc67-e662384d164b-config-data\") pod \"46aae5f7-366f-4467-bc67-e662384d164b\" (UID: \"46aae5f7-366f-4467-bc67-e662384d164b\") "
Oct 02 12:37:30 crc kubenswrapper[4766]: W1002 12:37:30.187619 4766 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/46aae5f7-366f-4467-bc67-e662384d164b/volumes/kubernetes.io~secret/config-data
Oct 02 12:37:30 crc kubenswrapper[4766]: I1002 12:37:30.187669 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46aae5f7-366f-4467-bc67-e662384d164b-config-data" (OuterVolumeSpecName: "config-data") pod "46aae5f7-366f-4467-bc67-e662384d164b" (UID: "46aae5f7-366f-4467-bc67-e662384d164b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:37:30 crc kubenswrapper[4766]: I1002 12:37:30.188636 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46aae5f7-366f-4467-bc67-e662384d164b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 12:37:30 crc kubenswrapper[4766]: I1002 12:37:30.188663 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxvmf\" (UniqueName: \"kubernetes.io/projected/46aae5f7-366f-4467-bc67-e662384d164b-kube-api-access-gxvmf\") on node \"crc\" DevicePath \"\""
Oct 02 12:37:30 crc kubenswrapper[4766]: I1002 12:37:30.188677 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46aae5f7-366f-4467-bc67-e662384d164b-config-data\") on node \"crc\" DevicePath \"\""
Oct 02 12:37:30 crc kubenswrapper[4766]: I1002 12:37:30.607543 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-h4jks" event={"ID":"46aae5f7-366f-4467-bc67-e662384d164b","Type":"ContainerDied","Data":"6459631caf45b12052c496a9bd0e71e1158c0296b878ee4941bd0df1e4ffc42d"}
Oct 02 12:37:30 crc kubenswrapper[4766]: I1002 12:37:30.607617 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6459631caf45b12052c496a9bd0e71e1158c0296b878ee4941bd0df1e4ffc42d"
Oct 02 12:37:30 crc kubenswrapper[4766]: I1002 12:37:30.610026 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-h4jks"
Oct 02 12:37:31 crc kubenswrapper[4766]: I1002 12:37:31.991742 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-67cdcf9c8-cwdb9"]
Oct 02 12:37:31 crc kubenswrapper[4766]: E1002 12:37:31.993050 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46aae5f7-366f-4467-bc67-e662384d164b" containerName="heat-db-sync"
Oct 02 12:37:31 crc kubenswrapper[4766]: I1002 12:37:31.993068 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="46aae5f7-366f-4467-bc67-e662384d164b" containerName="heat-db-sync"
Oct 02 12:37:31 crc kubenswrapper[4766]: I1002 12:37:31.993310 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="46aae5f7-366f-4467-bc67-e662384d164b" containerName="heat-db-sync"
Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.003168 4766 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/heat-engine-67cdcf9c8-cwdb9" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.010957 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-qf9gs" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.011197 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.011451 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.023656 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-67cdcf9c8-cwdb9"] Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.059273 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-78997d45f6-fcx4l"] Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.061517 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-78997d45f6-fcx4l" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.082134 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.106694 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-78997d45f6-fcx4l"] Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.139538 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2e256f4-b176-44f6-9cb0-d8019ae9cc2c-config-data-custom\") pod \"heat-engine-67cdcf9c8-cwdb9\" (UID: \"d2e256f4-b176-44f6-9cb0-d8019ae9cc2c\") " pod="openstack/heat-engine-67cdcf9c8-cwdb9" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.139612 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrqbp\" (UniqueName: \"kubernetes.io/projected/8defab53-42c6-4cff-b024-5014eae2d6f8-kube-api-access-wrqbp\") pod \"heat-api-78997d45f6-fcx4l\" (UID: \"8defab53-42c6-4cff-b024-5014eae2d6f8\") " pod="openstack/heat-api-78997d45f6-fcx4l" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.139682 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8defab53-42c6-4cff-b024-5014eae2d6f8-config-data\") pod \"heat-api-78997d45f6-fcx4l\" (UID: \"8defab53-42c6-4cff-b024-5014eae2d6f8\") " pod="openstack/heat-api-78997d45f6-fcx4l" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.139783 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2e256f4-b176-44f6-9cb0-d8019ae9cc2c-config-data\") pod \"heat-engine-67cdcf9c8-cwdb9\" (UID: \"d2e256f4-b176-44f6-9cb0-d8019ae9cc2c\") " pod="openstack/heat-engine-67cdcf9c8-cwdb9" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.139816 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8defab53-42c6-4cff-b024-5014eae2d6f8-combined-ca-bundle\") pod \"heat-api-78997d45f6-fcx4l\" (UID: \"8defab53-42c6-4cff-b024-5014eae2d6f8\") " pod="openstack/heat-api-78997d45f6-fcx4l" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.139856 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nq6c\" (UniqueName: \"kubernetes.io/projected/d2e256f4-b176-44f6-9cb0-d8019ae9cc2c-kube-api-access-2nq6c\") pod \"heat-engine-67cdcf9c8-cwdb9\" (UID: \"d2e256f4-b176-44f6-9cb0-d8019ae9cc2c\") " pod="openstack/heat-engine-67cdcf9c8-cwdb9" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.139896 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e256f4-b176-44f6-9cb0-d8019ae9cc2c-combined-ca-bundle\") pod \"heat-engine-67cdcf9c8-cwdb9\" (UID: \"d2e256f4-b176-44f6-9cb0-d8019ae9cc2c\") " pod="openstack/heat-engine-67cdcf9c8-cwdb9" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.139928 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8defab53-42c6-4cff-b024-5014eae2d6f8-config-data-custom\") pod \"heat-api-78997d45f6-fcx4l\" (UID: \"8defab53-42c6-4cff-b024-5014eae2d6f8\") " pod="openstack/heat-api-78997d45f6-fcx4l" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.163866 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-58bc7d788f-cz9w5"] Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.165955 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.171825 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.196572 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-58bc7d788f-cz9w5"] Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.242136 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2e256f4-b176-44f6-9cb0-d8019ae9cc2c-config-data\") pod \"heat-engine-67cdcf9c8-cwdb9\" (UID: \"d2e256f4-b176-44f6-9cb0-d8019ae9cc2c\") " pod="openstack/heat-engine-67cdcf9c8-cwdb9" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.242231 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8defab53-42c6-4cff-b024-5014eae2d6f8-combined-ca-bundle\") pod \"heat-api-78997d45f6-fcx4l\" (UID: \"8defab53-42c6-4cff-b024-5014eae2d6f8\") " pod="openstack/heat-api-78997d45f6-fcx4l" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.242273 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nq6c\" (UniqueName: \"kubernetes.io/projected/d2e256f4-b176-44f6-9cb0-d8019ae9cc2c-kube-api-access-2nq6c\") pod \"heat-engine-67cdcf9c8-cwdb9\" (UID: \"d2e256f4-b176-44f6-9cb0-d8019ae9cc2c\") " pod="openstack/heat-engine-67cdcf9c8-cwdb9" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.242321 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e256f4-b176-44f6-9cb0-d8019ae9cc2c-combined-ca-bundle\") pod \"heat-engine-67cdcf9c8-cwdb9\" (UID: \"d2e256f4-b176-44f6-9cb0-d8019ae9cc2c\") " pod="openstack/heat-engine-67cdcf9c8-cwdb9" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.242356 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/8defab53-42c6-4cff-b024-5014eae2d6f8-config-data-custom\") pod \"heat-api-78997d45f6-fcx4l\" (UID: \"8defab53-42c6-4cff-b024-5014eae2d6f8\") " pod="openstack/heat-api-78997d45f6-fcx4l" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.242395 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d049d96-d99d-4e97-84ed-0310c5d0b772-config-data\") pod \"heat-cfnapi-58bc7d788f-cz9w5\" (UID: \"1d049d96-d99d-4e97-84ed-0310c5d0b772\") " pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.242523 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2e256f4-b176-44f6-9cb0-d8019ae9cc2c-config-data-custom\") pod \"heat-engine-67cdcf9c8-cwdb9\" (UID: \"d2e256f4-b176-44f6-9cb0-d8019ae9cc2c\") " pod="openstack/heat-engine-67cdcf9c8-cwdb9" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.242561 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrqbp\" (UniqueName: \"kubernetes.io/projected/8defab53-42c6-4cff-b024-5014eae2d6f8-kube-api-access-wrqbp\") pod \"heat-api-78997d45f6-fcx4l\" (UID: \"8defab53-42c6-4cff-b024-5014eae2d6f8\") " pod="openstack/heat-api-78997d45f6-fcx4l" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.242592 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d049d96-d99d-4e97-84ed-0310c5d0b772-combined-ca-bundle\") pod \"heat-cfnapi-58bc7d788f-cz9w5\" (UID: \"1d049d96-d99d-4e97-84ed-0310c5d0b772\") " pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.242621 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95ftb\" (UniqueName: \"kubernetes.io/projected/1d049d96-d99d-4e97-84ed-0310c5d0b772-kube-api-access-95ftb\") pod \"heat-cfnapi-58bc7d788f-cz9w5\" (UID: \"1d049d96-d99d-4e97-84ed-0310c5d0b772\") " pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.242654 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d049d96-d99d-4e97-84ed-0310c5d0b772-config-data-custom\") pod \"heat-cfnapi-58bc7d788f-cz9w5\" (UID: \"1d049d96-d99d-4e97-84ed-0310c5d0b772\") " pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.242720 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8defab53-42c6-4cff-b024-5014eae2d6f8-config-data\") pod \"heat-api-78997d45f6-fcx4l\" (UID: \"8defab53-42c6-4cff-b024-5014eae2d6f8\") " pod="openstack/heat-api-78997d45f6-fcx4l" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.252118 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8defab53-42c6-4cff-b024-5014eae2d6f8-config-data-custom\") pod \"heat-api-78997d45f6-fcx4l\" (UID: \"8defab53-42c6-4cff-b024-5014eae2d6f8\") " pod="openstack/heat-api-78997d45f6-fcx4l" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.253421 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/8defab53-42c6-4cff-b024-5014eae2d6f8-config-data\") pod \"heat-api-78997d45f6-fcx4l\" (UID: \"8defab53-42c6-4cff-b024-5014eae2d6f8\") " pod="openstack/heat-api-78997d45f6-fcx4l" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.255685 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2e256f4-b176-44f6-9cb0-d8019ae9cc2c-combined-ca-bundle\") pod \"heat-engine-67cdcf9c8-cwdb9\" (UID: \"d2e256f4-b176-44f6-9cb0-d8019ae9cc2c\") " pod="openstack/heat-engine-67cdcf9c8-cwdb9" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.256431 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8defab53-42c6-4cff-b024-5014eae2d6f8-combined-ca-bundle\") pod \"heat-api-78997d45f6-fcx4l\" (UID: \"8defab53-42c6-4cff-b024-5014eae2d6f8\") " pod="openstack/heat-api-78997d45f6-fcx4l" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.257311 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2e256f4-b176-44f6-9cb0-d8019ae9cc2c-config-data\") pod \"heat-engine-67cdcf9c8-cwdb9\" (UID: \"d2e256f4-b176-44f6-9cb0-d8019ae9cc2c\") " pod="openstack/heat-engine-67cdcf9c8-cwdb9" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.257812 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2e256f4-b176-44f6-9cb0-d8019ae9cc2c-config-data-custom\") pod \"heat-engine-67cdcf9c8-cwdb9\" (UID: \"d2e256f4-b176-44f6-9cb0-d8019ae9cc2c\") " pod="openstack/heat-engine-67cdcf9c8-cwdb9" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.272102 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nq6c\" (UniqueName: \"kubernetes.io/projected/d2e256f4-b176-44f6-9cb0-d8019ae9cc2c-kube-api-access-2nq6c\") pod \"heat-engine-67cdcf9c8-cwdb9\" (UID: \"d2e256f4-b176-44f6-9cb0-d8019ae9cc2c\") " pod="openstack/heat-engine-67cdcf9c8-cwdb9" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.278829 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrqbp\" (UniqueName: \"kubernetes.io/projected/8defab53-42c6-4cff-b024-5014eae2d6f8-kube-api-access-wrqbp\") pod \"heat-api-78997d45f6-fcx4l\" (UID: \"8defab53-42c6-4cff-b024-5014eae2d6f8\") " pod="openstack/heat-api-78997d45f6-fcx4l" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.343432 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-67cdcf9c8-cwdb9" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.344591 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d049d96-d99d-4e97-84ed-0310c5d0b772-config-data\") pod \"heat-cfnapi-58bc7d788f-cz9w5\" (UID: \"1d049d96-d99d-4e97-84ed-0310c5d0b772\") " pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.344780 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d049d96-d99d-4e97-84ed-0310c5d0b772-combined-ca-bundle\") pod \"heat-cfnapi-58bc7d788f-cz9w5\" (UID: \"1d049d96-d99d-4e97-84ed-0310c5d0b772\") " pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.344874 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95ftb\" (UniqueName: \"kubernetes.io/projected/1d049d96-d99d-4e97-84ed-0310c5d0b772-kube-api-access-95ftb\") pod \"heat-cfnapi-58bc7d788f-cz9w5\" (UID: \"1d049d96-d99d-4e97-84ed-0310c5d0b772\") " pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.344986 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d049d96-d99d-4e97-84ed-0310c5d0b772-config-data-custom\") pod \"heat-cfnapi-58bc7d788f-cz9w5\" (UID: \"1d049d96-d99d-4e97-84ed-0310c5d0b772\") " pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.349841 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1d049d96-d99d-4e97-84ed-0310c5d0b772-config-data-custom\") pod \"heat-cfnapi-58bc7d788f-cz9w5\" (UID: \"1d049d96-d99d-4e97-84ed-0310c5d0b772\") " pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.351363 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d049d96-d99d-4e97-84ed-0310c5d0b772-combined-ca-bundle\") pod \"heat-cfnapi-58bc7d788f-cz9w5\" (UID: \"1d049d96-d99d-4e97-84ed-0310c5d0b772\") " pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.351457 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d049d96-d99d-4e97-84ed-0310c5d0b772-config-data\") pod \"heat-cfnapi-58bc7d788f-cz9w5\" (UID: \"1d049d96-d99d-4e97-84ed-0310c5d0b772\") " pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.368084 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95ftb\" (UniqueName: \"kubernetes.io/projected/1d049d96-d99d-4e97-84ed-0310c5d0b772-kube-api-access-95ftb\") pod \"heat-cfnapi-58bc7d788f-cz9w5\" (UID: \"1d049d96-d99d-4e97-84ed-0310c5d0b772\") " pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.413337 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-78997d45f6-fcx4l" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.493232 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" Oct 02 12:37:32 crc kubenswrapper[4766]: I1002 12:37:32.980590 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-67cdcf9c8-cwdb9"] Oct 02 12:37:33 crc kubenswrapper[4766]: I1002 12:37:33.143312 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-78997d45f6-fcx4l"] Oct 02 12:37:33 crc kubenswrapper[4766]: I1002 12:37:33.243320 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-58bc7d788f-cz9w5"] Oct 02 12:37:33 crc kubenswrapper[4766]: W1002 12:37:33.246345 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d049d96_d99d_4e97_84ed_0310c5d0b772.slice/crio-ab6574515ec0d4813bb447d8f6dc1aa2ba7027247710c4f06e71aab6e583dde8 WatchSource:0}: Error finding container ab6574515ec0d4813bb447d8f6dc1aa2ba7027247710c4f06e71aab6e583dde8: Status 404 returned error can't find the container with id ab6574515ec0d4813bb447d8f6dc1aa2ba7027247710c4f06e71aab6e583dde8 Oct 02 12:37:33 crc kubenswrapper[4766]: I1002 12:37:33.289287 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5d8dc9db9c-bxckd" Oct 02 12:37:33 crc kubenswrapper[4766]: I1002 12:37:33.678600 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" event={"ID":"1d049d96-d99d-4e97-84ed-0310c5d0b772","Type":"ContainerStarted","Data":"ab6574515ec0d4813bb447d8f6dc1aa2ba7027247710c4f06e71aab6e583dde8"} Oct 02 12:37:33 crc kubenswrapper[4766]: I1002 12:37:33.681001 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-67cdcf9c8-cwdb9" event={"ID":"d2e256f4-b176-44f6-9cb0-d8019ae9cc2c","Type":"ContainerStarted","Data":"f748f8d130f46ee9b88177c06a7f27041d0c3b95fb8cbe038a6837b613fb1f86"} Oct 02 12:37:33 crc kubenswrapper[4766]: I1002 12:37:33.681055 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-67cdcf9c8-cwdb9" event={"ID":"d2e256f4-b176-44f6-9cb0-d8019ae9cc2c","Type":"ContainerStarted","Data":"ba488ba11a7a87ee7c2ffa287d6f8c6fd93f8d94982ec1dc631b5060d7033aaf"} Oct 02 12:37:33 crc kubenswrapper[4766]: I1002 12:37:33.683122 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-67cdcf9c8-cwdb9" Oct 02 12:37:33 crc kubenswrapper[4766]: I1002 12:37:33.684825 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-78997d45f6-fcx4l" event={"ID":"8defab53-42c6-4cff-b024-5014eae2d6f8","Type":"ContainerStarted","Data":"144d9123d83a338266146a97e5a4310d0ebc11e0b1fe5b647219bd419b27f04f"} Oct 02 12:37:33 crc kubenswrapper[4766]: I1002 12:37:33.710811 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-67cdcf9c8-cwdb9" podStartSLOduration=2.710780821 podStartE2EDuration="2.710780821s" podCreationTimestamp="2025-10-02 12:37:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:37:33.704118867 +0000 UTC m=+6368.646989811" watchObservedRunningTime="2025-10-02 12:37:33.710780821 +0000 UTC m=+6368.653651775" Oct 02 12:37:35 crc kubenswrapper[4766]: I1002 12:37:35.286682 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5d8dc9db9c-bxckd" Oct 02 12:37:35 crc kubenswrapper[4766]: I1002 12:37:35.379002 4766 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/horizon-7d68765fcc-88cb5"] Oct 02 12:37:35 crc kubenswrapper[4766]: I1002 12:37:35.379724 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7d68765fcc-88cb5" podUID="3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" containerName="horizon-log" containerID="cri-o://b81881333fddbc48e419a202f15a835cf46d38d3517e10c56e69fb16145956ef" gracePeriod=30 Oct 02 12:37:35 crc kubenswrapper[4766]: I1002 12:37:35.379869 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7d68765fcc-88cb5" podUID="3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" containerName="horizon" containerID="cri-o://c64bf4995641dab91c56fda7ae96d849253e4ef453c3f5da7ca6e68cb9aa52be" gracePeriod=30 Oct 02 12:37:35 crc kubenswrapper[4766]: I1002 12:37:35.709963 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" event={"ID":"1d049d96-d99d-4e97-84ed-0310c5d0b772","Type":"ContainerStarted","Data":"1d8ff0ec78d656f5b3a45d84d8690d3e1e6df3a06e5959f9eaef7e289113c7fd"} Oct 02 12:37:35 crc kubenswrapper[4766]: I1002 12:37:35.710101 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" Oct 02 12:37:35 crc kubenswrapper[4766]: I1002 12:37:35.712147 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-78997d45f6-fcx4l" event={"ID":"8defab53-42c6-4cff-b024-5014eae2d6f8","Type":"ContainerStarted","Data":"88a65ee73e2f443c379b5ec3d5dc36f3b80a7c9c2504df4375c53153cc1f503c"} Oct 02 12:37:35 crc kubenswrapper[4766]: I1002 12:37:35.713498 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-78997d45f6-fcx4l" Oct 02 12:37:35 crc kubenswrapper[4766]: I1002 12:37:35.732116 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" podStartSLOduration=1.906607054 podStartE2EDuration="3.732086907s" podCreationTimestamp="2025-10-02 12:37:32 +0000 UTC" firstStartedPulling="2025-10-02 12:37:33.25060413 +0000 UTC m=+6368.193475074" lastFinishedPulling="2025-10-02 12:37:35.076083983 +0000 UTC m=+6370.018954927" observedRunningTime="2025-10-02 12:37:35.725898418 +0000 UTC m=+6370.668769372" watchObservedRunningTime="2025-10-02 12:37:35.732086907 +0000 UTC m=+6370.674957851" Oct 02 12:37:35 crc kubenswrapper[4766]: I1002 12:37:35.765247 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-78997d45f6-fcx4l" podStartSLOduration=1.852826661 podStartE2EDuration="3.765214888s" podCreationTimestamp="2025-10-02 12:37:32 +0000 UTC" firstStartedPulling="2025-10-02 12:37:33.16035341 +0000 UTC m=+6368.103224354" lastFinishedPulling="2025-10-02 12:37:35.072741637 +0000 UTC m=+6370.015612581" observedRunningTime="2025-10-02 12:37:35.756140907 +0000 UTC m=+6370.699011851" watchObservedRunningTime="2025-10-02 12:37:35.765214888 +0000 UTC m=+6370.708085832" Oct 02 12:37:35 crc kubenswrapper[4766]: I1002 12:37:35.881695 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:37:35 crc kubenswrapper[4766]: E1002 12:37:35.882038 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:37:37 crc kubenswrapper[4766]: I1002 12:37:37.873334 4766 scope.go:117] "RemoveContainer" containerID="e2dfcc63279915f3be5966486516deb094039a020bcfab8070301cfb54de06fa" Oct 02 12:37:37 crc kubenswrapper[4766]: I1002 12:37:37.921073 4766 scope.go:117] "RemoveContainer" containerID="7c17d62cda7bf4b1b7ddfd1834208a92c7c0261c32935b07bb68922cbbebe55d" Oct 02 12:37:37 crc kubenswrapper[4766]: I1002 12:37:37.959955 4766 scope.go:117] "RemoveContainer" containerID="6231307ebda306c68c9cb6a164ed7bd47bc48a6c406d1a0669db5a8e63bc22b4" Oct 02 12:37:38 crc kubenswrapper[4766]: I1002 12:37:38.761376 4766 generic.go:334] "Generic (PLEG): container finished" podID="3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" containerID="c64bf4995641dab91c56fda7ae96d849253e4ef453c3f5da7ca6e68cb9aa52be" exitCode=0 Oct 02 12:37:38 crc kubenswrapper[4766]: I1002 12:37:38.761425 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d68765fcc-88cb5" event={"ID":"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9","Type":"ContainerDied","Data":"c64bf4995641dab91c56fda7ae96d849253e4ef453c3f5da7ca6e68cb9aa52be"} Oct 02 12:37:41 crc kubenswrapper[4766]: I1002 12:37:41.030839 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7d68765fcc-88cb5" podUID="3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Oct 02 12:37:44 crc kubenswrapper[4766]: I1002 12:37:44.067684 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-78997d45f6-fcx4l" Oct 02 12:37:44 crc kubenswrapper[4766]: I1002 12:37:44.116695 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-58bc7d788f-cz9w5" Oct 02 12:37:48 crc kubenswrapper[4766]: I1002 12:37:48.882191 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:37:48 crc kubenswrapper[4766]: E1002 12:37:48.883109 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:37:51 crc kubenswrapper[4766]: I1002 12:37:51.031122 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7d68765fcc-88cb5" podUID="3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Oct 02 12:37:52 crc kubenswrapper[4766]: I1002 12:37:52.387356 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-67cdcf9c8-cwdb9" Oct 02 12:37:59 crc kubenswrapper[4766]: I1002 12:37:59.881755 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:37:59 crc kubenswrapper[4766]: E1002 12:37:59.882986 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:38:01 crc kubenswrapper[4766]: I1002 12:38:01.031355 4766 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7d68765fcc-88cb5" podUID="3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.109:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.109:8080: connect: connection refused" Oct 02 12:38:01 crc kubenswrapper[4766]: I1002 12:38:01.031556 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7d68765fcc-88cb5" Oct 02 12:38:02 crc kubenswrapper[4766]: I1002 12:38:02.086005 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4"] Oct 02 12:38:02 crc kubenswrapper[4766]: I1002 12:38:02.090205 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" Oct 02 12:38:02 crc kubenswrapper[4766]: I1002 12:38:02.092612 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 02 12:38:02 crc kubenswrapper[4766]: I1002 12:38:02.097911 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4"] Oct 02 12:38:02 crc kubenswrapper[4766]: I1002 12:38:02.246271 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcqsb\" (UniqueName: \"kubernetes.io/projected/403ad43d-bdf9-4c87-ad12-313410089de3-kube-api-access-vcqsb\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4\" (UID: \"403ad43d-bdf9-4c87-ad12-313410089de3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" Oct 02 12:38:02 crc kubenswrapper[4766]: I1002 12:38:02.246381 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/403ad43d-bdf9-4c87-ad12-313410089de3-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4\" (UID: \"403ad43d-bdf9-4c87-ad12-313410089de3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" Oct 02 12:38:02 crc kubenswrapper[4766]: I1002 12:38:02.246452 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/403ad43d-bdf9-4c87-ad12-313410089de3-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4\" (UID: \"403ad43d-bdf9-4c87-ad12-313410089de3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" Oct 02 12:38:02 crc kubenswrapper[4766]: I1002 12:38:02.351427 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcqsb\" (UniqueName: \"kubernetes.io/projected/403ad43d-bdf9-4c87-ad12-313410089de3-kube-api-access-vcqsb\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4\" (UID: 
\"403ad43d-bdf9-4c87-ad12-313410089de3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" Oct 02 12:38:02 crc kubenswrapper[4766]: I1002 12:38:02.351538 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/403ad43d-bdf9-4c87-ad12-313410089de3-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4\" (UID: \"403ad43d-bdf9-4c87-ad12-313410089de3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" Oct 02 12:38:02 crc kubenswrapper[4766]: I1002 12:38:02.351568 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/403ad43d-bdf9-4c87-ad12-313410089de3-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4\" (UID: \"403ad43d-bdf9-4c87-ad12-313410089de3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" Oct 02 12:38:02 crc kubenswrapper[4766]: I1002 12:38:02.352228 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/403ad43d-bdf9-4c87-ad12-313410089de3-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4\" (UID: \"403ad43d-bdf9-4c87-ad12-313410089de3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" Oct 02 12:38:02 crc kubenswrapper[4766]: I1002 12:38:02.352533 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/403ad43d-bdf9-4c87-ad12-313410089de3-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4\" (UID: \"403ad43d-bdf9-4c87-ad12-313410089de3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" Oct 02 12:38:02 crc kubenswrapper[4766]: I1002 12:38:02.393335 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcqsb\" (UniqueName: \"kubernetes.io/projected/403ad43d-bdf9-4c87-ad12-313410089de3-kube-api-access-vcqsb\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4\" (UID: \"403ad43d-bdf9-4c87-ad12-313410089de3\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" Oct 02 12:38:02 crc kubenswrapper[4766]: I1002 12:38:02.432817 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" Oct 02 12:38:02 crc kubenswrapper[4766]: I1002 12:38:02.963124 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4"] Oct 02 12:38:03 crc kubenswrapper[4766]: I1002 12:38:03.086341 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" event={"ID":"403ad43d-bdf9-4c87-ad12-313410089de3","Type":"ContainerStarted","Data":"f8e09c541dc0e7bec1fe7a23fe40fbc29fd27b8f0e62f21d3f38aedbac81988a"} Oct 02 12:38:04 crc kubenswrapper[4766]: I1002 12:38:04.101866 4766 generic.go:334] "Generic (PLEG): container finished" podID="403ad43d-bdf9-4c87-ad12-313410089de3" containerID="1025fba89235da4c65c344215a9e88b44990d27a91e3366433982589808b2164" exitCode=0 Oct 02 12:38:04 crc kubenswrapper[4766]: I1002 12:38:04.101950 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" event={"ID":"403ad43d-bdf9-4c87-ad12-313410089de3","Type":"ContainerDied","Data":"1025fba89235da4c65c344215a9e88b44990d27a91e3366433982589808b2164"} Oct 02 12:38:04 crc kubenswrapper[4766]: I1002 12:38:04.432795 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5nlqp"] Oct 02 12:38:04 crc kubenswrapper[4766]: I1002 12:38:04.437243 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:04 crc kubenswrapper[4766]: I1002 12:38:04.454975 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5nlqp"] Oct 02 12:38:04 crc kubenswrapper[4766]: I1002 12:38:04.510680 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42843af7-d9fb-49af-8d10-2333c67b2b01-utilities\") pod \"redhat-operators-5nlqp\" (UID: \"42843af7-d9fb-49af-8d10-2333c67b2b01\") " pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:04 crc kubenswrapper[4766]: I1002 12:38:04.510886 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njb5v\" (UniqueName: \"kubernetes.io/projected/42843af7-d9fb-49af-8d10-2333c67b2b01-kube-api-access-njb5v\") pod \"redhat-operators-5nlqp\" (UID: \"42843af7-d9fb-49af-8d10-2333c67b2b01\") " pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:04 crc kubenswrapper[4766]: I1002 12:38:04.510994 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42843af7-d9fb-49af-8d10-2333c67b2b01-catalog-content\") pod \"redhat-operators-5nlqp\" (UID: \"42843af7-d9fb-49af-8d10-2333c67b2b01\") " pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:04 crc kubenswrapper[4766]: I1002 12:38:04.614044 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njb5v\" (UniqueName: \"kubernetes.io/projected/42843af7-d9fb-49af-8d10-2333c67b2b01-kube-api-access-njb5v\") pod \"redhat-operators-5nlqp\" (UID: \"42843af7-d9fb-49af-8d10-2333c67b2b01\") " pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:04 crc kubenswrapper[4766]: I1002 12:38:04.614203 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42843af7-d9fb-49af-8d10-2333c67b2b01-catalog-content\") pod \"redhat-operators-5nlqp\" (UID: \"42843af7-d9fb-49af-8d10-2333c67b2b01\") " pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:04 crc kubenswrapper[4766]: I1002 12:38:04.614322 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42843af7-d9fb-49af-8d10-2333c67b2b01-utilities\") pod \"redhat-operators-5nlqp\" (UID: \"42843af7-d9fb-49af-8d10-2333c67b2b01\") " pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:04 crc kubenswrapper[4766]: I1002 12:38:04.614834 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42843af7-d9fb-49af-8d10-2333c67b2b01-catalog-content\") pod \"redhat-operators-5nlqp\" (UID: \"42843af7-d9fb-49af-8d10-2333c67b2b01\") " pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:04 crc kubenswrapper[4766]: I1002 12:38:04.614918 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42843af7-d9fb-49af-8d10-2333c67b2b01-utilities\") pod \"redhat-operators-5nlqp\" (UID: \"42843af7-d9fb-49af-8d10-2333c67b2b01\") " pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:04 crc kubenswrapper[4766]: I1002 12:38:04.637105 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njb5v\" (UniqueName: \"kubernetes.io/projected/42843af7-d9fb-49af-8d10-2333c67b2b01-kube-api-access-njb5v\") pod \"redhat-operators-5nlqp\" (UID: \"42843af7-d9fb-49af-8d10-2333c67b2b01\") " pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:04 crc kubenswrapper[4766]: I1002 12:38:04.767750 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:05 crc kubenswrapper[4766]: I1002 12:38:05.520386 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5nlqp"] Oct 02 12:38:05 crc kubenswrapper[4766]: W1002 12:38:05.582853 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42843af7_d9fb_49af_8d10_2333c67b2b01.slice/crio-c599120ba202fa141dc421f71ab86bf04c965c2be3074de9c3d9b1ce8df13ebf WatchSource:0}: Error finding container c599120ba202fa141dc421f71ab86bf04c965c2be3074de9c3d9b1ce8df13ebf: Status 404 returned error can't find the container with id c599120ba202fa141dc421f71ab86bf04c965c2be3074de9c3d9b1ce8df13ebf Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.133338 4766 generic.go:334] "Generic (PLEG): container finished" podID="42843af7-d9fb-49af-8d10-2333c67b2b01" containerID="ad05f8f0f912d674c03146ff3087d6dadf3ccd2dc31f97a41100d72b1c5144dd" exitCode=0 Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.133464 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nlqp" event={"ID":"42843af7-d9fb-49af-8d10-2333c67b2b01","Type":"ContainerDied","Data":"ad05f8f0f912d674c03146ff3087d6dadf3ccd2dc31f97a41100d72b1c5144dd"} Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.133857 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nlqp" event={"ID":"42843af7-d9fb-49af-8d10-2333c67b2b01","Type":"ContainerStarted","Data":"c599120ba202fa141dc421f71ab86bf04c965c2be3074de9c3d9b1ce8df13ebf"} Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.141461 4766 generic.go:334] "Generic (PLEG): container finished" podID="3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" containerID="b81881333fddbc48e419a202f15a835cf46d38d3517e10c56e69fb16145956ef" exitCode=137 Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.141564 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d68765fcc-88cb5" event={"ID":"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9","Type":"ContainerDied","Data":"b81881333fddbc48e419a202f15a835cf46d38d3517e10c56e69fb16145956ef"} Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.141603 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d68765fcc-88cb5" event={"ID":"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9","Type":"ContainerDied","Data":"07139954cd627d00a1dcb3a6700bf0a125f5f7a642506ab3e4611d35a463b95d"} Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.141616 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07139954cd627d00a1dcb3a6700bf0a125f5f7a642506ab3e4611d35a463b95d" Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.150337 4766 generic.go:334] "Generic (PLEG): container finished" podID="403ad43d-bdf9-4c87-ad12-313410089de3" containerID="7d59b982f0bbb3204457d180a6c1befd20d38bb3d37b30ce199e52ed3e1723db" exitCode=0 Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.150382 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" event={"ID":"403ad43d-bdf9-4c87-ad12-313410089de3","Type":"ContainerDied","Data":"7d59b982f0bbb3204457d180a6c1befd20d38bb3d37b30ce199e52ed3e1723db"} Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.191403 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d68765fcc-88cb5" Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.322401 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-config-data\") pod \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.322526 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-logs\") pod \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.322605 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-scripts\") pod \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.322653 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfxfz\" (UniqueName: \"kubernetes.io/projected/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-kube-api-access-tfxfz\") pod \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.323092 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-horizon-secret-key\") pod \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\" (UID: \"3f0c4c17-eef1-4c5c-ac86-453e6229c1d9\") " Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.323224 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-logs" (OuterVolumeSpecName: "logs") pod "3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" (UID: "3f0c4c17-eef1-4c5c-ac86-453e6229c1d9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.325728 4766 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-logs\") on node \"crc\" DevicePath \"\"" Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.333746 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" (UID: "3f0c4c17-eef1-4c5c-ac86-453e6229c1d9"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.333953 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-kube-api-access-tfxfz" (OuterVolumeSpecName: "kube-api-access-tfxfz") pod "3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" (UID: "3f0c4c17-eef1-4c5c-ac86-453e6229c1d9"). InnerVolumeSpecName "kube-api-access-tfxfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.365919 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-scripts" (OuterVolumeSpecName: "scripts") pod "3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" (UID: "3f0c4c17-eef1-4c5c-ac86-453e6229c1d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.366214 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-config-data" (OuterVolumeSpecName: "config-data") pod "3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" (UID: "3f0c4c17-eef1-4c5c-ac86-453e6229c1d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.426809 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.426866 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.426877 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfxfz\" (UniqueName: \"kubernetes.io/projected/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-kube-api-access-tfxfz\") on node \"crc\" DevicePath \"\"" Oct 02 12:38:06 crc kubenswrapper[4766]: I1002 12:38:06.426890 4766 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 02 12:38:07 crc kubenswrapper[4766]: I1002 12:38:07.163641 4766 generic.go:334] "Generic (PLEG): container finished" podID="403ad43d-bdf9-4c87-ad12-313410089de3" containerID="dac636fe563898f6d173d92bfe30ea0d2a661bfa0b6b4885d786c72194a89f7c" exitCode=0 Oct 02 12:38:07 crc kubenswrapper[4766]: I1002 12:38:07.164112 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7d68765fcc-88cb5" Oct 02 12:38:07 crc kubenswrapper[4766]: I1002 12:38:07.163782 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" event={"ID":"403ad43d-bdf9-4c87-ad12-313410089de3","Type":"ContainerDied","Data":"dac636fe563898f6d173d92bfe30ea0d2a661bfa0b6b4885d786c72194a89f7c"} Oct 02 12:38:07 crc kubenswrapper[4766]: I1002 12:38:07.295261 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d68765fcc-88cb5"] Oct 02 12:38:07 crc kubenswrapper[4766]: I1002 12:38:07.306347 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7d68765fcc-88cb5"] Oct 02 12:38:07 crc kubenswrapper[4766]: I1002 12:38:07.894738 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" path="/var/lib/kubelet/pods/3f0c4c17-eef1-4c5c-ac86-453e6229c1d9/volumes" Oct 02 12:38:08 crc kubenswrapper[4766]: I1002 12:38:08.181672 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nlqp" event={"ID":"42843af7-d9fb-49af-8d10-2333c67b2b01","Type":"ContainerStarted","Data":"335797aebb7125c15c64028bcbec0ae534a244189899effb9eb47eceabd6a19c"} Oct 02 12:38:08 crc kubenswrapper[4766]: I1002 12:38:08.673302 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" Oct 02 12:38:08 crc kubenswrapper[4766]: I1002 12:38:08.679855 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/403ad43d-bdf9-4c87-ad12-313410089de3-bundle\") pod \"403ad43d-bdf9-4c87-ad12-313410089de3\" (UID: \"403ad43d-bdf9-4c87-ad12-313410089de3\") " Oct 02 12:38:08 crc kubenswrapper[4766]: I1002 12:38:08.679926 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcqsb\" (UniqueName: \"kubernetes.io/projected/403ad43d-bdf9-4c87-ad12-313410089de3-kube-api-access-vcqsb\") pod \"403ad43d-bdf9-4c87-ad12-313410089de3\" (UID: \"403ad43d-bdf9-4c87-ad12-313410089de3\") " Oct 02 12:38:08 crc kubenswrapper[4766]: I1002 12:38:08.680106 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/403ad43d-bdf9-4c87-ad12-313410089de3-util\") pod \"403ad43d-bdf9-4c87-ad12-313410089de3\" (UID: \"403ad43d-bdf9-4c87-ad12-313410089de3\") " Oct 02 12:38:08 crc kubenswrapper[4766]: I1002 12:38:08.682696 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/403ad43d-bdf9-4c87-ad12-313410089de3-bundle" (OuterVolumeSpecName: "bundle") pod "403ad43d-bdf9-4c87-ad12-313410089de3" (UID: "403ad43d-bdf9-4c87-ad12-313410089de3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:38:08 crc kubenswrapper[4766]: I1002 12:38:08.695584 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/403ad43d-bdf9-4c87-ad12-313410089de3-util" (OuterVolumeSpecName: "util") pod "403ad43d-bdf9-4c87-ad12-313410089de3" (UID: "403ad43d-bdf9-4c87-ad12-313410089de3"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:38:08 crc kubenswrapper[4766]: I1002 12:38:08.696281 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/403ad43d-bdf9-4c87-ad12-313410089de3-kube-api-access-vcqsb" (OuterVolumeSpecName: "kube-api-access-vcqsb") pod "403ad43d-bdf9-4c87-ad12-313410089de3" (UID: "403ad43d-bdf9-4c87-ad12-313410089de3"). InnerVolumeSpecName "kube-api-access-vcqsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:38:08 crc kubenswrapper[4766]: I1002 12:38:08.783717 4766 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/403ad43d-bdf9-4c87-ad12-313410089de3-util\") on node \"crc\" DevicePath \"\"" Oct 02 12:38:08 crc kubenswrapper[4766]: I1002 12:38:08.783771 4766 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/403ad43d-bdf9-4c87-ad12-313410089de3-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:38:08 crc kubenswrapper[4766]: I1002 12:38:08.783782 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcqsb\" (UniqueName: \"kubernetes.io/projected/403ad43d-bdf9-4c87-ad12-313410089de3-kube-api-access-vcqsb\") on node \"crc\" DevicePath \"\"" Oct 02 12:38:09 crc kubenswrapper[4766]: I1002 12:38:09.203766 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" Oct 02 12:38:09 crc kubenswrapper[4766]: I1002 12:38:09.204080 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4" event={"ID":"403ad43d-bdf9-4c87-ad12-313410089de3","Type":"ContainerDied","Data":"f8e09c541dc0e7bec1fe7a23fe40fbc29fd27b8f0e62f21d3f38aedbac81988a"} Oct 02 12:38:09 crc kubenswrapper[4766]: I1002 12:38:09.204348 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8e09c541dc0e7bec1fe7a23fe40fbc29fd27b8f0e62f21d3f38aedbac81988a" Oct 02 12:38:09 crc kubenswrapper[4766]: I1002 12:38:09.210398 4766 generic.go:334] "Generic (PLEG): container finished" podID="42843af7-d9fb-49af-8d10-2333c67b2b01" containerID="335797aebb7125c15c64028bcbec0ae534a244189899effb9eb47eceabd6a19c" exitCode=0 Oct 02 12:38:09 crc kubenswrapper[4766]: I1002 12:38:09.210464 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nlqp" event={"ID":"42843af7-d9fb-49af-8d10-2333c67b2b01","Type":"ContainerDied","Data":"335797aebb7125c15c64028bcbec0ae534a244189899effb9eb47eceabd6a19c"} Oct 02 12:38:11 crc kubenswrapper[4766]: I1002 12:38:11.048116 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-v5nvm"] Oct 02 12:38:11 crc kubenswrapper[4766]: I1002 12:38:11.061083 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-2mjk4"] Oct 02 12:38:11 crc kubenswrapper[4766]: I1002 12:38:11.072054 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-v5nvm"] Oct 02 12:38:11 crc kubenswrapper[4766]: I1002 12:38:11.082582 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-2mjk4"] Oct 02 12:38:11 crc kubenswrapper[4766]: I1002 12:38:11.093470 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-6fx7r"] Oct 02 12:38:11 crc kubenswrapper[4766]: I1002 
12:38:11.102761 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-6fx7r"] Oct 02 12:38:11 crc kubenswrapper[4766]: I1002 12:38:11.239384 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nlqp" event={"ID":"42843af7-d9fb-49af-8d10-2333c67b2b01","Type":"ContainerStarted","Data":"d94e147673035530d65509563720a4c13bd2f43c4870f5cb1ffb3994deddd646"} Oct 02 12:38:11 crc kubenswrapper[4766]: I1002 12:38:11.279900 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5nlqp" podStartSLOduration=3.3536251950000002 podStartE2EDuration="7.279873359s" podCreationTimestamp="2025-10-02 12:38:04 +0000 UTC" firstStartedPulling="2025-10-02 12:38:06.144790314 +0000 UTC m=+6401.087661248" lastFinishedPulling="2025-10-02 12:38:10.071038458 +0000 UTC m=+6405.013909412" observedRunningTime="2025-10-02 12:38:11.263908358 +0000 UTC m=+6406.206779312" watchObservedRunningTime="2025-10-02 12:38:11.279873359 +0000 UTC m=+6406.222744303" Oct 02 12:38:11 crc kubenswrapper[4766]: I1002 12:38:11.900617 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49a751b6-3e73-4b99-8407-386e287fedf3" path="/var/lib/kubelet/pods/49a751b6-3e73-4b99-8407-386e287fedf3/volumes" Oct 02 12:38:11 crc kubenswrapper[4766]: I1002 12:38:11.902708 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9af48511-50fd-409c-a582-1473fc1776cf" path="/var/lib/kubelet/pods/9af48511-50fd-409c-a582-1473fc1776cf/volumes" Oct 02 12:38:11 crc kubenswrapper[4766]: I1002 12:38:11.903825 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7b8b672-6952-4faf-9a27-61e46932d297" path="/var/lib/kubelet/pods/c7b8b672-6952-4faf-9a27-61e46932d297/volumes" Oct 02 12:38:14 crc kubenswrapper[4766]: I1002 12:38:14.768995 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:14 crc kubenswrapper[4766]: I1002 12:38:14.769880 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:14 crc kubenswrapper[4766]: I1002 12:38:14.881658 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:38:14 crc kubenswrapper[4766]: E1002 12:38:14.882028 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:38:15 crc kubenswrapper[4766]: I1002 12:38:15.831486 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5nlqp" podUID="42843af7-d9fb-49af-8d10-2333c67b2b01" containerName="registry-server" probeResult="failure" output=< Oct 02 12:38:15 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Oct 02 12:38:15 crc kubenswrapper[4766]: > Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.131397 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-vwl9k"] Oct 02 12:38:20 crc kubenswrapper[4766]: E1002 12:38:20.132633 4766 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403ad43d-bdf9-4c87-ad12-313410089de3" containerName="util" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.132647 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="403ad43d-bdf9-4c87-ad12-313410089de3" containerName="util" Oct 02 12:38:20 crc kubenswrapper[4766]: E1002 12:38:20.132655 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403ad43d-bdf9-4c87-ad12-313410089de3" containerName="pull" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.132662 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="403ad43d-bdf9-4c87-ad12-313410089de3" containerName="pull" Oct 02 12:38:20 crc kubenswrapper[4766]: E1002 12:38:20.132671 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403ad43d-bdf9-4c87-ad12-313410089de3" containerName="extract" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.132678 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="403ad43d-bdf9-4c87-ad12-313410089de3" containerName="extract" Oct 02 12:38:20 crc kubenswrapper[4766]: E1002 12:38:20.132699 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" containerName="horizon-log" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.132705 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" containerName="horizon-log" Oct 02 12:38:20 crc kubenswrapper[4766]: E1002 12:38:20.132719 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" containerName="horizon" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.132726 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" containerName="horizon" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.132917 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" containerName="horizon-log" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.132934 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f0c4c17-eef1-4c5c-ac86-453e6229c1d9" containerName="horizon" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.132944 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="403ad43d-bdf9-4c87-ad12-313410089de3" containerName="extract" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.133745 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vwl9k" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.137657 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.137970 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.138160 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-rvrq2" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.148430 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-vwl9k"] Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.288096 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7s5h\" (UniqueName: \"kubernetes.io/projected/453f7915-e705-47b3-9078-a7704846c9e0-kube-api-access-x7s5h\") pod \"obo-prometheus-operator-7c8cf85677-vwl9k\" (UID: \"453f7915-e705-47b3-9078-a7704846c9e0\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vwl9k" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.289302 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt"] Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.291191 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.292703 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-x2f88" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.294761 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.308561 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4"] Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.310203 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.333087 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt"] Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.356593 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4"] Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.390603 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7s5h\" (UniqueName: \"kubernetes.io/projected/453f7915-e705-47b3-9078-a7704846c9e0-kube-api-access-x7s5h\") pod \"obo-prometheus-operator-7c8cf85677-vwl9k\" (UID: \"453f7915-e705-47b3-9078-a7704846c9e0\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vwl9k" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.390681 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c30e4de-fe7f-4f68-a633-bdf33112ef8e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt\" (UID: \"3c30e4de-fe7f-4f68-a633-bdf33112ef8e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.390807 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c30e4de-fe7f-4f68-a633-bdf33112ef8e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt\" (UID: \"3c30e4de-fe7f-4f68-a633-bdf33112ef8e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.418785 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7s5h\" (UniqueName: \"kubernetes.io/projected/453f7915-e705-47b3-9078-a7704846c9e0-kube-api-access-x7s5h\") pod \"obo-prometheus-operator-7c8cf85677-vwl9k\" (UID: \"453f7915-e705-47b3-9078-a7704846c9e0\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vwl9k" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.475887 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-h8hd7"] Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.477522 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-h8hd7" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.480272 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-j9qqv" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.480740 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.493024 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c30e4de-fe7f-4f68-a633-bdf33112ef8e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt\" (UID: \"3c30e4de-fe7f-4f68-a633-bdf33112ef8e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.493154 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0ae4fe5d-375d-407c-9386-a99585c786ad-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4\" (UID: \"0ae4fe5d-375d-407c-9386-a99585c786ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.493200 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c30e4de-fe7f-4f68-a633-bdf33112ef8e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt\" (UID: \"3c30e4de-fe7f-4f68-a633-bdf33112ef8e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.493223 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0ae4fe5d-375d-407c-9386-a99585c786ad-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4\" (UID: \"0ae4fe5d-375d-407c-9386-a99585c786ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.498660 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3c30e4de-fe7f-4f68-a633-bdf33112ef8e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt\" (UID: \"3c30e4de-fe7f-4f68-a633-bdf33112ef8e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.505108 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3c30e4de-fe7f-4f68-a633-bdf33112ef8e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt\" (UID: \"3c30e4de-fe7f-4f68-a633-bdf33112ef8e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.508809 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-h8hd7"] Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.514642 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vwl9k" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.596882 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a50a36bc-6db8-4a5f-91c7-b01539ceaad9-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-h8hd7\" (UID: \"a50a36bc-6db8-4a5f-91c7-b01539ceaad9\") " pod="openshift-operators/observability-operator-cc5f78dfc-h8hd7" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.597142 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0ae4fe5d-375d-407c-9386-a99585c786ad-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4\" (UID: \"0ae4fe5d-375d-407c-9386-a99585c786ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.597219 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0ae4fe5d-375d-407c-9386-a99585c786ad-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4\" (UID: \"0ae4fe5d-375d-407c-9386-a99585c786ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.597293 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds6wh\" (UniqueName: \"kubernetes.io/projected/a50a36bc-6db8-4a5f-91c7-b01539ceaad9-kube-api-access-ds6wh\") pod \"observability-operator-cc5f78dfc-h8hd7\" (UID: \"a50a36bc-6db8-4a5f-91c7-b01539ceaad9\") " pod="openshift-operators/observability-operator-cc5f78dfc-h8hd7" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.601897 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0ae4fe5d-375d-407c-9386-a99585c786ad-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4\" (UID: \"0ae4fe5d-375d-407c-9386-a99585c786ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.603533 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0ae4fe5d-375d-407c-9386-a99585c786ad-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4\" (UID: \"0ae4fe5d-375d-407c-9386-a99585c786ad\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.615319 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.645462 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.677603 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-2nb7v"] Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.679106 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-2nb7v" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.683921 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-t4ddv" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.688910 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-2nb7v"] Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.699417 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a50a36bc-6db8-4a5f-91c7-b01539ceaad9-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-h8hd7\" (UID: \"a50a36bc-6db8-4a5f-91c7-b01539ceaad9\") " pod="openshift-operators/observability-operator-cc5f78dfc-h8hd7" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.699548 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds6wh\" (UniqueName: \"kubernetes.io/projected/a50a36bc-6db8-4a5f-91c7-b01539ceaad9-kube-api-access-ds6wh\") pod \"observability-operator-cc5f78dfc-h8hd7\" (UID: \"a50a36bc-6db8-4a5f-91c7-b01539ceaad9\") " pod="openshift-operators/observability-operator-cc5f78dfc-h8hd7" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.705256 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a50a36bc-6db8-4a5f-91c7-b01539ceaad9-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-h8hd7\" (UID: \"a50a36bc-6db8-4a5f-91c7-b01539ceaad9\") " pod="openshift-operators/observability-operator-cc5f78dfc-h8hd7" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.748777 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds6wh\" (UniqueName: \"kubernetes.io/projected/a50a36bc-6db8-4a5f-91c7-b01539ceaad9-kube-api-access-ds6wh\") pod \"observability-operator-cc5f78dfc-h8hd7\" (UID: \"a50a36bc-6db8-4a5f-91c7-b01539ceaad9\") " pod="openshift-operators/observability-operator-cc5f78dfc-h8hd7" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.808826 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fd4ad3e-17b0-498d-8710-949d10cb68fd-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-2nb7v\" (UID: \"8fd4ad3e-17b0-498d-8710-949d10cb68fd\") " pod="openshift-operators/perses-operator-54bc95c9fb-2nb7v" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.809274 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwlmv\" (UniqueName: \"kubernetes.io/projected/8fd4ad3e-17b0-498d-8710-949d10cb68fd-kube-api-access-hwlmv\") pod \"perses-operator-54bc95c9fb-2nb7v\" (UID: \"8fd4ad3e-17b0-498d-8710-949d10cb68fd\") " pod="openshift-operators/perses-operator-54bc95c9fb-2nb7v" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.911311 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fd4ad3e-17b0-498d-8710-949d10cb68fd-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-2nb7v\" (UID: \"8fd4ad3e-17b0-498d-8710-949d10cb68fd\") " pod="openshift-operators/perses-operator-54bc95c9fb-2nb7v" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.911396 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwlmv\" (UniqueName: \"kubernetes.io/projected/8fd4ad3e-17b0-498d-8710-949d10cb68fd-kube-api-access-hwlmv\") pod \"perses-operator-54bc95c9fb-2nb7v\" (UID: \"8fd4ad3e-17b0-498d-8710-949d10cb68fd\") " pod="openshift-operators/perses-operator-54bc95c9fb-2nb7v" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.912696 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/8fd4ad3e-17b0-498d-8710-949d10cb68fd-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-2nb7v\" (UID: \"8fd4ad3e-17b0-498d-8710-949d10cb68fd\") " pod="openshift-operators/perses-operator-54bc95c9fb-2nb7v" Oct 02 12:38:20 crc kubenswrapper[4766]: I1002 12:38:20.955244 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwlmv\" (UniqueName: \"kubernetes.io/projected/8fd4ad3e-17b0-498d-8710-949d10cb68fd-kube-api-access-hwlmv\") pod \"perses-operator-54bc95c9fb-2nb7v\" (UID: \"8fd4ad3e-17b0-498d-8710-949d10cb68fd\") " pod="openshift-operators/perses-operator-54bc95c9fb-2nb7v" Oct 02 12:38:21 crc kubenswrapper[4766]: I1002 12:38:21.001568 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-h8hd7" Oct 02 12:38:21 crc kubenswrapper[4766]: I1002 12:38:21.087414 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-af55-account-create-cqr4r"] Oct 02 12:38:21 crc kubenswrapper[4766]: I1002 12:38:21.102064 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-af55-account-create-cqr4r"] Oct 02 12:38:21 crc kubenswrapper[4766]: I1002 12:38:21.127302 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-2nb7v" Oct 02 12:38:21 crc kubenswrapper[4766]: I1002 12:38:21.368964 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-vwl9k"] Oct 02 12:38:21 crc kubenswrapper[4766]: W1002 12:38:21.375669 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod453f7915_e705_47b3_9078_a7704846c9e0.slice/crio-c05156448b22d58b29755b8c73ca96f903d9c936c6bf588db0f35dd0953707b7 WatchSource:0}: Error finding container c05156448b22d58b29755b8c73ca96f903d9c936c6bf588db0f35dd0953707b7: Status 404 returned error can't find the container with id c05156448b22d58b29755b8c73ca96f903d9c936c6bf588db0f35dd0953707b7 Oct 02 12:38:21 crc kubenswrapper[4766]: W1002 12:38:21.516294 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ae4fe5d_375d_407c_9386_a99585c786ad.slice/crio-f7106de766f93a36458978b57f8ef55ea7586b521dc871b70452b5f67189e4f9 WatchSource:0}: Error finding container f7106de766f93a36458978b57f8ef55ea7586b521dc871b70452b5f67189e4f9: Status 404 returned error can't find the container with id f7106de766f93a36458978b57f8ef55ea7586b521dc871b70452b5f67189e4f9 Oct 02 12:38:21 crc kubenswrapper[4766]: W1002 12:38:21.520604 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c30e4de_fe7f_4f68_a633_bdf33112ef8e.slice/crio-cfe277757ca13f465bdeaf5367ba3af7c554d681e28de819688e649dccc277be WatchSource:0}: Error finding container cfe277757ca13f465bdeaf5367ba3af7c554d681e28de819688e649dccc277be: Status 404 returned error can't find the container with id cfe277757ca13f465bdeaf5367ba3af7c554d681e28de819688e649dccc277be Oct 02 12:38:21 crc kubenswrapper[4766]: I1002 12:38:21.542941 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4"] Oct 02 12:38:21 crc kubenswrapper[4766]: I1002 12:38:21.562971 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt"] Oct 02 12:38:21 crc kubenswrapper[4766]: I1002 12:38:21.823010 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-h8hd7"] Oct 02 12:38:21 crc kubenswrapper[4766]: W1002 12:38:21.827063 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda50a36bc_6db8_4a5f_91c7_b01539ceaad9.slice/crio-821c1923ab2067d0f18002f2e5a8acb1b911415264252f27c326c39178747cb2 WatchSource:0}: Error finding container 821c1923ab2067d0f18002f2e5a8acb1b911415264252f27c326c39178747cb2: Status 404 returned error can't find the container with id 821c1923ab2067d0f18002f2e5a8acb1b911415264252f27c326c39178747cb2 Oct 02 12:38:21 crc kubenswrapper[4766]: I1002 12:38:21.900295 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="678b54d8-3db1-45ac-b2bc-85bbf874b697" path="/var/lib/kubelet/pods/678b54d8-3db1-45ac-b2bc-85bbf874b697/volumes" Oct 02 12:38:21 crc kubenswrapper[4766]: I1002 12:38:21.954116 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-2nb7v"] Oct 02 12:38:21 crc kubenswrapper[4766]: W1002 12:38:21.965251 4766 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fd4ad3e_17b0_498d_8710_949d10cb68fd.slice/crio-2cb7d36d8da380eb30db6318a40417637cb87b96d8f0166e40328df421066c46 WatchSource:0}: Error finding container 2cb7d36d8da380eb30db6318a40417637cb87b96d8f0166e40328df421066c46: Status 404 returned error can't find the container with id 2cb7d36d8da380eb30db6318a40417637cb87b96d8f0166e40328df421066c46 Oct 02 12:38:22 crc kubenswrapper[4766]: I1002 12:38:22.035924 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9040-account-create-gvdld"] Oct 02 12:38:22 crc kubenswrapper[4766]: I1002 12:38:22.047918 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9040-account-create-gvdld"] Oct 02 12:38:22 crc kubenswrapper[4766]: I1002 12:38:22.415637 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vwl9k" event={"ID":"453f7915-e705-47b3-9078-a7704846c9e0","Type":"ContainerStarted","Data":"c05156448b22d58b29755b8c73ca96f903d9c936c6bf588db0f35dd0953707b7"} Oct 02 12:38:22 crc kubenswrapper[4766]: I1002 12:38:22.417528 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4" event={"ID":"0ae4fe5d-375d-407c-9386-a99585c786ad","Type":"ContainerStarted","Data":"f7106de766f93a36458978b57f8ef55ea7586b521dc871b70452b5f67189e4f9"} Oct 02 12:38:22 crc kubenswrapper[4766]: I1002 12:38:22.419405 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-2nb7v" event={"ID":"8fd4ad3e-17b0-498d-8710-949d10cb68fd","Type":"ContainerStarted","Data":"2cb7d36d8da380eb30db6318a40417637cb87b96d8f0166e40328df421066c46"} Oct 02 12:38:22 crc kubenswrapper[4766]: I1002 12:38:22.422676 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-h8hd7" event={"ID":"a50a36bc-6db8-4a5f-91c7-b01539ceaad9","Type":"ContainerStarted","Data":"821c1923ab2067d0f18002f2e5a8acb1b911415264252f27c326c39178747cb2"} Oct 02 12:38:22 crc kubenswrapper[4766]: I1002 12:38:22.428621 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt" event={"ID":"3c30e4de-fe7f-4f68-a633-bdf33112ef8e","Type":"ContainerStarted","Data":"cfe277757ca13f465bdeaf5367ba3af7c554d681e28de819688e649dccc277be"} Oct 02 12:38:23 crc kubenswrapper[4766]: I1002 12:38:23.050910 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-f7c6-account-create-ksxjs"] Oct 02 12:38:23 crc kubenswrapper[4766]: I1002 12:38:23.067553 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-f7c6-account-create-ksxjs"] Oct 02 12:38:23 crc kubenswrapper[4766]: I1002 12:38:23.911723 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0951018c-a2da-4b4e-9855-5f8c01e996d3" path="/var/lib/kubelet/pods/0951018c-a2da-4b4e-9855-5f8c01e996d3/volumes" Oct 02 12:38:23 crc kubenswrapper[4766]: I1002 12:38:23.912714 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c777bfde-3b16-4cd1-865a-910c60753ba3" path="/var/lib/kubelet/pods/c777bfde-3b16-4cd1-865a-910c60753ba3/volumes" Oct 02 12:38:24 crc kubenswrapper[4766]: I1002 12:38:24.856286 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:24 crc 
kubenswrapper[4766]: I1002 12:38:24.923591 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:25 crc kubenswrapper[4766]: I1002 12:38:25.109149 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5nlqp"] Oct 02 12:38:26 crc kubenswrapper[4766]: I1002 12:38:26.478399 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5nlqp" podUID="42843af7-d9fb-49af-8d10-2333c67b2b01" containerName="registry-server" containerID="cri-o://d94e147673035530d65509563720a4c13bd2f43c4870f5cb1ffb3994deddd646" gracePeriod=2 Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.426029 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.516744 4766 generic.go:334] "Generic (PLEG): container finished" podID="42843af7-d9fb-49af-8d10-2333c67b2b01" containerID="d94e147673035530d65509563720a4c13bd2f43c4870f5cb1ffb3994deddd646" exitCode=0 Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.517129 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nlqp" event={"ID":"42843af7-d9fb-49af-8d10-2333c67b2b01","Type":"ContainerDied","Data":"d94e147673035530d65509563720a4c13bd2f43c4870f5cb1ffb3994deddd646"} Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.517166 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5nlqp" event={"ID":"42843af7-d9fb-49af-8d10-2333c67b2b01","Type":"ContainerDied","Data":"c599120ba202fa141dc421f71ab86bf04c965c2be3074de9c3d9b1ce8df13ebf"} Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.517187 4766 scope.go:117] "RemoveContainer" containerID="d94e147673035530d65509563720a4c13bd2f43c4870f5cb1ffb3994deddd646" Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.517359 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5nlqp" Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.524903 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4" event={"ID":"0ae4fe5d-375d-407c-9386-a99585c786ad","Type":"ContainerStarted","Data":"c103fbe4d3dc07de45b5d07f663bd897c12f985c62d340f4694ab3699f5668f8"} Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.528351 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-2nb7v" event={"ID":"8fd4ad3e-17b0-498d-8710-949d10cb68fd","Type":"ContainerStarted","Data":"7bdc14b9f927d48f70291ac6b556871f2da8baa70eb530189196b6bea6dff66d"} Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.529636 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-2nb7v" Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.540096 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt" event={"ID":"3c30e4de-fe7f-4f68-a633-bdf33112ef8e","Type":"ContainerStarted","Data":"6968b2f3918cd3391a7a0cc805b7e04f8e45f60674e9e34ef9d47b6c322d020a"} Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.561179 4766 scope.go:117] "RemoveContainer" containerID="335797aebb7125c15c64028bcbec0ae534a244189899effb9eb47eceabd6a19c" Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.562790 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4" podStartSLOduration=2.4044008 podStartE2EDuration="7.562757262s" podCreationTimestamp="2025-10-02 12:38:20 +0000 UTC" firstStartedPulling="2025-10-02 12:38:21.522757019 +0000 UTC m=+6416.465627963" lastFinishedPulling="2025-10-02 12:38:26.681113481 +0000 UTC m=+6421.623984425" observedRunningTime="2025-10-02 12:38:27.552867525 +0000 UTC m=+6422.495738469" watchObservedRunningTime="2025-10-02 12:38:27.562757262 +0000 UTC m=+6422.505628206" Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.604310 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-2nb7v" podStartSLOduration=2.9118236939999997 podStartE2EDuration="7.604289652s" podCreationTimestamp="2025-10-02 12:38:20 +0000 UTC" firstStartedPulling="2025-10-02 12:38:21.968288921 +0000 UTC m=+6416.911159875" lastFinishedPulling="2025-10-02 12:38:26.660754889 +0000 UTC m=+6421.603625833" observedRunningTime="2025-10-02 12:38:27.572876956 +0000 UTC m=+6422.515747890" watchObservedRunningTime="2025-10-02 12:38:27.604289652 +0000 UTC m=+6422.547160596" Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.612067 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42843af7-d9fb-49af-8d10-2333c67b2b01-catalog-content\") pod \"42843af7-d9fb-49af-8d10-2333c67b2b01\" (UID: \"42843af7-d9fb-49af-8d10-2333c67b2b01\") " Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.612145 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42843af7-d9fb-49af-8d10-2333c67b2b01-utilities\") pod \"42843af7-d9fb-49af-8d10-2333c67b2b01\" (UID: \"42843af7-d9fb-49af-8d10-2333c67b2b01\") " Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.616653 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42843af7-d9fb-49af-8d10-2333c67b2b01-utilities" (OuterVolumeSpecName: "utilities") pod "42843af7-d9fb-49af-8d10-2333c67b2b01" (UID: "42843af7-d9fb-49af-8d10-2333c67b2b01"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.640680 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt" podStartSLOduration=2.459550647 podStartE2EDuration="7.640635846s" podCreationTimestamp="2025-10-02 12:38:20 +0000 UTC" firstStartedPulling="2025-10-02 12:38:21.542838353 +0000 UTC m=+6416.485709297" lastFinishedPulling="2025-10-02 12:38:26.723923552 +0000 UTC m=+6421.666794496" observedRunningTime="2025-10-02 12:38:27.600369977 +0000 UTC m=+6422.543240921" watchObservedRunningTime="2025-10-02 12:38:27.640635846 +0000 UTC m=+6422.583506790"
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.643733 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42843af7-d9fb-49af-8d10-2333c67b2b01-kube-api-access-njb5v" (OuterVolumeSpecName: "kube-api-access-njb5v") pod "42843af7-d9fb-49af-8d10-2333c67b2b01" (UID: "42843af7-d9fb-49af-8d10-2333c67b2b01"). InnerVolumeSpecName "kube-api-access-njb5v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.649877 4766 scope.go:117] "RemoveContainer" containerID="ad05f8f0f912d674c03146ff3087d6dadf3ccd2dc31f97a41100d72b1c5144dd"
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.715790 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42843af7-d9fb-49af-8d10-2333c67b2b01-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.715828 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njb5v\" (UniqueName: \"kubernetes.io/projected/42843af7-d9fb-49af-8d10-2333c67b2b01-kube-api-access-njb5v\") on node \"crc\" DevicePath \"\""
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.716451 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42843af7-d9fb-49af-8d10-2333c67b2b01-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42843af7-d9fb-49af-8d10-2333c67b2b01" (UID: "42843af7-d9fb-49af-8d10-2333c67b2b01"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.753154 4766 scope.go:117] "RemoveContainer" containerID="d94e147673035530d65509563720a4c13bd2f43c4870f5cb1ffb3994deddd646"
Oct 02 12:38:27 crc kubenswrapper[4766]: E1002 12:38:27.753898 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d94e147673035530d65509563720a4c13bd2f43c4870f5cb1ffb3994deddd646\": container with ID starting with d94e147673035530d65509563720a4c13bd2f43c4870f5cb1ffb3994deddd646 not found: ID does not exist" containerID="d94e147673035530d65509563720a4c13bd2f43c4870f5cb1ffb3994deddd646"
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.753958 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d94e147673035530d65509563720a4c13bd2f43c4870f5cb1ffb3994deddd646"} err="failed to get container status \"d94e147673035530d65509563720a4c13bd2f43c4870f5cb1ffb3994deddd646\": rpc error: code = NotFound desc = could not find container \"d94e147673035530d65509563720a4c13bd2f43c4870f5cb1ffb3994deddd646\": container with ID starting with d94e147673035530d65509563720a4c13bd2f43c4870f5cb1ffb3994deddd646 not found: ID does not exist"
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.753987 4766 scope.go:117] "RemoveContainer" containerID="335797aebb7125c15c64028bcbec0ae534a244189899effb9eb47eceabd6a19c"
Oct 02 12:38:27 crc kubenswrapper[4766]: E1002 12:38:27.754606 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"335797aebb7125c15c64028bcbec0ae534a244189899effb9eb47eceabd6a19c\": container with ID starting with 335797aebb7125c15c64028bcbec0ae534a244189899effb9eb47eceabd6a19c not found: ID does not exist" containerID="335797aebb7125c15c64028bcbec0ae534a244189899effb9eb47eceabd6a19c"
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.754684 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"335797aebb7125c15c64028bcbec0ae534a244189899effb9eb47eceabd6a19c"} err="failed to get container status \"335797aebb7125c15c64028bcbec0ae534a244189899effb9eb47eceabd6a19c\": rpc error: code = NotFound desc = could not find container \"335797aebb7125c15c64028bcbec0ae534a244189899effb9eb47eceabd6a19c\": container with ID starting with 335797aebb7125c15c64028bcbec0ae534a244189899effb9eb47eceabd6a19c not found: ID does not exist"
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.754737 4766 scope.go:117] "RemoveContainer" containerID="ad05f8f0f912d674c03146ff3087d6dadf3ccd2dc31f97a41100d72b1c5144dd"
Oct 02 12:38:27 crc kubenswrapper[4766]: E1002 12:38:27.755270 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad05f8f0f912d674c03146ff3087d6dadf3ccd2dc31f97a41100d72b1c5144dd\": container with ID starting with ad05f8f0f912d674c03146ff3087d6dadf3ccd2dc31f97a41100d72b1c5144dd not found: ID does not exist" containerID="ad05f8f0f912d674c03146ff3087d6dadf3ccd2dc31f97a41100d72b1c5144dd"
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.755305 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad05f8f0f912d674c03146ff3087d6dadf3ccd2dc31f97a41100d72b1c5144dd"} err="failed to get container status \"ad05f8f0f912d674c03146ff3087d6dadf3ccd2dc31f97a41100d72b1c5144dd\": rpc error: code = NotFound desc = could not find container \"ad05f8f0f912d674c03146ff3087d6dadf3ccd2dc31f97a41100d72b1c5144dd\": container with ID starting with ad05f8f0f912d674c03146ff3087d6dadf3ccd2dc31f97a41100d72b1c5144dd not found: ID does not exist"
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.818598 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42843af7-d9fb-49af-8d10-2333c67b2b01-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.874939 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5nlqp"]
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.888455 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b"
Oct 02 12:38:27 crc kubenswrapper[4766]: E1002 12:38:27.890422 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 12:38:27 crc kubenswrapper[4766]: I1002 12:38:27.913886 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5nlqp"]
Oct 02 12:38:29 crc kubenswrapper[4766]: I1002 12:38:29.900190 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42843af7-d9fb-49af-8d10-2333c67b2b01" path="/var/lib/kubelet/pods/42843af7-d9fb-49af-8d10-2333c67b2b01/volumes"
Oct 02 12:38:33 crc kubenswrapper[4766]: I1002 12:38:33.048979 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q6f4d"]
Oct 02 12:38:33 crc kubenswrapper[4766]: I1002 12:38:33.059001 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-q6f4d"]
Oct 02 12:38:33 crc kubenswrapper[4766]: I1002 12:38:33.629600 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vwl9k" event={"ID":"453f7915-e705-47b3-9078-a7704846c9e0","Type":"ContainerStarted","Data":"9a84d60f384a709b06d0e1b4799d15ae940d325a96c18e3f2c9a8f382f63dd53"}
Oct 02 12:38:33 crc kubenswrapper[4766]: I1002 12:38:33.632271 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-h8hd7" event={"ID":"a50a36bc-6db8-4a5f-91c7-b01539ceaad9","Type":"ContainerStarted","Data":"fa625d3ce6dd94e7a20489be470ba10ed813361c99b8af5bb6f6ab249efdbf7c"}
Oct 02 12:38:33 crc kubenswrapper[4766]: I1002 12:38:33.632520 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-h8hd7"
Oct 02 12:38:33 crc kubenswrapper[4766]: I1002 12:38:33.641248 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-h8hd7"
Oct 02 12:38:33 crc kubenswrapper[4766]: I1002 12:38:33.667820 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-vwl9k" podStartSLOduration=2.657509798 podStartE2EDuration="13.667794749s" podCreationTimestamp="2025-10-02 12:38:20 +0000 UTC" firstStartedPulling="2025-10-02 12:38:21.378333482 +0000 UTC m=+6416.321204426" lastFinishedPulling="2025-10-02 12:38:32.388618433 +0000 UTC m=+6427.331489377" observedRunningTime="2025-10-02 12:38:33.65816013 +0000 UTC m=+6428.601031074" watchObservedRunningTime="2025-10-02 12:38:33.667794749 +0000 UTC m=+6428.610665693"
Oct 02 12:38:33 crc kubenswrapper[4766]: I1002 12:38:33.707146 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-h8hd7" podStartSLOduration=3.123125333 podStartE2EDuration="13.707121368s" podCreationTimestamp="2025-10-02 12:38:20 +0000 UTC" firstStartedPulling="2025-10-02 12:38:21.831237481 +0000 UTC m=+6416.774108425" lastFinishedPulling="2025-10-02 12:38:32.415233516 +0000 UTC m=+6427.358104460" observedRunningTime="2025-10-02 12:38:33.700445914 +0000 UTC m=+6428.643316858" watchObservedRunningTime="2025-10-02 12:38:33.707121368 +0000 UTC m=+6428.649992302"
Oct 02 12:38:33 crc kubenswrapper[4766]: I1002 12:38:33.893791 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38bacb80-6110-42c7-9923-95cebf834ef0" path="/var/lib/kubelet/pods/38bacb80-6110-42c7-9923-95cebf834ef0/volumes"
Oct 02 12:38:38 crc kubenswrapper[4766]: I1002 12:38:38.111370 4766 scope.go:117] "RemoveContainer" containerID="c05146ce2eae86aef129757af364beaf1fefdc130f2c1d0519bd2d957200167f"
Oct 02 12:38:38 crc kubenswrapper[4766]: I1002 12:38:38.195671 4766 scope.go:117] "RemoveContainer" containerID="06e3ff123212eadc9f53fbf6cb66440a39104196b857c43e66473f5c01dc9c49"
Oct 02 12:38:38 crc kubenswrapper[4766]: I1002 12:38:38.235254 4766 scope.go:117] "RemoveContainer" containerID="34e4dafcd44066e3a05bfea9b089fa167ca5a4f7edec9a3214eea6ed2895a760"
Oct 02 12:38:38 crc kubenswrapper[4766]: I1002 12:38:38.313670 4766 scope.go:117] "RemoveContainer" containerID="f46e2edf4c77ade8fba78bc0c90166d0c016f8c08e45ecd4e15018bbcc0f4a54"
Oct 02 12:38:38 crc kubenswrapper[4766]: I1002 12:38:38.366995 4766 scope.go:117] "RemoveContainer" containerID="e4db6605cb0b202a8d0c73618d6578faebc86fb1850d5eb2de0a0fe56fe93bc3"
Oct 02 12:38:38 crc kubenswrapper[4766]: I1002 12:38:38.407788 4766 scope.go:117] "RemoveContainer" containerID="8268d2fecded465798967c832ac0112aa25fffc8c1535460c33d506ba344a174"
Oct 02 12:38:38 crc kubenswrapper[4766]: I1002 12:38:38.461622 4766 scope.go:117] "RemoveContainer" containerID="2ba49f0fd3fa684c911d44383a7d98531f824028e5391c320ae4f9bd49a1150c"
Oct 02 12:38:41 crc kubenswrapper[4766]: I1002 12:38:41.132277 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-2nb7v"
Oct 02 12:38:41 crc kubenswrapper[4766]: I1002 12:38:41.881491 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b"
Oct 02 12:38:41 crc kubenswrapper[4766]: E1002 12:38:41.882195 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.770356 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.771074 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="34f0b55d-1a54-413b-8131-71b5816277c4" containerName="openstackclient" containerID="cri-o://a1a03a1fcf20e0b259fd245736580119ca7cb59fac730795381c654f08dfdbca" gracePeriod=2
a grace period" pod="openstack/openstackclient" podUID="34f0b55d-1a54-413b-8131-71b5816277c4" containerName="openstackclient" containerID="cri-o://a1a03a1fcf20e0b259fd245736580119ca7cb59fac730795381c654f08dfdbca" gracePeriod=2 Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.779653 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.827735 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 02 12:38:43 crc kubenswrapper[4766]: E1002 12:38:43.828247 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f0b55d-1a54-413b-8131-71b5816277c4" containerName="openstackclient" Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.828269 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f0b55d-1a54-413b-8131-71b5816277c4" containerName="openstackclient" Oct 02 12:38:43 crc kubenswrapper[4766]: E1002 12:38:43.828285 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42843af7-d9fb-49af-8d10-2333c67b2b01" containerName="registry-server" Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.828293 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="42843af7-d9fb-49af-8d10-2333c67b2b01" containerName="registry-server" Oct 02 12:38:43 crc kubenswrapper[4766]: E1002 12:38:43.828311 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42843af7-d9fb-49af-8d10-2333c67b2b01" containerName="extract-utilities" Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.828317 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="42843af7-d9fb-49af-8d10-2333c67b2b01" containerName="extract-utilities" Oct 02 12:38:43 crc kubenswrapper[4766]: E1002 12:38:43.828329 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42843af7-d9fb-49af-8d10-2333c67b2b01" containerName="extract-content" Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.828336 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="42843af7-d9fb-49af-8d10-2333c67b2b01" containerName="extract-content" Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.828569 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="34f0b55d-1a54-413b-8131-71b5816277c4" containerName="openstackclient" Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.828587 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="42843af7-d9fb-49af-8d10-2333c67b2b01" containerName="registry-server" Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.829487 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.844329 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.903877 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="34f0b55d-1a54-413b-8131-71b5816277c4" podUID="1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc" Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.933073 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 02 12:38:43 crc kubenswrapper[4766]: E1002 12:38:43.933874 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-kqhqh openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/openstackclient" podUID="1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc" Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.965461 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqhqh\" (UniqueName: \"kubernetes.io/projected/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-kube-api-access-kqhqh\") pod \"openstackclient\" (UID: \"1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc\") " pod="openstack/openstackclient" Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.965806 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-openstack-config-secret\") pod \"openstackclient\" (UID: \"1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc\") " pod="openstack/openstackclient" Oct 02 12:38:43 crc kubenswrapper[4766]: I1002 12:38:43.965841 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-openstack-config\") pod \"openstackclient\" (UID: \"1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc\") " pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.028496 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.089016 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqhqh\" (UniqueName: \"kubernetes.io/projected/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-kube-api-access-kqhqh\") pod \"openstackclient\" (UID: \"1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc\") " pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.089129 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-openstack-config-secret\") pod \"openstackclient\" (UID: \"1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc\") " pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.089161 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-openstack-config\") pod \"openstackclient\" (UID: \"1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc\") " pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: E1002 12:38:44.093118 4766 projected.go:194] Error preparing data 
for projected volume kube-api-access-kqhqh for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc) does not match the UID in record. The object might have been deleted and then recreated Oct 02 12:38:44 crc kubenswrapper[4766]: E1002 12:38:44.093240 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-kube-api-access-kqhqh podName:1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc nodeName:}" failed. No retries permitted until 2025-10-02 12:38:44.593210354 +0000 UTC m=+6439.536081298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-kqhqh" (UniqueName: "kubernetes.io/projected/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-kube-api-access-kqhqh") pod "openstackclient" (UID: "1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc) does not match the UID in record. The object might have been deleted and then recreated Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.102215 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-openstack-config\") pod \"openstackclient\" (UID: \"1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc\") " pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.105583 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.109606 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-openstack-config-secret\") pod \"openstackclient\" (UID: \"1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc\") " pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.110770 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.122348 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.191095 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg2tc\" (UniqueName: \"kubernetes.io/projected/fa24afd2-9499-490a-bc1a-8261b74d0dae-kube-api-access-lg2tc\") pod \"openstackclient\" (UID: \"fa24afd2-9499-490a-bc1a-8261b74d0dae\") " pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.191598 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fa24afd2-9499-490a-bc1a-8261b74d0dae-openstack-config\") pod \"openstackclient\" (UID: \"fa24afd2-9499-490a-bc1a-8261b74d0dae\") " pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.191738 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fa24afd2-9499-490a-bc1a-8261b74d0dae-openstack-config-secret\") pod \"openstackclient\" (UID: \"fa24afd2-9499-490a-bc1a-8261b74d0dae\") " pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.230166 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.239009 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.247317 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-5f7f7" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.252389 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.293868 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fa24afd2-9499-490a-bc1a-8261b74d0dae-openstack-config\") pod \"openstackclient\" (UID: \"fa24afd2-9499-490a-bc1a-8261b74d0dae\") " pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.294031 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fa24afd2-9499-490a-bc1a-8261b74d0dae-openstack-config-secret\") pod \"openstackclient\" (UID: \"fa24afd2-9499-490a-bc1a-8261b74d0dae\") " pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.294106 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t52c7\" (UniqueName: \"kubernetes.io/projected/9d25d74d-1b30-4bb2-8cc2-401004b37624-kube-api-access-t52c7\") pod \"kube-state-metrics-0\" (UID: \"9d25d74d-1b30-4bb2-8cc2-401004b37624\") " pod="openstack/kube-state-metrics-0" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.294147 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg2tc\" (UniqueName: \"kubernetes.io/projected/fa24afd2-9499-490a-bc1a-8261b74d0dae-kube-api-access-lg2tc\") pod \"openstackclient\" (UID: \"fa24afd2-9499-490a-bc1a-8261b74d0dae\") " 
pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.295359 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fa24afd2-9499-490a-bc1a-8261b74d0dae-openstack-config\") pod \"openstackclient\" (UID: \"fa24afd2-9499-490a-bc1a-8261b74d0dae\") " pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.304562 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fa24afd2-9499-490a-bc1a-8261b74d0dae-openstack-config-secret\") pod \"openstackclient\" (UID: \"fa24afd2-9499-490a-bc1a-8261b74d0dae\") " pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.358608 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg2tc\" (UniqueName: \"kubernetes.io/projected/fa24afd2-9499-490a-bc1a-8261b74d0dae-kube-api-access-lg2tc\") pod \"openstackclient\" (UID: \"fa24afd2-9499-490a-bc1a-8261b74d0dae\") " pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.395958 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t52c7\" (UniqueName: \"kubernetes.io/projected/9d25d74d-1b30-4bb2-8cc2-401004b37624-kube-api-access-t52c7\") pod \"kube-state-metrics-0\" (UID: \"9d25d74d-1b30-4bb2-8cc2-401004b37624\") " pod="openstack/kube-state-metrics-0" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.454973 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t52c7\" (UniqueName: \"kubernetes.io/projected/9d25d74d-1b30-4bb2-8cc2-401004b37624-kube-api-access-t52c7\") pod \"kube-state-metrics-0\" (UID: \"9d25d74d-1b30-4bb2-8cc2-401004b37624\") " pod="openstack/kube-state-metrics-0" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.486465 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.566546 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.601405 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqhqh\" (UniqueName: \"kubernetes.io/projected/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-kube-api-access-kqhqh\") pod \"openstackclient\" (UID: \"1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc\") " pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: E1002 12:38:44.606893 4766 projected.go:194] Error preparing data for projected volume kube-api-access-kqhqh for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc) does not match the UID in record. The object might have been deleted and then recreated Oct 02 12:38:44 crc kubenswrapper[4766]: E1002 12:38:44.606998 4766 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-kube-api-access-kqhqh podName:1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc nodeName:}" failed. No retries permitted until 2025-10-02 12:38:45.60697083 +0000 UTC m=+6440.549841774 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kqhqh" (UniqueName: "kubernetes.io/projected/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-kube-api-access-kqhqh") pod "openstackclient" (UID: "1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc) does not match the UID in record. The object might have been deleted and then recreated Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.763602 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.777795 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc" podUID="fa24afd2-9499-490a-bc1a-8261b74d0dae" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.784853 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.809095 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-openstack-config\") pod \"1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc\" (UID: \"1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc\") " Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.809163 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-openstack-config-secret\") pod \"1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc\" (UID: \"1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc\") " Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.809763 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqhqh\" (UniqueName: \"kubernetes.io/projected/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-kube-api-access-kqhqh\") on node \"crc\" DevicePath \"\"" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.812001 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc" (UID: "1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.816868 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc" (UID: "1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.914242 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:38:44 crc kubenswrapper[4766]: I1002 12:38:44.914287 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.022300 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.034418 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.043418 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.043562 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-q6l2j" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.043647 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.043561 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.095272 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.233406 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dc85d4fb-980e-4303-8850-ec3da21b43b2-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.233478 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dc85d4fb-980e-4303-8850-ec3da21b43b2-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.233595 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dc85d4fb-980e-4303-8850-ec3da21b43b2-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.233643 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/dc85d4fb-980e-4303-8850-ec3da21b43b2-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 
12:38:45.233694 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dc85d4fb-980e-4303-8850-ec3da21b43b2-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.233730 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn2nb\" (UniqueName: \"kubernetes.io/projected/dc85d4fb-980e-4303-8850-ec3da21b43b2-kube-api-access-kn2nb\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.335865 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dc85d4fb-980e-4303-8850-ec3da21b43b2-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.335924 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dc85d4fb-980e-4303-8850-ec3da21b43b2-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.336022 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dc85d4fb-980e-4303-8850-ec3da21b43b2-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.336063 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/dc85d4fb-980e-4303-8850-ec3da21b43b2-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.336098 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dc85d4fb-980e-4303-8850-ec3da21b43b2-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.336134 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn2nb\" (UniqueName: \"kubernetes.io/projected/dc85d4fb-980e-4303-8850-ec3da21b43b2-kube-api-access-kn2nb\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.342032 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/dc85d4fb-980e-4303-8850-ec3da21b43b2-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" 
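(The MountVolume.SetUp failures for kube-api-access-kqhqh logged above at 12:38:44 are the bound-token check at work: a kube-api-access-* projected volume is backed by a TokenRequest whose BoundObjectRef pins the pod's name and UID, so once openstackclient was deleted and recreated, the API server refused requests still carrying the old UID 1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc until the kubelet synced the replacement pod fa24afd2-9499-490a-bc1a-8261b74d0dae; the nestedpendingoperations errors show the retry backoff doubling from 500ms to 1s. A minimal client-go sketch of the same request follows, with the namespace, service account, pod name, and UID copied from the log and a reachable kubeconfig assumed; this is illustrative only, not the kubelet's actual code path.)

package main

import (
	"context"
	"fmt"

	authnv1 "k8s.io/api/authentication/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/types"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes a standard ~/.kube/config pointing at the cluster.
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	expiry := int64(3600)
	req := &authnv1.TokenRequest{
		Spec: authnv1.TokenRequestSpec{
			Audiences:         []string{"https://kubernetes.default.svc"},
			ExpirationSeconds: &expiry,
			// The UID must match the live pod object; a stale UID from a
			// deleted-and-recreated pod is rejected, which is the
			// "does not match the UID in record" error in the log.
			BoundObjectRef: &authnv1.BoundObjectReference{
				Kind:       "Pod",
				APIVersion: "v1",
				Name:       "openstackclient",
				UID:        types.UID("1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc"),
			},
		},
	}
	tok, err := client.CoreV1().
		ServiceAccounts("openstack").
		CreateToken(context.TODO(), "openstackclient-openstackclient", req, metav1.CreateOptions{})
	if err != nil {
		fmt.Println("token request rejected:", err) // UID mismatch surfaces here
		return
	}
	fmt.Println("token expires:", tok.Status.ExpirationTimestamp)
}

(With the recreated pod's live UID the request succeeds; with the stale UID it fails with the same forbidden message seen above, which is why the kubelet's 500ms and 1s retries only clear once the new pod object is observed.)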
Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.347305 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dc85d4fb-980e-4303-8850-ec3da21b43b2-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.351616 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dc85d4fb-980e-4303-8850-ec3da21b43b2-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.365319 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dc85d4fb-980e-4303-8850-ec3da21b43b2-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.369411 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dc85d4fb-980e-4303-8850-ec3da21b43b2-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.425317 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn2nb\" (UniqueName: \"kubernetes.io/projected/dc85d4fb-980e-4303-8850-ec3da21b43b2-kube-api-access-kn2nb\") pod \"alertmanager-metric-storage-0\" (UID: \"dc85d4fb-980e-4303-8850-ec3da21b43b2\") " pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.525783 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.606027 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.611446 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.660592 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.672077 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.681241 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.681604 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-tj28t" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.681714 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.681912 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.682051 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.691107 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.691834 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.704914 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.755189 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-43846a8c-24c3-4401-b197-cb99f6993eda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43846a8c-24c3-4401-b197-cb99f6993eda\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.755264 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.755318 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.755372 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.755408 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.755484 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.755565 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-config\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.755607 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh2qk\" (UniqueName: \"kubernetes.io/projected/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-kube-api-access-hh2qk\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.849360 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9d25d74d-1b30-4bb2-8cc2-401004b37624","Type":"ContainerStarted","Data":"f7200867e40112f3356cb8f732cd237a92970c00427acaf19a4146e6a9f40ab8"} Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.857917 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.859670 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-config\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.859838 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh2qk\" (UniqueName: \"kubernetes.io/projected/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-kube-api-access-hh2qk\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.872452 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-43846a8c-24c3-4401-b197-cb99f6993eda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43846a8c-24c3-4401-b197-cb99f6993eda\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.872703 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " 
pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.872880 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.873008 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.873116 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.883288 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.885545 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.885635 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fa24afd2-9499-490a-bc1a-8261b74d0dae","Type":"ContainerStarted","Data":"863e37ff700b333bd89c07db0fa8876992eb26b40e27f16a9dae1766356e3092"} Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.891255 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.907111 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc" podUID="fa24afd2-9499-490a-bc1a-8261b74d0dae" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.911249 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-config\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.922985 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " 
pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.924454 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.944223 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:45 crc kubenswrapper[4766]: I1002 12:38:45.997115 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc" path="/var/lib/kubelet/pods/1b5e03c6-bb1b-4d5f-a5e2-f55cf4c6efcc/volumes" Oct 02 12:38:46 crc kubenswrapper[4766]: I1002 12:38:46.007457 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh2qk\" (UniqueName: \"kubernetes.io/projected/6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e-kube-api-access-hh2qk\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:46 crc kubenswrapper[4766]: I1002 12:38:46.239810 4766 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 02 12:38:46 crc kubenswrapper[4766]: I1002 12:38:46.239888 4766 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-43846a8c-24c3-4401-b197-cb99f6993eda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43846a8c-24c3-4401-b197-cb99f6993eda\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/26fec008f39e4254733d15ee83af525dc67fcdfa8cdc9bb751893e38906d24d8/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:46 crc kubenswrapper[4766]: I1002 12:38:46.368683 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-43846a8c-24c3-4401-b197-cb99f6993eda\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-43846a8c-24c3-4401-b197-cb99f6993eda\") pod \"prometheus-metric-storage-0\" (UID: \"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e\") " pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:46 crc kubenswrapper[4766]: I1002 12:38:46.638065 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 02 12:38:46 crc kubenswrapper[4766]: I1002 12:38:46.764929 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Oct 02 12:38:46 crc kubenswrapper[4766]: W1002 12:38:46.789540 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc85d4fb_980e_4303_8850_ec3da21b43b2.slice/crio-ccd0ef2a7d8bd6e03395018ba4c9861366c0c4cd4fa3695953d2ee03f618ef4e WatchSource:0}: Error finding container ccd0ef2a7d8bd6e03395018ba4c9861366c0c4cd4fa3695953d2ee03f618ef4e: Status 404 returned error can't find the container with id ccd0ef2a7d8bd6e03395018ba4c9861366c0c4cd4fa3695953d2ee03f618ef4e Oct 02 12:38:46 crc kubenswrapper[4766]: I1002 12:38:46.898496 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"dc85d4fb-980e-4303-8850-ec3da21b43b2","Type":"ContainerStarted","Data":"ccd0ef2a7d8bd6e03395018ba4c9861366c0c4cd4fa3695953d2ee03f618ef4e"} Oct 02 12:38:47 crc kubenswrapper[4766]: I1002 12:38:47.283090 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 02 12:38:47 crc kubenswrapper[4766]: I1002 12:38:47.909210 4766 generic.go:334] "Generic (PLEG): container finished" podID="34f0b55d-1a54-413b-8131-71b5816277c4" containerID="a1a03a1fcf20e0b259fd245736580119ca7cb59fac730795381c654f08dfdbca" exitCode=137 Oct 02 12:38:47 crc kubenswrapper[4766]: I1002 12:38:47.910517 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e","Type":"ContainerStarted","Data":"d437f7250219f0bc92a498a45e5dea2c58e54fedd155b157bc07c6e97abce4e9"} Oct 02 12:38:48 crc kubenswrapper[4766]: I1002 12:38:48.039906 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-chmnn"] Oct 02 12:38:48 crc kubenswrapper[4766]: I1002 12:38:48.049617 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-chmnn"] Oct 02 12:38:48 crc kubenswrapper[4766]: I1002 12:38:48.921970 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f092805a9d179564d3c0274530667768f86e383c4800b92b41acad31dbdf96d" Oct 02 12:38:48 crc kubenswrapper[4766]: I1002 12:38:48.923754 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fa24afd2-9499-490a-bc1a-8261b74d0dae","Type":"ContainerStarted","Data":"9ac310813f3e07ab8b0873e304ae8fea493935b5015e753f596968a88eaa4a0f"} Oct 02 12:38:48 crc kubenswrapper[4766]: I1002 12:38:48.944160 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=5.944137648 podStartE2EDuration="5.944137648s" podCreationTimestamp="2025-10-02 12:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:38:48.942700172 +0000 UTC m=+6443.885571136" watchObservedRunningTime="2025-10-02 12:38:48.944137648 +0000 UTC m=+6443.887008592" Oct 02 12:38:49 crc kubenswrapper[4766]: I1002 12:38:49.003668 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 02 12:38:49 crc kubenswrapper[4766]: I1002 12:38:49.007998 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="34f0b55d-1a54-413b-8131-71b5816277c4" podUID="fa24afd2-9499-490a-bc1a-8261b74d0dae" Oct 02 12:38:49 crc kubenswrapper[4766]: I1002 12:38:49.098245 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/34f0b55d-1a54-413b-8131-71b5816277c4-openstack-config\") pod \"34f0b55d-1a54-413b-8131-71b5816277c4\" (UID: \"34f0b55d-1a54-413b-8131-71b5816277c4\") " Oct 02 12:38:49 crc kubenswrapper[4766]: I1002 12:38:49.098794 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v4ld\" (UniqueName: \"kubernetes.io/projected/34f0b55d-1a54-413b-8131-71b5816277c4-kube-api-access-8v4ld\") pod \"34f0b55d-1a54-413b-8131-71b5816277c4\" (UID: \"34f0b55d-1a54-413b-8131-71b5816277c4\") " Oct 02 12:38:49 crc kubenswrapper[4766]: I1002 12:38:49.099107 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/34f0b55d-1a54-413b-8131-71b5816277c4-openstack-config-secret\") pod \"34f0b55d-1a54-413b-8131-71b5816277c4\" (UID: \"34f0b55d-1a54-413b-8131-71b5816277c4\") " Oct 02 12:38:49 crc kubenswrapper[4766]: I1002 12:38:49.139005 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34f0b55d-1a54-413b-8131-71b5816277c4-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "34f0b55d-1a54-413b-8131-71b5816277c4" (UID: "34f0b55d-1a54-413b-8131-71b5816277c4"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:38:49 crc kubenswrapper[4766]: I1002 12:38:49.161999 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34f0b55d-1a54-413b-8131-71b5816277c4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "34f0b55d-1a54-413b-8131-71b5816277c4" (UID: "34f0b55d-1a54-413b-8131-71b5816277c4"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:38:49 crc kubenswrapper[4766]: I1002 12:38:49.207043 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/34f0b55d-1a54-413b-8131-71b5816277c4-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 02 12:38:49 crc kubenswrapper[4766]: I1002 12:38:49.207122 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/34f0b55d-1a54-413b-8131-71b5816277c4-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:38:49 crc kubenswrapper[4766]: I1002 12:38:49.525769 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34f0b55d-1a54-413b-8131-71b5816277c4-kube-api-access-8v4ld" (OuterVolumeSpecName: "kube-api-access-8v4ld") pod "34f0b55d-1a54-413b-8131-71b5816277c4" (UID: "34f0b55d-1a54-413b-8131-71b5816277c4"). InnerVolumeSpecName "kube-api-access-8v4ld". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:38:49 crc kubenswrapper[4766]: I1002 12:38:49.616365 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v4ld\" (UniqueName: \"kubernetes.io/projected/34f0b55d-1a54-413b-8131-71b5816277c4-kube-api-access-8v4ld\") on node \"crc\" DevicePath \"\"" Oct 02 12:38:49 crc kubenswrapper[4766]: I1002 12:38:49.892767 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34f0b55d-1a54-413b-8131-71b5816277c4" path="/var/lib/kubelet/pods/34f0b55d-1a54-413b-8131-71b5816277c4/volumes" Oct 02 12:38:49 crc kubenswrapper[4766]: I1002 12:38:49.893402 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dc41884-f51d-48d7-b8ef-61e1148759ea" path="/var/lib/kubelet/pods/3dc41884-f51d-48d7-b8ef-61e1148759ea/volumes" Oct 02 12:38:49 crc kubenswrapper[4766]: I1002 12:38:49.936190 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 02 12:38:49 crc kubenswrapper[4766]: I1002 12:38:49.941597 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="34f0b55d-1a54-413b-8131-71b5816277c4" podUID="fa24afd2-9499-490a-bc1a-8261b74d0dae" Oct 02 12:38:49 crc kubenswrapper[4766]: I1002 12:38:49.953088 4766 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="34f0b55d-1a54-413b-8131-71b5816277c4" podUID="fa24afd2-9499-490a-bc1a-8261b74d0dae" Oct 02 12:38:50 crc kubenswrapper[4766]: I1002 12:38:50.039342 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-gf5l8"] Oct 02 12:38:50 crc kubenswrapper[4766]: I1002 12:38:50.050022 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-gf5l8"] Oct 02 12:38:51 crc kubenswrapper[4766]: I1002 12:38:51.894199 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f26d4d8b-0c18-4fb9-b57a-cd154cb211eb" path="/var/lib/kubelet/pods/f26d4d8b-0c18-4fb9-b57a-cd154cb211eb/volumes" Oct 02 12:38:55 crc kubenswrapper[4766]: I1002 12:38:55.896924 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:38:55 crc kubenswrapper[4766]: E1002 12:38:55.898039 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:38:59 crc kubenswrapper[4766]: I1002 12:38:59.066604 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9d25d74d-1b30-4bb2-8cc2-401004b37624","Type":"ContainerStarted","Data":"6d26a36e8ba84e3fbbd82075cbff5a3183425c9dda82103925ce93c49cac430f"} Oct 02 12:38:59 crc kubenswrapper[4766]: I1002 12:38:59.067574 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 02 12:38:59 crc kubenswrapper[4766]: I1002 12:38:59.087806 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.980339604 podStartE2EDuration="15.087768369s" 
podCreationTimestamp="2025-10-02 12:38:44 +0000 UTC" firstStartedPulling="2025-10-02 12:38:45.611189777 +0000 UTC m=+6440.554060711" lastFinishedPulling="2025-10-02 12:38:57.718618532 +0000 UTC m=+6452.661489476" observedRunningTime="2025-10-02 12:38:59.086745796 +0000 UTC m=+6454.029616780" watchObservedRunningTime="2025-10-02 12:38:59.087768369 +0000 UTC m=+6454.030639353" Oct 02 12:39:03 crc kubenswrapper[4766]: I1002 12:39:03.116325 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"dc85d4fb-980e-4303-8850-ec3da21b43b2","Type":"ContainerStarted","Data":"bc809d0b6403e6f5385036c3a89705009a8316f7525e30a79f10a3a056c6e9ce"} Oct 02 12:39:03 crc kubenswrapper[4766]: I1002 12:39:03.134434 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e","Type":"ContainerStarted","Data":"714adb219a500a186b95aefe7f7fb5b8676c52320c6caa443f979e0999fb6db4"} Oct 02 12:39:04 crc kubenswrapper[4766]: I1002 12:39:04.574364 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 02 12:39:07 crc kubenswrapper[4766]: I1002 12:39:07.896621 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:39:07 crc kubenswrapper[4766]: E1002 12:39:07.898147 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:39:08 crc kubenswrapper[4766]: I1002 12:39:08.046590 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-pd994"] Oct 02 12:39:08 crc kubenswrapper[4766]: I1002 12:39:08.058684 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-pd994"] Oct 02 12:39:09 crc kubenswrapper[4766]: I1002 12:39:09.901922 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b15aa28-bb1c-4f14-9aeb-06c0033ae58b" path="/var/lib/kubelet/pods/7b15aa28-bb1c-4f14-9aeb-06c0033ae58b/volumes" Oct 02 12:39:13 crc kubenswrapper[4766]: I1002 12:39:13.280787 4766 generic.go:334] "Generic (PLEG): container finished" podID="6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e" containerID="714adb219a500a186b95aefe7f7fb5b8676c52320c6caa443f979e0999fb6db4" exitCode=0 Oct 02 12:39:13 crc kubenswrapper[4766]: I1002 12:39:13.280943 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e","Type":"ContainerDied","Data":"714adb219a500a186b95aefe7f7fb5b8676c52320c6caa443f979e0999fb6db4"} Oct 02 12:39:14 crc kubenswrapper[4766]: I1002 12:39:14.304848 4766 generic.go:334] "Generic (PLEG): container finished" podID="dc85d4fb-980e-4303-8850-ec3da21b43b2" containerID="bc809d0b6403e6f5385036c3a89705009a8316f7525e30a79f10a3a056c6e9ce" exitCode=0 Oct 02 12:39:14 crc kubenswrapper[4766]: I1002 12:39:14.305428 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"dc85d4fb-980e-4303-8850-ec3da21b43b2","Type":"ContainerDied","Data":"bc809d0b6403e6f5385036c3a89705009a8316f7525e30a79f10a3a056c6e9ce"} Oct 02 12:39:20 crc kubenswrapper[4766]: I1002 12:39:20.400011 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"dc85d4fb-980e-4303-8850-ec3da21b43b2","Type":"ContainerStarted","Data":"4c61cc3eb6240ce7bd8a45267fe2f4388441ed3457b4f27b760a30359997e9e0"} Oct 02 12:39:21 crc kubenswrapper[4766]: I1002 12:39:21.416342 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e","Type":"ContainerStarted","Data":"024f2a9155ba6002e3da76917c3c8cc57ecaf6d22932fdc21b996fd78967f1c3"} Oct 02 12:39:21 crc kubenswrapper[4766]: I1002 12:39:21.882222 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:39:21 crc kubenswrapper[4766]: E1002 12:39:21.883049 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:39:25 crc kubenswrapper[4766]: I1002 12:39:25.491918 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"dc85d4fb-980e-4303-8850-ec3da21b43b2","Type":"ContainerStarted","Data":"132d28d5b94d840980ac1d4f0bd3a9801609b954145c1c2771e791e31ca77493"} Oct 02 12:39:25 crc kubenswrapper[4766]: I1002 12:39:25.493358 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Oct 02 12:39:25 crc kubenswrapper[4766]: I1002 12:39:25.497457 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e","Type":"ContainerStarted","Data":"4ed098124df5d681b316e36cef5c6fd7c252c88d700055eee4064776eb6155ca"} Oct 02 12:39:25 crc kubenswrapper[4766]: I1002 12:39:25.505373 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Oct 02 12:39:25 crc kubenswrapper[4766]: I1002 12:39:25.544165 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=8.261358654 podStartE2EDuration="41.544133365s" podCreationTimestamp="2025-10-02 12:38:44 +0000 UTC" firstStartedPulling="2025-10-02 12:38:46.793038534 +0000 UTC m=+6441.735909478" lastFinishedPulling="2025-10-02 12:39:20.075813205 +0000 UTC m=+6475.018684189" observedRunningTime="2025-10-02 12:39:25.539339412 +0000 UTC m=+6480.482210366" watchObservedRunningTime="2025-10-02 12:39:25.544133365 +0000 UTC m=+6480.487004319" Oct 02 12:39:30 crc kubenswrapper[4766]: I1002 12:39:30.598568 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e","Type":"ContainerStarted","Data":"0ace31afbc9f1732206834ccf57bf34a62f16e2d10303a27113edaf5e3daa773"} Oct 02 12:39:30 crc kubenswrapper[4766]: I1002 12:39:30.642963 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.266840393 podStartE2EDuration="46.64294393s" podCreationTimestamp="2025-10-02 12:38:44 +0000 UTC" firstStartedPulling="2025-10-02 12:38:47.316912005 +0000 UTC m=+6442.259782939" lastFinishedPulling="2025-10-02 12:39:29.693015532 +0000 UTC m=+6484.635886476" observedRunningTime="2025-10-02 12:39:30.641010988 +0000 UTC m=+6485.583882002" watchObservedRunningTime="2025-10-02 12:39:30.64294393 +0000 UTC m=+6485.585814874" Oct 02 12:39:31 crc kubenswrapper[4766]: I1002 12:39:31.639499 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 02 12:39:31 crc kubenswrapper[4766]: I1002 12:39:31.640017 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 02 12:39:31 crc kubenswrapper[4766]: I1002 12:39:31.643018 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 02 12:39:32 crc kubenswrapper[4766]: I1002 12:39:32.627140 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.138932 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.144023 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.149490 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.149812 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.195151 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.329854 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5t82\" (UniqueName: \"kubernetes.io/projected/2ea3316b-5093-4269-97ad-ec3fdbba3837-kube-api-access-w5t82\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.330003 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-config-data\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.330046 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.330104 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc 
kubenswrapper[4766]: I1002 12:39:33.330121 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ea3316b-5093-4269-97ad-ec3fdbba3837-run-httpd\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.330234 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-scripts\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.330278 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ea3316b-5093-4269-97ad-ec3fdbba3837-log-httpd\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.432164 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.432243 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.432261 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ea3316b-5093-4269-97ad-ec3fdbba3837-run-httpd\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.432329 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-scripts\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.432349 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ea3316b-5093-4269-97ad-ec3fdbba3837-log-httpd\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.432401 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5t82\" (UniqueName: \"kubernetes.io/projected/2ea3316b-5093-4269-97ad-ec3fdbba3837-kube-api-access-w5t82\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.432492 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-config-data\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 
12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.433240 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ea3316b-5093-4269-97ad-ec3fdbba3837-log-httpd\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.434659 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ea3316b-5093-4269-97ad-ec3fdbba3837-run-httpd\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.443794 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-config-data\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.450990 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.451085 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-scripts\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.451729 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.453725 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5t82\" (UniqueName: \"kubernetes.io/projected/2ea3316b-5093-4269-97ad-ec3fdbba3837-kube-api-access-w5t82\") pod \"ceilometer-0\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.491314 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.899610 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c4zf9"] Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.903238 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:33 crc kubenswrapper[4766]: I1002 12:39:33.918107 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c4zf9"] Oct 02 12:39:34 crc kubenswrapper[4766]: I1002 12:39:34.024364 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:39:34 crc kubenswrapper[4766]: W1002 12:39:34.028819 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ea3316b_5093_4269_97ad_ec3fdbba3837.slice/crio-5bce4033772dac214fa443894a89afbd9353d8c8b921daadfc74f4d313382aff WatchSource:0}: Error finding container 5bce4033772dac214fa443894a89afbd9353d8c8b921daadfc74f4d313382aff: Status 404 returned error can't find the container with id 5bce4033772dac214fa443894a89afbd9353d8c8b921daadfc74f4d313382aff Oct 02 12:39:34 crc kubenswrapper[4766]: I1002 12:39:34.047103 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de1831d6-eda6-444b-a277-951de5115465-catalog-content\") pod \"community-operators-c4zf9\" (UID: \"de1831d6-eda6-444b-a277-951de5115465\") " pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:34 crc kubenswrapper[4766]: I1002 12:39:34.047737 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb7tt\" (UniqueName: \"kubernetes.io/projected/de1831d6-eda6-444b-a277-951de5115465-kube-api-access-gb7tt\") pod \"community-operators-c4zf9\" (UID: \"de1831d6-eda6-444b-a277-951de5115465\") " pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:34 crc kubenswrapper[4766]: I1002 12:39:34.047848 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de1831d6-eda6-444b-a277-951de5115465-utilities\") pod \"community-operators-c4zf9\" (UID: \"de1831d6-eda6-444b-a277-951de5115465\") " pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:34 crc kubenswrapper[4766]: I1002 12:39:34.150223 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb7tt\" (UniqueName: \"kubernetes.io/projected/de1831d6-eda6-444b-a277-951de5115465-kube-api-access-gb7tt\") pod \"community-operators-c4zf9\" (UID: \"de1831d6-eda6-444b-a277-951de5115465\") " pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:34 crc kubenswrapper[4766]: I1002 12:39:34.150303 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de1831d6-eda6-444b-a277-951de5115465-utilities\") pod \"community-operators-c4zf9\" (UID: \"de1831d6-eda6-444b-a277-951de5115465\") " pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:34 crc kubenswrapper[4766]: I1002 12:39:34.150582 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de1831d6-eda6-444b-a277-951de5115465-catalog-content\") pod \"community-operators-c4zf9\" (UID: \"de1831d6-eda6-444b-a277-951de5115465\") " pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:34 crc kubenswrapper[4766]: I1002 12:39:34.151517 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/de1831d6-eda6-444b-a277-951de5115465-catalog-content\") pod \"community-operators-c4zf9\" (UID: \"de1831d6-eda6-444b-a277-951de5115465\") " pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:34 crc kubenswrapper[4766]: I1002 12:39:34.152154 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de1831d6-eda6-444b-a277-951de5115465-utilities\") pod \"community-operators-c4zf9\" (UID: \"de1831d6-eda6-444b-a277-951de5115465\") " pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:34 crc kubenswrapper[4766]: I1002 12:39:34.184846 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb7tt\" (UniqueName: \"kubernetes.io/projected/de1831d6-eda6-444b-a277-951de5115465-kube-api-access-gb7tt\") pod \"community-operators-c4zf9\" (UID: \"de1831d6-eda6-444b-a277-951de5115465\") " pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:34 crc kubenswrapper[4766]: I1002 12:39:34.235541 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:34 crc kubenswrapper[4766]: I1002 12:39:34.659941 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ea3316b-5093-4269-97ad-ec3fdbba3837","Type":"ContainerStarted","Data":"5bce4033772dac214fa443894a89afbd9353d8c8b921daadfc74f4d313382aff"} Oct 02 12:39:34 crc kubenswrapper[4766]: I1002 12:39:34.729025 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c4zf9"] Oct 02 12:39:35 crc kubenswrapper[4766]: I1002 12:39:35.670340 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ea3316b-5093-4269-97ad-ec3fdbba3837","Type":"ContainerStarted","Data":"f804c8aabb2432908cd15c97a641d921c076e2a065c3f7980e9165aa744bf0eb"} Oct 02 12:39:35 crc kubenswrapper[4766]: I1002 12:39:35.671880 4766 generic.go:334] "Generic (PLEG): container finished" podID="de1831d6-eda6-444b-a277-951de5115465" containerID="1f44403e1066d4e684be8d3b9637147f0666575b247c1ee4e8158d56409b74dc" exitCode=0 Oct 02 12:39:35 crc kubenswrapper[4766]: I1002 12:39:35.671926 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4zf9" event={"ID":"de1831d6-eda6-444b-a277-951de5115465","Type":"ContainerDied","Data":"1f44403e1066d4e684be8d3b9637147f0666575b247c1ee4e8158d56409b74dc"} Oct 02 12:39:35 crc kubenswrapper[4766]: I1002 12:39:35.671958 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4zf9" event={"ID":"de1831d6-eda6-444b-a277-951de5115465","Type":"ContainerStarted","Data":"d294de888386e13eac7e8d8896f110dc817b44237c8fc1359d3d4e5591a03170"} Oct 02 12:39:35 crc kubenswrapper[4766]: I1002 12:39:35.894561 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:39:35 crc kubenswrapper[4766]: E1002 12:39:35.895362 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:39:36 crc 
kubenswrapper[4766]: I1002 12:39:36.685168 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4zf9" event={"ID":"de1831d6-eda6-444b-a277-951de5115465","Type":"ContainerStarted","Data":"fc6c6ebe06e1a73715bb73056c77fa9ed849d3427980aeffbb300a73a1d0a01c"} Oct 02 12:39:36 crc kubenswrapper[4766]: I1002 12:39:36.690213 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ea3316b-5093-4269-97ad-ec3fdbba3837","Type":"ContainerStarted","Data":"9bd3c5cead1c11672a85b921a87eadf8c166fa61922f947c62dc6bad5196bfd4"} Oct 02 12:39:37 crc kubenswrapper[4766]: I1002 12:39:37.706164 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ea3316b-5093-4269-97ad-ec3fdbba3837","Type":"ContainerStarted","Data":"4bcda70898bc06aa9c31d5d90815d02f8c5b363e17eb3e740bca9521fad974ea"} Oct 02 12:39:38 crc kubenswrapper[4766]: I1002 12:39:38.704905 4766 scope.go:117] "RemoveContainer" containerID="de52ec449a0dc9b9f0562a67f7bb244ff9b254a1943e27a97287839bb09fda58" Oct 02 12:39:38 crc kubenswrapper[4766]: I1002 12:39:38.727607 4766 generic.go:334] "Generic (PLEG): container finished" podID="de1831d6-eda6-444b-a277-951de5115465" containerID="fc6c6ebe06e1a73715bb73056c77fa9ed849d3427980aeffbb300a73a1d0a01c" exitCode=0 Oct 02 12:39:38 crc kubenswrapper[4766]: I1002 12:39:38.727753 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4zf9" event={"ID":"de1831d6-eda6-444b-a277-951de5115465","Type":"ContainerDied","Data":"fc6c6ebe06e1a73715bb73056c77fa9ed849d3427980aeffbb300a73a1d0a01c"} Oct 02 12:39:38 crc kubenswrapper[4766]: I1002 12:39:38.817618 4766 scope.go:117] "RemoveContainer" containerID="68876f42998c9251e9cf4c9513dd01c7de078e952ab8d829ea92b87c1ac568b6" Oct 02 12:39:38 crc kubenswrapper[4766]: I1002 12:39:38.849723 4766 scope.go:117] "RemoveContainer" containerID="a1a03a1fcf20e0b259fd245736580119ca7cb59fac730795381c654f08dfdbca" Oct 02 12:39:38 crc kubenswrapper[4766]: I1002 12:39:38.913131 4766 scope.go:117] "RemoveContainer" containerID="f82976d8a1d07384533a107bd6daba5c9fe638af34a1fea8fa64be1c6c278ad2" Oct 02 12:39:38 crc kubenswrapper[4766]: I1002 12:39:38.982603 4766 scope.go:117] "RemoveContainer" containerID="3c8a324c9daf6cf9b26a35385d3377e86dcefd3c21151e2f937c8de222c83d0e" Oct 02 12:39:39 crc kubenswrapper[4766]: I1002 12:39:39.743043 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4zf9" event={"ID":"de1831d6-eda6-444b-a277-951de5115465","Type":"ContainerStarted","Data":"c8d5280c7b7a0e5e25da970a5ead16e50917f9e8e1a89b6570f4db7ba8645284"} Oct 02 12:39:39 crc kubenswrapper[4766]: I1002 12:39:39.751311 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ea3316b-5093-4269-97ad-ec3fdbba3837","Type":"ContainerStarted","Data":"8df07083178d7524a148d930f380c5f0b2f25fc52d2d174ffa92efd64a3a56dd"} Oct 02 12:39:39 crc kubenswrapper[4766]: I1002 12:39:39.751464 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 12:39:39 crc kubenswrapper[4766]: I1002 12:39:39.776542 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c4zf9" podStartSLOduration=3.200863041 podStartE2EDuration="6.776511846s" podCreationTimestamp="2025-10-02 12:39:33 +0000 UTC" firstStartedPulling="2025-10-02 12:39:35.673921422 +0000 UTC m=+6490.616792376" 
lastFinishedPulling="2025-10-02 12:39:39.249570237 +0000 UTC m=+6494.192441181" observedRunningTime="2025-10-02 12:39:39.764856932 +0000 UTC m=+6494.707727886" watchObservedRunningTime="2025-10-02 12:39:39.776511846 +0000 UTC m=+6494.719382790" Oct 02 12:39:39 crc kubenswrapper[4766]: I1002 12:39:39.794832 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.685358526 podStartE2EDuration="6.794802441s" podCreationTimestamp="2025-10-02 12:39:33 +0000 UTC" firstStartedPulling="2025-10-02 12:39:34.031306346 +0000 UTC m=+6488.974177300" lastFinishedPulling="2025-10-02 12:39:39.140750271 +0000 UTC m=+6494.083621215" observedRunningTime="2025-10-02 12:39:39.78883158 +0000 UTC m=+6494.731702534" watchObservedRunningTime="2025-10-02 12:39:39.794802441 +0000 UTC m=+6494.737673385" Oct 02 12:39:43 crc kubenswrapper[4766]: I1002 12:39:43.257192 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-22h8g"] Oct 02 12:39:43 crc kubenswrapper[4766]: I1002 12:39:43.259454 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-22h8g" Oct 02 12:39:43 crc kubenswrapper[4766]: I1002 12:39:43.276818 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-22h8g"] Oct 02 12:39:43 crc kubenswrapper[4766]: I1002 12:39:43.285846 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sstj7\" (UniqueName: \"kubernetes.io/projected/9c68434a-a559-4f0d-b7b7-ed2490989b58-kube-api-access-sstj7\") pod \"aodh-db-create-22h8g\" (UID: \"9c68434a-a559-4f0d-b7b7-ed2490989b58\") " pod="openstack/aodh-db-create-22h8g" Oct 02 12:39:43 crc kubenswrapper[4766]: I1002 12:39:43.388781 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sstj7\" (UniqueName: \"kubernetes.io/projected/9c68434a-a559-4f0d-b7b7-ed2490989b58-kube-api-access-sstj7\") pod \"aodh-db-create-22h8g\" (UID: \"9c68434a-a559-4f0d-b7b7-ed2490989b58\") " pod="openstack/aodh-db-create-22h8g" Oct 02 12:39:43 crc kubenswrapper[4766]: I1002 12:39:43.413637 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sstj7\" (UniqueName: \"kubernetes.io/projected/9c68434a-a559-4f0d-b7b7-ed2490989b58-kube-api-access-sstj7\") pod \"aodh-db-create-22h8g\" (UID: \"9c68434a-a559-4f0d-b7b7-ed2490989b58\") " pod="openstack/aodh-db-create-22h8g" Oct 02 12:39:43 crc kubenswrapper[4766]: I1002 12:39:43.591818 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-22h8g" Oct 02 12:39:44 crc kubenswrapper[4766]: I1002 12:39:44.088524 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-22h8g"] Oct 02 12:39:44 crc kubenswrapper[4766]: I1002 12:39:44.235787 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:44 crc kubenswrapper[4766]: I1002 12:39:44.235872 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:44 crc kubenswrapper[4766]: I1002 12:39:44.345652 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:44 crc kubenswrapper[4766]: I1002 12:39:44.819076 4766 generic.go:334] "Generic (PLEG): container finished" podID="9c68434a-a559-4f0d-b7b7-ed2490989b58" containerID="ef2a20319350fa1c2836fa8d8bfce79811592c922bfd6aa9bd23910741460bf0" exitCode=0 Oct 02 12:39:44 crc kubenswrapper[4766]: I1002 12:39:44.820018 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-22h8g" event={"ID":"9c68434a-a559-4f0d-b7b7-ed2490989b58","Type":"ContainerDied","Data":"ef2a20319350fa1c2836fa8d8bfce79811592c922bfd6aa9bd23910741460bf0"} Oct 02 12:39:44 crc kubenswrapper[4766]: I1002 12:39:44.827606 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-22h8g" event={"ID":"9c68434a-a559-4f0d-b7b7-ed2490989b58","Type":"ContainerStarted","Data":"8000fb8d5debbeec2a99273039b64be163997abd78a7e7896c295e488668bf56"} Oct 02 12:39:44 crc kubenswrapper[4766]: I1002 12:39:44.902402 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:44 crc kubenswrapper[4766]: I1002 12:39:44.966659 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c4zf9"] Oct 02 12:39:46 crc kubenswrapper[4766]: I1002 12:39:46.298567 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-22h8g" Oct 02 12:39:46 crc kubenswrapper[4766]: I1002 12:39:46.382170 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sstj7\" (UniqueName: \"kubernetes.io/projected/9c68434a-a559-4f0d-b7b7-ed2490989b58-kube-api-access-sstj7\") pod \"9c68434a-a559-4f0d-b7b7-ed2490989b58\" (UID: \"9c68434a-a559-4f0d-b7b7-ed2490989b58\") " Oct 02 12:39:46 crc kubenswrapper[4766]: I1002 12:39:46.388906 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c68434a-a559-4f0d-b7b7-ed2490989b58-kube-api-access-sstj7" (OuterVolumeSpecName: "kube-api-access-sstj7") pod "9c68434a-a559-4f0d-b7b7-ed2490989b58" (UID: "9c68434a-a559-4f0d-b7b7-ed2490989b58"). InnerVolumeSpecName "kube-api-access-sstj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:39:46 crc kubenswrapper[4766]: I1002 12:39:46.485542 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sstj7\" (UniqueName: \"kubernetes.io/projected/9c68434a-a559-4f0d-b7b7-ed2490989b58-kube-api-access-sstj7\") on node \"crc\" DevicePath \"\"" Oct 02 12:39:46 crc kubenswrapper[4766]: I1002 12:39:46.863685 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-22h8g" event={"ID":"9c68434a-a559-4f0d-b7b7-ed2490989b58","Type":"ContainerDied","Data":"8000fb8d5debbeec2a99273039b64be163997abd78a7e7896c295e488668bf56"} Oct 02 12:39:46 crc kubenswrapper[4766]: I1002 12:39:46.863754 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8000fb8d5debbeec2a99273039b64be163997abd78a7e7896c295e488668bf56" Oct 02 12:39:46 crc kubenswrapper[4766]: I1002 12:39:46.863835 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c4zf9" podUID="de1831d6-eda6-444b-a277-951de5115465" containerName="registry-server" containerID="cri-o://c8d5280c7b7a0e5e25da970a5ead16e50917f9e8e1a89b6570f4db7ba8645284" gracePeriod=2 Oct 02 12:39:46 crc kubenswrapper[4766]: I1002 12:39:46.863959 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-22h8g" Oct 02 12:39:46 crc kubenswrapper[4766]: I1002 12:39:46.882486 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:39:46 crc kubenswrapper[4766]: E1002 12:39:46.883087 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.485641 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.626670 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de1831d6-eda6-444b-a277-951de5115465-utilities\") pod \"de1831d6-eda6-444b-a277-951de5115465\" (UID: \"de1831d6-eda6-444b-a277-951de5115465\") " Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.626761 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb7tt\" (UniqueName: \"kubernetes.io/projected/de1831d6-eda6-444b-a277-951de5115465-kube-api-access-gb7tt\") pod \"de1831d6-eda6-444b-a277-951de5115465\" (UID: \"de1831d6-eda6-444b-a277-951de5115465\") " Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.626820 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de1831d6-eda6-444b-a277-951de5115465-catalog-content\") pod \"de1831d6-eda6-444b-a277-951de5115465\" (UID: \"de1831d6-eda6-444b-a277-951de5115465\") " Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.627885 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de1831d6-eda6-444b-a277-951de5115465-utilities" (OuterVolumeSpecName: "utilities") pod "de1831d6-eda6-444b-a277-951de5115465" (UID: "de1831d6-eda6-444b-a277-951de5115465"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.632763 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de1831d6-eda6-444b-a277-951de5115465-kube-api-access-gb7tt" (OuterVolumeSpecName: "kube-api-access-gb7tt") pod "de1831d6-eda6-444b-a277-951de5115465" (UID: "de1831d6-eda6-444b-a277-951de5115465"). InnerVolumeSpecName "kube-api-access-gb7tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.680412 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de1831d6-eda6-444b-a277-951de5115465-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de1831d6-eda6-444b-a277-951de5115465" (UID: "de1831d6-eda6-444b-a277-951de5115465"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.729766 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de1831d6-eda6-444b-a277-951de5115465-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.729807 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb7tt\" (UniqueName: \"kubernetes.io/projected/de1831d6-eda6-444b-a277-951de5115465-kube-api-access-gb7tt\") on node \"crc\" DevicePath \"\"" Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.729822 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de1831d6-eda6-444b-a277-951de5115465-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.888542 4766 generic.go:334] "Generic (PLEG): container finished" podID="de1831d6-eda6-444b-a277-951de5115465" containerID="c8d5280c7b7a0e5e25da970a5ead16e50917f9e8e1a89b6570f4db7ba8645284" exitCode=0 Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.888662 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c4zf9" Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.904488 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4zf9" event={"ID":"de1831d6-eda6-444b-a277-951de5115465","Type":"ContainerDied","Data":"c8d5280c7b7a0e5e25da970a5ead16e50917f9e8e1a89b6570f4db7ba8645284"} Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.904596 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c4zf9" event={"ID":"de1831d6-eda6-444b-a277-951de5115465","Type":"ContainerDied","Data":"d294de888386e13eac7e8d8896f110dc817b44237c8fc1359d3d4e5591a03170"} Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.904638 4766 scope.go:117] "RemoveContainer" containerID="c8d5280c7b7a0e5e25da970a5ead16e50917f9e8e1a89b6570f4db7ba8645284" Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.964368 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c4zf9"] Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.977542 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c4zf9"] Oct 02 12:39:47 crc kubenswrapper[4766]: I1002 12:39:47.980412 4766 scope.go:117] "RemoveContainer" containerID="fc6c6ebe06e1a73715bb73056c77fa9ed849d3427980aeffbb300a73a1d0a01c" Oct 02 12:39:48 crc kubenswrapper[4766]: I1002 12:39:48.005704 4766 scope.go:117] "RemoveContainer" containerID="1f44403e1066d4e684be8d3b9637147f0666575b247c1ee4e8158d56409b74dc" Oct 02 12:39:48 crc kubenswrapper[4766]: I1002 12:39:48.059035 4766 scope.go:117] "RemoveContainer" containerID="c8d5280c7b7a0e5e25da970a5ead16e50917f9e8e1a89b6570f4db7ba8645284" Oct 02 12:39:48 crc kubenswrapper[4766]: E1002 12:39:48.059781 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8d5280c7b7a0e5e25da970a5ead16e50917f9e8e1a89b6570f4db7ba8645284\": container with ID starting with c8d5280c7b7a0e5e25da970a5ead16e50917f9e8e1a89b6570f4db7ba8645284 not found: ID does not exist" containerID="c8d5280c7b7a0e5e25da970a5ead16e50917f9e8e1a89b6570f4db7ba8645284" Oct 02 12:39:48 crc kubenswrapper[4766]: I1002 12:39:48.059842 
4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8d5280c7b7a0e5e25da970a5ead16e50917f9e8e1a89b6570f4db7ba8645284"} err="failed to get container status \"c8d5280c7b7a0e5e25da970a5ead16e50917f9e8e1a89b6570f4db7ba8645284\": rpc error: code = NotFound desc = could not find container \"c8d5280c7b7a0e5e25da970a5ead16e50917f9e8e1a89b6570f4db7ba8645284\": container with ID starting with c8d5280c7b7a0e5e25da970a5ead16e50917f9e8e1a89b6570f4db7ba8645284 not found: ID does not exist" Oct 02 12:39:48 crc kubenswrapper[4766]: I1002 12:39:48.059884 4766 scope.go:117] "RemoveContainer" containerID="fc6c6ebe06e1a73715bb73056c77fa9ed849d3427980aeffbb300a73a1d0a01c" Oct 02 12:39:48 crc kubenswrapper[4766]: E1002 12:39:48.060518 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc6c6ebe06e1a73715bb73056c77fa9ed849d3427980aeffbb300a73a1d0a01c\": container with ID starting with fc6c6ebe06e1a73715bb73056c77fa9ed849d3427980aeffbb300a73a1d0a01c not found: ID does not exist" containerID="fc6c6ebe06e1a73715bb73056c77fa9ed849d3427980aeffbb300a73a1d0a01c" Oct 02 12:39:48 crc kubenswrapper[4766]: I1002 12:39:48.060557 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc6c6ebe06e1a73715bb73056c77fa9ed849d3427980aeffbb300a73a1d0a01c"} err="failed to get container status \"fc6c6ebe06e1a73715bb73056c77fa9ed849d3427980aeffbb300a73a1d0a01c\": rpc error: code = NotFound desc = could not find container \"fc6c6ebe06e1a73715bb73056c77fa9ed849d3427980aeffbb300a73a1d0a01c\": container with ID starting with fc6c6ebe06e1a73715bb73056c77fa9ed849d3427980aeffbb300a73a1d0a01c not found: ID does not exist" Oct 02 12:39:48 crc kubenswrapper[4766]: I1002 12:39:48.060578 4766 scope.go:117] "RemoveContainer" containerID="1f44403e1066d4e684be8d3b9637147f0666575b247c1ee4e8158d56409b74dc" Oct 02 12:39:48 crc kubenswrapper[4766]: E1002 12:39:48.060954 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f44403e1066d4e684be8d3b9637147f0666575b247c1ee4e8158d56409b74dc\": container with ID starting with 1f44403e1066d4e684be8d3b9637147f0666575b247c1ee4e8158d56409b74dc not found: ID does not exist" containerID="1f44403e1066d4e684be8d3b9637147f0666575b247c1ee4e8158d56409b74dc" Oct 02 12:39:48 crc kubenswrapper[4766]: I1002 12:39:48.061004 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f44403e1066d4e684be8d3b9637147f0666575b247c1ee4e8158d56409b74dc"} err="failed to get container status \"1f44403e1066d4e684be8d3b9637147f0666575b247c1ee4e8158d56409b74dc\": rpc error: code = NotFound desc = could not find container \"1f44403e1066d4e684be8d3b9637147f0666575b247c1ee4e8158d56409b74dc\": container with ID starting with 1f44403e1066d4e684be8d3b9637147f0666575b247c1ee4e8158d56409b74dc not found: ID does not exist" Oct 02 12:39:49 crc kubenswrapper[4766]: I1002 12:39:49.896016 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de1831d6-eda6-444b-a277-951de5115465" path="/var/lib/kubelet/pods/de1831d6-eda6-444b-a277-951de5115465/volumes" Oct 02 12:39:52 crc kubenswrapper[4766]: I1002 12:39:52.048176 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-f2tmv"] Oct 02 12:39:52 crc kubenswrapper[4766]: I1002 12:39:52.062575 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-db-create-f2tmv"] Oct 02 12:39:53 crc kubenswrapper[4766]: I1002 12:39:53.371790 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-3546-account-create-t9dgr"] Oct 02 12:39:53 crc kubenswrapper[4766]: E1002 12:39:53.373141 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1831d6-eda6-444b-a277-951de5115465" containerName="registry-server" Oct 02 12:39:53 crc kubenswrapper[4766]: I1002 12:39:53.373175 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1831d6-eda6-444b-a277-951de5115465" containerName="registry-server" Oct 02 12:39:53 crc kubenswrapper[4766]: E1002 12:39:53.373236 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1831d6-eda6-444b-a277-951de5115465" containerName="extract-content" Oct 02 12:39:53 crc kubenswrapper[4766]: I1002 12:39:53.373253 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1831d6-eda6-444b-a277-951de5115465" containerName="extract-content" Oct 02 12:39:53 crc kubenswrapper[4766]: E1002 12:39:53.373304 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1831d6-eda6-444b-a277-951de5115465" containerName="extract-utilities" Oct 02 12:39:53 crc kubenswrapper[4766]: I1002 12:39:53.373354 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1831d6-eda6-444b-a277-951de5115465" containerName="extract-utilities" Oct 02 12:39:53 crc kubenswrapper[4766]: E1002 12:39:53.373373 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c68434a-a559-4f0d-b7b7-ed2490989b58" containerName="mariadb-database-create" Oct 02 12:39:53 crc kubenswrapper[4766]: I1002 12:39:53.373389 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c68434a-a559-4f0d-b7b7-ed2490989b58" containerName="mariadb-database-create" Oct 02 12:39:53 crc kubenswrapper[4766]: I1002 12:39:53.373935 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="de1831d6-eda6-444b-a277-951de5115465" containerName="registry-server" Oct 02 12:39:53 crc kubenswrapper[4766]: I1002 12:39:53.373996 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c68434a-a559-4f0d-b7b7-ed2490989b58" containerName="mariadb-database-create" Oct 02 12:39:53 crc kubenswrapper[4766]: I1002 12:39:53.375831 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-3546-account-create-t9dgr" Oct 02 12:39:53 crc kubenswrapper[4766]: I1002 12:39:53.379125 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Oct 02 12:39:53 crc kubenswrapper[4766]: I1002 12:39:53.388125 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-3546-account-create-t9dgr"] Oct 02 12:39:53 crc kubenswrapper[4766]: I1002 12:39:53.504189 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz5tz\" (UniqueName: \"kubernetes.io/projected/4183ea30-6477-4d5c-bae6-89e3e8e07591-kube-api-access-nz5tz\") pod \"aodh-3546-account-create-t9dgr\" (UID: \"4183ea30-6477-4d5c-bae6-89e3e8e07591\") " pod="openstack/aodh-3546-account-create-t9dgr" Oct 02 12:39:53 crc kubenswrapper[4766]: I1002 12:39:53.606849 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz5tz\" (UniqueName: \"kubernetes.io/projected/4183ea30-6477-4d5c-bae6-89e3e8e07591-kube-api-access-nz5tz\") pod \"aodh-3546-account-create-t9dgr\" (UID: \"4183ea30-6477-4d5c-bae6-89e3e8e07591\") " pod="openstack/aodh-3546-account-create-t9dgr" Oct 02 12:39:53 crc kubenswrapper[4766]: I1002 12:39:53.641072 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz5tz\" (UniqueName: \"kubernetes.io/projected/4183ea30-6477-4d5c-bae6-89e3e8e07591-kube-api-access-nz5tz\") pod \"aodh-3546-account-create-t9dgr\" (UID: \"4183ea30-6477-4d5c-bae6-89e3e8e07591\") " pod="openstack/aodh-3546-account-create-t9dgr" Oct 02 12:39:53 crc kubenswrapper[4766]: I1002 12:39:53.713829 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-3546-account-create-t9dgr" Oct 02 12:39:53 crc kubenswrapper[4766]: I1002 12:39:53.901810 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39c9842b-d4c5-4942-832c-568207d18446" path="/var/lib/kubelet/pods/39c9842b-d4c5-4942-832c-568207d18446/volumes" Oct 02 12:39:54 crc kubenswrapper[4766]: I1002 12:39:54.233033 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-3546-account-create-t9dgr"] Oct 02 12:39:55 crc kubenswrapper[4766]: I1002 12:39:55.022632 4766 generic.go:334] "Generic (PLEG): container finished" podID="4183ea30-6477-4d5c-bae6-89e3e8e07591" containerID="7dbd00e4dae346409b8d5550010bc9edbfee6d9f2645a8c7d3041a9b94c14c7f" exitCode=0 Oct 02 12:39:55 crc kubenswrapper[4766]: I1002 12:39:55.022748 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-3546-account-create-t9dgr" event={"ID":"4183ea30-6477-4d5c-bae6-89e3e8e07591","Type":"ContainerDied","Data":"7dbd00e4dae346409b8d5550010bc9edbfee6d9f2645a8c7d3041a9b94c14c7f"} Oct 02 12:39:55 crc kubenswrapper[4766]: I1002 12:39:55.025173 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-3546-account-create-t9dgr" event={"ID":"4183ea30-6477-4d5c-bae6-89e3e8e07591","Type":"ContainerStarted","Data":"501178f9968297bcff868767131bee4bc1461cac5e347303e44d77a3c2045323"} Oct 02 12:39:56 crc kubenswrapper[4766]: I1002 12:39:56.550346 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-3546-account-create-t9dgr" Oct 02 12:39:56 crc kubenswrapper[4766]: I1002 12:39:56.689736 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz5tz\" (UniqueName: \"kubernetes.io/projected/4183ea30-6477-4d5c-bae6-89e3e8e07591-kube-api-access-nz5tz\") pod \"4183ea30-6477-4d5c-bae6-89e3e8e07591\" (UID: \"4183ea30-6477-4d5c-bae6-89e3e8e07591\") " Oct 02 12:39:56 crc kubenswrapper[4766]: I1002 12:39:56.699405 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4183ea30-6477-4d5c-bae6-89e3e8e07591-kube-api-access-nz5tz" (OuterVolumeSpecName: "kube-api-access-nz5tz") pod "4183ea30-6477-4d5c-bae6-89e3e8e07591" (UID: "4183ea30-6477-4d5c-bae6-89e3e8e07591"). InnerVolumeSpecName "kube-api-access-nz5tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:39:56 crc kubenswrapper[4766]: I1002 12:39:56.793181 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz5tz\" (UniqueName: \"kubernetes.io/projected/4183ea30-6477-4d5c-bae6-89e3e8e07591-kube-api-access-nz5tz\") on node \"crc\" DevicePath \"\"" Oct 02 12:39:57 crc kubenswrapper[4766]: I1002 12:39:57.051747 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-3546-account-create-t9dgr" event={"ID":"4183ea30-6477-4d5c-bae6-89e3e8e07591","Type":"ContainerDied","Data":"501178f9968297bcff868767131bee4bc1461cac5e347303e44d77a3c2045323"} Oct 02 12:39:57 crc kubenswrapper[4766]: I1002 12:39:57.052293 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="501178f9968297bcff868767131bee4bc1461cac5e347303e44d77a3c2045323" Oct 02 12:39:57 crc kubenswrapper[4766]: I1002 12:39:57.051818 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-3546-account-create-t9dgr" Oct 02 12:39:58 crc kubenswrapper[4766]: I1002 12:39:58.832173 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-gg7hw"] Oct 02 12:39:58 crc kubenswrapper[4766]: E1002 12:39:58.833275 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4183ea30-6477-4d5c-bae6-89e3e8e07591" containerName="mariadb-account-create" Oct 02 12:39:58 crc kubenswrapper[4766]: I1002 12:39:58.833303 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4183ea30-6477-4d5c-bae6-89e3e8e07591" containerName="mariadb-account-create" Oct 02 12:39:58 crc kubenswrapper[4766]: I1002 12:39:58.833614 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4183ea30-6477-4d5c-bae6-89e3e8e07591" containerName="mariadb-account-create" Oct 02 12:39:58 crc kubenswrapper[4766]: I1002 12:39:58.834701 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-gg7hw" Oct 02 12:39:58 crc kubenswrapper[4766]: I1002 12:39:58.847327 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 02 12:39:58 crc kubenswrapper[4766]: I1002 12:39:58.847623 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9zbzf" Oct 02 12:39:58 crc kubenswrapper[4766]: I1002 12:39:58.847803 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 02 12:39:58 crc kubenswrapper[4766]: I1002 12:39:58.857225 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-gg7hw"] Oct 02 12:39:58 crc kubenswrapper[4766]: I1002 12:39:58.950825 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-scripts\") pod \"aodh-db-sync-gg7hw\" (UID: \"74f60e52-cef0-4224-83e9-ec914df2bd9d\") " pod="openstack/aodh-db-sync-gg7hw" Oct 02 12:39:58 crc kubenswrapper[4766]: I1002 12:39:58.950895 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq72t\" (UniqueName: \"kubernetes.io/projected/74f60e52-cef0-4224-83e9-ec914df2bd9d-kube-api-access-gq72t\") pod \"aodh-db-sync-gg7hw\" (UID: \"74f60e52-cef0-4224-83e9-ec914df2bd9d\") " pod="openstack/aodh-db-sync-gg7hw" Oct 02 12:39:58 crc kubenswrapper[4766]: I1002 12:39:58.950976 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-combined-ca-bundle\") pod \"aodh-db-sync-gg7hw\" (UID: \"74f60e52-cef0-4224-83e9-ec914df2bd9d\") " pod="openstack/aodh-db-sync-gg7hw" Oct 02 12:39:58 crc kubenswrapper[4766]: I1002 12:39:58.951019 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-config-data\") pod \"aodh-db-sync-gg7hw\" (UID: \"74f60e52-cef0-4224-83e9-ec914df2bd9d\") " pod="openstack/aodh-db-sync-gg7hw" Oct 02 12:39:59 crc kubenswrapper[4766]: I1002 12:39:59.053362 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq72t\" (UniqueName: \"kubernetes.io/projected/74f60e52-cef0-4224-83e9-ec914df2bd9d-kube-api-access-gq72t\") pod \"aodh-db-sync-gg7hw\" (UID: \"74f60e52-cef0-4224-83e9-ec914df2bd9d\") " pod="openstack/aodh-db-sync-gg7hw" Oct 02 12:39:59 crc kubenswrapper[4766]: I1002 12:39:59.053496 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-combined-ca-bundle\") pod \"aodh-db-sync-gg7hw\" (UID: \"74f60e52-cef0-4224-83e9-ec914df2bd9d\") " pod="openstack/aodh-db-sync-gg7hw" Oct 02 12:39:59 crc kubenswrapper[4766]: I1002 12:39:59.053555 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-config-data\") pod \"aodh-db-sync-gg7hw\" (UID: \"74f60e52-cef0-4224-83e9-ec914df2bd9d\") " pod="openstack/aodh-db-sync-gg7hw" Oct 02 12:39:59 crc kubenswrapper[4766]: I1002 12:39:59.055424 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-scripts\") pod \"aodh-db-sync-gg7hw\" (UID: \"74f60e52-cef0-4224-83e9-ec914df2bd9d\") " pod="openstack/aodh-db-sync-gg7hw" Oct 02 12:39:59 crc kubenswrapper[4766]: I1002 12:39:59.061283 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-config-data\") pod \"aodh-db-sync-gg7hw\" (UID: \"74f60e52-cef0-4224-83e9-ec914df2bd9d\") " pod="openstack/aodh-db-sync-gg7hw" Oct 02 12:39:59 crc kubenswrapper[4766]: I1002 12:39:59.061649 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-combined-ca-bundle\") pod \"aodh-db-sync-gg7hw\" (UID: \"74f60e52-cef0-4224-83e9-ec914df2bd9d\") " pod="openstack/aodh-db-sync-gg7hw" Oct 02 12:39:59 crc kubenswrapper[4766]: I1002 12:39:59.063035 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-scripts\") pod \"aodh-db-sync-gg7hw\" (UID: \"74f60e52-cef0-4224-83e9-ec914df2bd9d\") " pod="openstack/aodh-db-sync-gg7hw" Oct 02 12:39:59 crc kubenswrapper[4766]: I1002 12:39:59.075773 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq72t\" (UniqueName: \"kubernetes.io/projected/74f60e52-cef0-4224-83e9-ec914df2bd9d-kube-api-access-gq72t\") pod \"aodh-db-sync-gg7hw\" (UID: \"74f60e52-cef0-4224-83e9-ec914df2bd9d\") " pod="openstack/aodh-db-sync-gg7hw" Oct 02 12:39:59 crc kubenswrapper[4766]: I1002 12:39:59.168740 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-gg7hw" Oct 02 12:39:59 crc kubenswrapper[4766]: I1002 12:39:59.711428 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-gg7hw"] Oct 02 12:40:00 crc kubenswrapper[4766]: I1002 12:40:00.089672 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-gg7hw" event={"ID":"74f60e52-cef0-4224-83e9-ec914df2bd9d","Type":"ContainerStarted","Data":"30e799c85fe82cbbac59d0eccf85fa010c95138026c1ca59604448877b59a6dc"} Oct 02 12:40:01 crc kubenswrapper[4766]: I1002 12:40:01.881228 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:40:01 crc kubenswrapper[4766]: E1002 12:40:01.882176 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:40:02 crc kubenswrapper[4766]: I1002 12:40:02.031935 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-bc2d-account-create-9j459"] Oct 02 12:40:02 crc kubenswrapper[4766]: I1002 12:40:02.044630 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-bc2d-account-create-9j459"] Oct 02 12:40:03 crc kubenswrapper[4766]: I1002 12:40:03.503700 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 12:40:03 crc kubenswrapper[4766]: I1002 12:40:03.899370 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01" path="/var/lib/kubelet/pods/2cb6f7b5-0bf7-45fe-911a-e9b722c8fc01/volumes" Oct 02 12:40:09 crc kubenswrapper[4766]: I1002 12:40:09.203468 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-gg7hw" event={"ID":"74f60e52-cef0-4224-83e9-ec914df2bd9d","Type":"ContainerStarted","Data":"4f5a0273e90ff42f896232eebe95a70aa0fd21390d84e5047438714cb59cc948"} Oct 02 12:40:09 crc kubenswrapper[4766]: I1002 12:40:09.235083 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-gg7hw" podStartSLOduration=2.6932161580000002 podStartE2EDuration="11.23506222s" podCreationTimestamp="2025-10-02 12:39:58 +0000 UTC" firstStartedPulling="2025-10-02 12:39:59.725675386 +0000 UTC m=+6514.668546330" lastFinishedPulling="2025-10-02 12:40:08.267521428 +0000 UTC m=+6523.210392392" observedRunningTime="2025-10-02 12:40:09.230005818 +0000 UTC m=+6524.172876762" watchObservedRunningTime="2025-10-02 12:40:09.23506222 +0000 UTC m=+6524.177933164" Oct 02 12:40:10 crc kubenswrapper[4766]: I1002 12:40:10.041668 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-84v6j"] Oct 02 12:40:10 crc kubenswrapper[4766]: I1002 12:40:10.054894 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-84v6j"] Oct 02 12:40:11 crc kubenswrapper[4766]: I1002 12:40:11.897284 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38f0193d-efaf-4515-9101-767548e3e007" path="/var/lib/kubelet/pods/38f0193d-efaf-4515-9101-767548e3e007/volumes" Oct 02 12:40:12 crc kubenswrapper[4766]: I1002 12:40:12.881834 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:40:12 crc kubenswrapper[4766]: E1002 12:40:12.882547 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:40:21 crc kubenswrapper[4766]: I1002 12:40:21.363710 4766 generic.go:334] "Generic (PLEG): container finished" podID="74f60e52-cef0-4224-83e9-ec914df2bd9d" containerID="4f5a0273e90ff42f896232eebe95a70aa0fd21390d84e5047438714cb59cc948" exitCode=0 Oct 02 12:40:21 crc kubenswrapper[4766]: I1002 12:40:21.363825 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-gg7hw" event={"ID":"74f60e52-cef0-4224-83e9-ec914df2bd9d","Type":"ContainerDied","Data":"4f5a0273e90ff42f896232eebe95a70aa0fd21390d84e5047438714cb59cc948"} Oct 02 12:40:22 crc kubenswrapper[4766]: I1002 12:40:22.830075 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-gg7hw" Oct 02 12:40:22 crc kubenswrapper[4766]: I1002 12:40:22.887890 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-combined-ca-bundle\") pod \"74f60e52-cef0-4224-83e9-ec914df2bd9d\" (UID: \"74f60e52-cef0-4224-83e9-ec914df2bd9d\") " Oct 02 12:40:22 crc kubenswrapper[4766]: I1002 12:40:22.888030 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-config-data\") pod \"74f60e52-cef0-4224-83e9-ec914df2bd9d\" (UID: \"74f60e52-cef0-4224-83e9-ec914df2bd9d\") " Oct 02 12:40:22 crc kubenswrapper[4766]: I1002 12:40:22.888175 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-scripts\") pod \"74f60e52-cef0-4224-83e9-ec914df2bd9d\" (UID: \"74f60e52-cef0-4224-83e9-ec914df2bd9d\") " Oct 02 12:40:22 crc kubenswrapper[4766]: I1002 12:40:22.888464 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq72t\" (UniqueName: \"kubernetes.io/projected/74f60e52-cef0-4224-83e9-ec914df2bd9d-kube-api-access-gq72t\") pod \"74f60e52-cef0-4224-83e9-ec914df2bd9d\" (UID: \"74f60e52-cef0-4224-83e9-ec914df2bd9d\") " Oct 02 12:40:22 crc kubenswrapper[4766]: I1002 12:40:22.902475 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-scripts" (OuterVolumeSpecName: "scripts") pod "74f60e52-cef0-4224-83e9-ec914df2bd9d" (UID: "74f60e52-cef0-4224-83e9-ec914df2bd9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:40:22 crc kubenswrapper[4766]: I1002 12:40:22.903485 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74f60e52-cef0-4224-83e9-ec914df2bd9d-kube-api-access-gq72t" (OuterVolumeSpecName: "kube-api-access-gq72t") pod "74f60e52-cef0-4224-83e9-ec914df2bd9d" (UID: "74f60e52-cef0-4224-83e9-ec914df2bd9d"). InnerVolumeSpecName "kube-api-access-gq72t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:40:22 crc kubenswrapper[4766]: I1002 12:40:22.940432 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-config-data" (OuterVolumeSpecName: "config-data") pod "74f60e52-cef0-4224-83e9-ec914df2bd9d" (UID: "74f60e52-cef0-4224-83e9-ec914df2bd9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:40:22 crc kubenswrapper[4766]: I1002 12:40:22.944452 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74f60e52-cef0-4224-83e9-ec914df2bd9d" (UID: "74f60e52-cef0-4224-83e9-ec914df2bd9d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:22.999904 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.009888 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.009937 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74f60e52-cef0-4224-83e9-ec914df2bd9d-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.009962 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq72t\" (UniqueName: \"kubernetes.io/projected/74f60e52-cef0-4224-83e9-ec914df2bd9d-kube-api-access-gq72t\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.391521 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-gg7hw" event={"ID":"74f60e52-cef0-4224-83e9-ec914df2bd9d","Type":"ContainerDied","Data":"30e799c85fe82cbbac59d0eccf85fa010c95138026c1ca59604448877b59a6dc"} Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.391573 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30e799c85fe82cbbac59d0eccf85fa010c95138026c1ca59604448877b59a6dc" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.391700 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-gg7hw" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.599790 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Oct 02 12:40:23 crc kubenswrapper[4766]: E1002 12:40:23.601104 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f60e52-cef0-4224-83e9-ec914df2bd9d" containerName="aodh-db-sync" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.601144 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f60e52-cef0-4224-83e9-ec914df2bd9d" containerName="aodh-db-sync" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.601551 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="74f60e52-cef0-4224-83e9-ec914df2bd9d" containerName="aodh-db-sync" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.605610 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.608260 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9zbzf" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.608458 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.611046 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.621871 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.728674 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/823c5010-35e5-4ab1-8b6b-d8c41b014442-combined-ca-bundle\") pod \"aodh-0\" (UID: \"823c5010-35e5-4ab1-8b6b-d8c41b014442\") " pod="openstack/aodh-0" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.728762 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85cqr\" (UniqueName: \"kubernetes.io/projected/823c5010-35e5-4ab1-8b6b-d8c41b014442-kube-api-access-85cqr\") pod \"aodh-0\" (UID: \"823c5010-35e5-4ab1-8b6b-d8c41b014442\") " pod="openstack/aodh-0" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.728990 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/823c5010-35e5-4ab1-8b6b-d8c41b014442-config-data\") pod \"aodh-0\" (UID: \"823c5010-35e5-4ab1-8b6b-d8c41b014442\") " pod="openstack/aodh-0" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.729075 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/823c5010-35e5-4ab1-8b6b-d8c41b014442-scripts\") pod \"aodh-0\" (UID: \"823c5010-35e5-4ab1-8b6b-d8c41b014442\") " pod="openstack/aodh-0" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.831532 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/823c5010-35e5-4ab1-8b6b-d8c41b014442-config-data\") pod \"aodh-0\" (UID: \"823c5010-35e5-4ab1-8b6b-d8c41b014442\") " pod="openstack/aodh-0" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.831603 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/823c5010-35e5-4ab1-8b6b-d8c41b014442-scripts\") pod \"aodh-0\" (UID: \"823c5010-35e5-4ab1-8b6b-d8c41b014442\") " pod="openstack/aodh-0" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.831789 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/823c5010-35e5-4ab1-8b6b-d8c41b014442-combined-ca-bundle\") pod \"aodh-0\" (UID: \"823c5010-35e5-4ab1-8b6b-d8c41b014442\") " pod="openstack/aodh-0" Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.831833 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85cqr\" (UniqueName: \"kubernetes.io/projected/823c5010-35e5-4ab1-8b6b-d8c41b014442-kube-api-access-85cqr\") pod \"aodh-0\" (UID: \"823c5010-35e5-4ab1-8b6b-d8c41b014442\") " pod="openstack/aodh-0" Oct 02 12:40:23 crc kubenswrapper[4766]: 
Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.836136 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/823c5010-35e5-4ab1-8b6b-d8c41b014442-scripts\") pod \"aodh-0\" (UID: \"823c5010-35e5-4ab1-8b6b-d8c41b014442\") " pod="openstack/aodh-0"
Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.838532 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/823c5010-35e5-4ab1-8b6b-d8c41b014442-config-data\") pod \"aodh-0\" (UID: \"823c5010-35e5-4ab1-8b6b-d8c41b014442\") " pod="openstack/aodh-0"
Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.852929 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/823c5010-35e5-4ab1-8b6b-d8c41b014442-combined-ca-bundle\") pod \"aodh-0\" (UID: \"823c5010-35e5-4ab1-8b6b-d8c41b014442\") " pod="openstack/aodh-0"
Oct 02 12:40:23 crc kubenswrapper[4766]: I1002 12:40:23.853948 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85cqr\" (UniqueName: \"kubernetes.io/projected/823c5010-35e5-4ab1-8b6b-d8c41b014442-kube-api-access-85cqr\") pod \"aodh-0\" (UID: \"823c5010-35e5-4ab1-8b6b-d8c41b014442\") " pod="openstack/aodh-0"
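The block above is one full pass of the kubelet volume manager for aodh-0: each volume is first checked as attached (VerifyControllerAttachedVolume), then a mount is recorded as in progress (MountVolume started), and a MountVolume.SetUp succeeded line appears only once the volume plugin's SetUp call returns. A minimal sketch of that ordering, with illustrative names rather than kubelet's real types:

    package main

    import "fmt"

    // Simplified three-phase flow modeled on the log lines above:
    // VerifyControllerAttachedVolume -> MountVolume started -> MountVolume.SetUp succeeded.
    // All names here are illustrative, not kubelet's actual API.
    type volume struct{ name string }

    func verifyAttached(v volume) error { return nil } // stand-in for the attach check
    func setUp(v volume) error          { return nil } // stand-in for the plugin SetUp call

    func main() {
        pod := "aodh-0"
        for _, v := range []volume{{"combined-ca-bundle"}, {"kube-api-access-85cqr"}, {"config-data"}, {"scripts"}} {
            if err := verifyAttached(v); err != nil {
                fmt.Printf("VerifyControllerAttachedVolume failed for %q: %v\n", v.name, err)
                continue
            }
            fmt.Printf("MountVolume started for volume %q pod %q\n", v.name, pod)
            if err := setUp(v); err != nil {
                fmt.Printf("MountVolume.SetUp failed for %q: %v\n", v.name, err)
                continue
            }
            fmt.Printf("MountVolume.SetUp succeeded for volume %q pod %q\n", v.name, pod)
        }
    }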
containerID="cri-o://8df07083178d7524a148d930f380c5f0b2f25fc52d2d174ffa92efd64a3a56dd" gracePeriod=30 Oct 02 12:40:26 crc kubenswrapper[4766]: I1002 12:40:26.491745 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ea3316b-5093-4269-97ad-ec3fdbba3837" containerName="ceilometer-notification-agent" containerID="cri-o://9bd3c5cead1c11672a85b921a87eadf8c166fa61922f947c62dc6bad5196bfd4" gracePeriod=30 Oct 02 12:40:26 crc kubenswrapper[4766]: I1002 12:40:26.881417 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:40:26 crc kubenswrapper[4766]: E1002 12:40:26.882268 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:40:27 crc kubenswrapper[4766]: I1002 12:40:27.444544 4766 generic.go:334] "Generic (PLEG): container finished" podID="2ea3316b-5093-4269-97ad-ec3fdbba3837" containerID="8df07083178d7524a148d930f380c5f0b2f25fc52d2d174ffa92efd64a3a56dd" exitCode=0 Oct 02 12:40:27 crc kubenswrapper[4766]: I1002 12:40:27.444587 4766 generic.go:334] "Generic (PLEG): container finished" podID="2ea3316b-5093-4269-97ad-ec3fdbba3837" containerID="4bcda70898bc06aa9c31d5d90815d02f8c5b363e17eb3e740bca9521fad974ea" exitCode=2 Oct 02 12:40:27 crc kubenswrapper[4766]: I1002 12:40:27.444597 4766 generic.go:334] "Generic (PLEG): container finished" podID="2ea3316b-5093-4269-97ad-ec3fdbba3837" containerID="f804c8aabb2432908cd15c97a641d921c076e2a065c3f7980e9165aa744bf0eb" exitCode=0 Oct 02 12:40:27 crc kubenswrapper[4766]: I1002 12:40:27.444574 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ea3316b-5093-4269-97ad-ec3fdbba3837","Type":"ContainerDied","Data":"8df07083178d7524a148d930f380c5f0b2f25fc52d2d174ffa92efd64a3a56dd"} Oct 02 12:40:27 crc kubenswrapper[4766]: I1002 12:40:27.444662 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ea3316b-5093-4269-97ad-ec3fdbba3837","Type":"ContainerDied","Data":"4bcda70898bc06aa9c31d5d90815d02f8c5b363e17eb3e740bca9521fad974ea"} Oct 02 12:40:27 crc kubenswrapper[4766]: I1002 12:40:27.444675 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ea3316b-5093-4269-97ad-ec3fdbba3837","Type":"ContainerDied","Data":"f804c8aabb2432908cd15c97a641d921c076e2a065c3f7980e9165aa744bf0eb"} Oct 02 12:40:27 crc kubenswrapper[4766]: I1002 12:40:27.447628 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"823c5010-35e5-4ab1-8b6b-d8c41b014442","Type":"ContainerStarted","Data":"db12fd23adee1f301b593642ea1cd188832ccd219b9659058f715cec72cd2275"} Oct 02 12:40:30 crc kubenswrapper[4766]: I1002 12:40:30.497716 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"823c5010-35e5-4ab1-8b6b-d8c41b014442","Type":"ContainerStarted","Data":"b9380ffc2fd6ac8b5ebced5e8e2bd7cc839365c84232517961c4ab8b60732a49"} Oct 02 12:40:31 crc kubenswrapper[4766]: I1002 12:40:31.515866 4766 generic.go:334] "Generic (PLEG): container finished" podID="2ea3316b-5093-4269-97ad-ec3fdbba3837" 
containerID="9bd3c5cead1c11672a85b921a87eadf8c166fa61922f947c62dc6bad5196bfd4" exitCode=0 Oct 02 12:40:31 crc kubenswrapper[4766]: I1002 12:40:31.517150 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ea3316b-5093-4269-97ad-ec3fdbba3837","Type":"ContainerDied","Data":"9bd3c5cead1c11672a85b921a87eadf8c166fa61922f947c62dc6bad5196bfd4"} Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.736076 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.877771 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ea3316b-5093-4269-97ad-ec3fdbba3837-log-httpd\") pod \"2ea3316b-5093-4269-97ad-ec3fdbba3837\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.877887 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5t82\" (UniqueName: \"kubernetes.io/projected/2ea3316b-5093-4269-97ad-ec3fdbba3837-kube-api-access-w5t82\") pod \"2ea3316b-5093-4269-97ad-ec3fdbba3837\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.878137 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-combined-ca-bundle\") pod \"2ea3316b-5093-4269-97ad-ec3fdbba3837\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.878212 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-scripts\") pod \"2ea3316b-5093-4269-97ad-ec3fdbba3837\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.878241 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-sg-core-conf-yaml\") pod \"2ea3316b-5093-4269-97ad-ec3fdbba3837\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.878362 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ea3316b-5093-4269-97ad-ec3fdbba3837-run-httpd\") pod \"2ea3316b-5093-4269-97ad-ec3fdbba3837\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.878438 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-config-data\") pod \"2ea3316b-5093-4269-97ad-ec3fdbba3837\" (UID: \"2ea3316b-5093-4269-97ad-ec3fdbba3837\") " Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.883733 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea3316b-5093-4269-97ad-ec3fdbba3837-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2ea3316b-5093-4269-97ad-ec3fdbba3837" (UID: "2ea3316b-5093-4269-97ad-ec3fdbba3837"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.883770 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ea3316b-5093-4269-97ad-ec3fdbba3837-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2ea3316b-5093-4269-97ad-ec3fdbba3837" (UID: "2ea3316b-5093-4269-97ad-ec3fdbba3837"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.892526 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ea3316b-5093-4269-97ad-ec3fdbba3837-kube-api-access-w5t82" (OuterVolumeSpecName: "kube-api-access-w5t82") pod "2ea3316b-5093-4269-97ad-ec3fdbba3837" (UID: "2ea3316b-5093-4269-97ad-ec3fdbba3837"). InnerVolumeSpecName "kube-api-access-w5t82". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.902869 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-scripts" (OuterVolumeSpecName: "scripts") pod "2ea3316b-5093-4269-97ad-ec3fdbba3837" (UID: "2ea3316b-5093-4269-97ad-ec3fdbba3837"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.929328 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2ea3316b-5093-4269-97ad-ec3fdbba3837" (UID: "2ea3316b-5093-4269-97ad-ec3fdbba3837"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.980364 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ea3316b-5093-4269-97ad-ec3fdbba3837" (UID: "2ea3316b-5093-4269-97ad-ec3fdbba3837"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.982083 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.982119 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.982127 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.982137 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ea3316b-5093-4269-97ad-ec3fdbba3837-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.982148 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ea3316b-5093-4269-97ad-ec3fdbba3837-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:32 crc kubenswrapper[4766]: I1002 12:40:32.982157 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5t82\" (UniqueName: \"kubernetes.io/projected/2ea3316b-5093-4269-97ad-ec3fdbba3837-kube-api-access-w5t82\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.040629 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-config-data" (OuterVolumeSpecName: "config-data") pod "2ea3316b-5093-4269-97ad-ec3fdbba3837" (UID: "2ea3316b-5093-4269-97ad-ec3fdbba3837"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.084703 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ea3316b-5093-4269-97ad-ec3fdbba3837-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.556144 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ea3316b-5093-4269-97ad-ec3fdbba3837","Type":"ContainerDied","Data":"5bce4033772dac214fa443894a89afbd9353d8c8b921daadfc74f4d313382aff"} Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.556238 4766 scope.go:117] "RemoveContainer" containerID="8df07083178d7524a148d930f380c5f0b2f25fc52d2d174ffa92efd64a3a56dd" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.556289 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.649276 4766 scope.go:117] "RemoveContainer" containerID="4bcda70898bc06aa9c31d5d90815d02f8c5b363e17eb3e740bca9521fad974ea" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.660280 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.676894 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.695655 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:40:33 crc kubenswrapper[4766]: E1002 12:40:33.696350 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea3316b-5093-4269-97ad-ec3fdbba3837" containerName="sg-core" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.696375 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea3316b-5093-4269-97ad-ec3fdbba3837" containerName="sg-core" Oct 02 12:40:33 crc kubenswrapper[4766]: E1002 12:40:33.696405 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea3316b-5093-4269-97ad-ec3fdbba3837" containerName="ceilometer-notification-agent" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.696416 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea3316b-5093-4269-97ad-ec3fdbba3837" containerName="ceilometer-notification-agent" Oct 02 12:40:33 crc kubenswrapper[4766]: E1002 12:40:33.696482 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea3316b-5093-4269-97ad-ec3fdbba3837" containerName="ceilometer-central-agent" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.696553 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea3316b-5093-4269-97ad-ec3fdbba3837" containerName="ceilometer-central-agent" Oct 02 12:40:33 crc kubenswrapper[4766]: E1002 12:40:33.696577 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ea3316b-5093-4269-97ad-ec3fdbba3837" containerName="proxy-httpd" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.696585 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ea3316b-5093-4269-97ad-ec3fdbba3837" containerName="proxy-httpd" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.696872 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea3316b-5093-4269-97ad-ec3fdbba3837" containerName="ceilometer-notification-agent" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.696891 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea3316b-5093-4269-97ad-ec3fdbba3837" containerName="proxy-httpd" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.696928 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea3316b-5093-4269-97ad-ec3fdbba3837" containerName="ceilometer-central-agent" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.696955 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ea3316b-5093-4269-97ad-ec3fdbba3837" containerName="sg-core" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.700639 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.703099 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.703292 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.709073 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.745794 4766 scope.go:117] "RemoveContainer" containerID="9bd3c5cead1c11672a85b921a87eadf8c166fa61922f947c62dc6bad5196bfd4" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.779016 4766 scope.go:117] "RemoveContainer" containerID="f804c8aabb2432908cd15c97a641d921c076e2a065c3f7980e9165aa744bf0eb" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.802356 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-config-data\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.802435 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.802549 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.802620 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28m46\" (UniqueName: \"kubernetes.io/projected/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-kube-api-access-28m46\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.802683 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-log-httpd\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.802709 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-run-httpd\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.802740 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-scripts\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 
Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.903999 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-log-httpd\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0"
Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.904039 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-run-httpd\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0"
Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.904067 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-scripts\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0"
Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.904105 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-config-data\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0"
Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.904133 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0"
Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.904188 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0"
Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.904244 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28m46\" (UniqueName: \"kubernetes.io/projected/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-kube-api-access-28m46\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0"
Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.904922 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-log-httpd\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0"
Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.905164 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-run-httpd\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0"
Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.912157 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0"
\"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.912342 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-config-data\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.913857 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.915799 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-scripts\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0" Oct 02 12:40:33 crc kubenswrapper[4766]: I1002 12:40:33.931670 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28m46\" (UniqueName: \"kubernetes.io/projected/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-kube-api-access-28m46\") pod \"ceilometer-0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " pod="openstack/ceilometer-0" Oct 02 12:40:34 crc kubenswrapper[4766]: I1002 12:40:34.051732 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 12:40:34 crc kubenswrapper[4766]: I1002 12:40:34.581381 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"823c5010-35e5-4ab1-8b6b-d8c41b014442","Type":"ContainerStarted","Data":"453d868a313441f4711ca64af3b72223a1bed4a02a9b260347a3a94702a777b4"} Oct 02 12:40:34 crc kubenswrapper[4766]: I1002 12:40:34.614485 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:40:35 crc kubenswrapper[4766]: I1002 12:40:35.608683 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0","Type":"ContainerStarted","Data":"cc97f7e56aa01abb4575ec15c6e94b99bf609c839ecfc25cfe8e748f1b489532"} Oct 02 12:40:35 crc kubenswrapper[4766]: I1002 12:40:35.609686 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0","Type":"ContainerStarted","Data":"f590679caf92013e6909f532ca4a9de56609c39941666682a6d6c74f16ab292d"} Oct 02 12:40:39 crc kubenswrapper[4766]: I1002 12:40:39.270473 4766 scope.go:117] "RemoveContainer" containerID="c3cb913deafbca13578a59bbc38999d89f70caf971421486233a1d745ea80365" Oct 02 12:40:39 crc kubenswrapper[4766]: I1002 12:40:39.749119 4766 scope.go:117] "RemoveContainer" containerID="4d87032f43fab594831fd93d70bcbae93900e68d70d9a41f5e2eeb3549e3ccc2" Oct 02 12:40:39 crc kubenswrapper[4766]: I1002 12:40:39.825581 4766 scope.go:117] "RemoveContainer" containerID="92478fb37e7efbc7d0842cb54c7b2f68a2cad7eacb4afe2ffbe5518aefe7e0a0" Oct 02 12:40:40 crc kubenswrapper[4766]: I1002 12:40:40.671809 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0","Type":"ContainerStarted","Data":"c64039966b6fd62b0951f7b67f2d582de446a70d20abb24895fe0eeb5a1f0380"} Oct 02 12:40:40 crc kubenswrapper[4766]: I1002 12:40:40.674680 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"823c5010-35e5-4ab1-8b6b-d8c41b014442","Type":"ContainerStarted","Data":"86c562da807f099feef08c46a45e9a2672c6849dd3bceee9205c4b26135f538e"} Oct 02 12:40:40 crc kubenswrapper[4766]: I1002 12:40:40.708994 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.41366235 podStartE2EDuration="17.708958168s" podCreationTimestamp="2025-10-02 12:40:23 +0000 UTC" firstStartedPulling="2025-10-02 12:40:24.531227224 +0000 UTC m=+6539.474098168" lastFinishedPulling="2025-10-02 12:40:39.826523032 +0000 UTC m=+6554.769393986" observedRunningTime="2025-10-02 12:40:40.697803321 +0000 UTC m=+6555.640674275" watchObservedRunningTime="2025-10-02 12:40:40.708958168 +0000 UTC m=+6555.651829122" Oct 02 12:40:41 crc kubenswrapper[4766]: I1002 12:40:41.689388 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0","Type":"ContainerStarted","Data":"73a04b4d6ca244c5535fa45f4c43cbbf4e3f69539073998964bbc4e13c6e00ce"} Oct 02 12:40:41 crc kubenswrapper[4766]: I1002 12:40:41.883762 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:40:41 crc kubenswrapper[4766]: E1002 12:40:41.884173 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:40:42 crc kubenswrapper[4766]: I1002 12:40:42.704140 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0","Type":"ContainerStarted","Data":"cb9ac8f9a48cfe3c637d3564c511fc0eab3c76c917a6e4dad2034823a12e4f63"} Oct 02 12:40:42 crc kubenswrapper[4766]: I1002 12:40:42.704492 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 12:40:42 crc kubenswrapper[4766]: I1002 12:40:42.729266 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9618724969999999 podStartE2EDuration="9.729238611s" podCreationTimestamp="2025-10-02 12:40:33 +0000 UTC" firstStartedPulling="2025-10-02 12:40:34.624036026 +0000 UTC m=+6549.566906980" lastFinishedPulling="2025-10-02 12:40:42.39140215 +0000 UTC m=+6557.334273094" observedRunningTime="2025-10-02 12:40:42.727165285 +0000 UTC m=+6557.670036229" watchObservedRunningTime="2025-10-02 12:40:42.729238611 +0000 UTC m=+6557.672109555" Oct 02 12:40:49 crc kubenswrapper[4766]: I1002 12:40:49.117034 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-wrf46"] Oct 02 12:40:49 crc kubenswrapper[4766]: I1002 12:40:49.119772 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-wrf46" Oct 02 12:40:49 crc kubenswrapper[4766]: I1002 12:40:49.131031 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-wrf46"] Oct 02 12:40:49 crc kubenswrapper[4766]: I1002 12:40:49.147357 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d596g\" (UniqueName: \"kubernetes.io/projected/aab08b25-08d7-4dd3-837c-24863de3ab01-kube-api-access-d596g\") pod \"manila-db-create-wrf46\" (UID: \"aab08b25-08d7-4dd3-837c-24863de3ab01\") " pod="openstack/manila-db-create-wrf46" Oct 02 12:40:49 crc kubenswrapper[4766]: I1002 12:40:49.249928 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d596g\" (UniqueName: \"kubernetes.io/projected/aab08b25-08d7-4dd3-837c-24863de3ab01-kube-api-access-d596g\") pod \"manila-db-create-wrf46\" (UID: \"aab08b25-08d7-4dd3-837c-24863de3ab01\") " pod="openstack/manila-db-create-wrf46" Oct 02 12:40:49 crc kubenswrapper[4766]: I1002 12:40:49.274603 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d596g\" (UniqueName: \"kubernetes.io/projected/aab08b25-08d7-4dd3-837c-24863de3ab01-kube-api-access-d596g\") pod \"manila-db-create-wrf46\" (UID: \"aab08b25-08d7-4dd3-837c-24863de3ab01\") " pod="openstack/manila-db-create-wrf46" Oct 02 12:40:49 crc kubenswrapper[4766]: I1002 12:40:49.452691 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wrf46" Oct 02 12:40:50 crc kubenswrapper[4766]: I1002 12:40:50.110023 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-wrf46"] Oct 02 12:40:50 crc kubenswrapper[4766]: I1002 12:40:50.800400 4766 generic.go:334] "Generic (PLEG): container finished" podID="aab08b25-08d7-4dd3-837c-24863de3ab01" containerID="c4bed18926e9e6bb7b83fb8174e4f461e3bee4196997bfc78417bfb54ab04b83" exitCode=0 Oct 02 12:40:50 crc kubenswrapper[4766]: I1002 12:40:50.800496 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wrf46" event={"ID":"aab08b25-08d7-4dd3-837c-24863de3ab01","Type":"ContainerDied","Data":"c4bed18926e9e6bb7b83fb8174e4f461e3bee4196997bfc78417bfb54ab04b83"} Oct 02 12:40:50 crc kubenswrapper[4766]: I1002 12:40:50.800845 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wrf46" event={"ID":"aab08b25-08d7-4dd3-837c-24863de3ab01","Type":"ContainerStarted","Data":"7134978f38e13615511050acb4dc2aaf85c244fdab8116ad1b7950070354c0b9"} Oct 02 12:40:52 crc kubenswrapper[4766]: I1002 12:40:52.278575 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wrf46" Oct 02 12:40:52 crc kubenswrapper[4766]: I1002 12:40:52.322426 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d596g\" (UniqueName: \"kubernetes.io/projected/aab08b25-08d7-4dd3-837c-24863de3ab01-kube-api-access-d596g\") pod \"aab08b25-08d7-4dd3-837c-24863de3ab01\" (UID: \"aab08b25-08d7-4dd3-837c-24863de3ab01\") " Oct 02 12:40:52 crc kubenswrapper[4766]: I1002 12:40:52.337216 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab08b25-08d7-4dd3-837c-24863de3ab01-kube-api-access-d596g" (OuterVolumeSpecName: "kube-api-access-d596g") pod "aab08b25-08d7-4dd3-837c-24863de3ab01" (UID: "aab08b25-08d7-4dd3-837c-24863de3ab01"). 
InnerVolumeSpecName "kube-api-access-d596g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:40:52 crc kubenswrapper[4766]: I1002 12:40:52.428557 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d596g\" (UniqueName: \"kubernetes.io/projected/aab08b25-08d7-4dd3-837c-24863de3ab01-kube-api-access-d596g\") on node \"crc\" DevicePath \"\"" Oct 02 12:40:52 crc kubenswrapper[4766]: I1002 12:40:52.827780 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wrf46" event={"ID":"aab08b25-08d7-4dd3-837c-24863de3ab01","Type":"ContainerDied","Data":"7134978f38e13615511050acb4dc2aaf85c244fdab8116ad1b7950070354c0b9"} Oct 02 12:40:52 crc kubenswrapper[4766]: I1002 12:40:52.827828 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7134978f38e13615511050acb4dc2aaf85c244fdab8116ad1b7950070354c0b9" Oct 02 12:40:52 crc kubenswrapper[4766]: I1002 12:40:52.827894 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wrf46" Oct 02 12:40:55 crc kubenswrapper[4766]: I1002 12:40:55.890738 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:40:55 crc kubenswrapper[4766]: E1002 12:40:55.892218 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:40:59 crc kubenswrapper[4766]: I1002 12:40:59.286109 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-75c6-account-create-hgfjc"] Oct 02 12:40:59 crc kubenswrapper[4766]: E1002 12:40:59.288343 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab08b25-08d7-4dd3-837c-24863de3ab01" containerName="mariadb-database-create" Oct 02 12:40:59 crc kubenswrapper[4766]: I1002 12:40:59.288431 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab08b25-08d7-4dd3-837c-24863de3ab01" containerName="mariadb-database-create" Oct 02 12:40:59 crc kubenswrapper[4766]: I1002 12:40:59.288816 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="aab08b25-08d7-4dd3-837c-24863de3ab01" containerName="mariadb-database-create" Oct 02 12:40:59 crc kubenswrapper[4766]: I1002 12:40:59.289884 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-75c6-account-create-hgfjc" Oct 02 12:40:59 crc kubenswrapper[4766]: I1002 12:40:59.292598 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Oct 02 12:40:59 crc kubenswrapper[4766]: I1002 12:40:59.307244 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-75c6-account-create-hgfjc"] Oct 02 12:40:59 crc kubenswrapper[4766]: I1002 12:40:59.398113 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knc28\" (UniqueName: \"kubernetes.io/projected/60696ba0-e879-4b95-b020-9c3a81a922af-kube-api-access-knc28\") pod \"manila-75c6-account-create-hgfjc\" (UID: \"60696ba0-e879-4b95-b020-9c3a81a922af\") " pod="openstack/manila-75c6-account-create-hgfjc" Oct 02 12:40:59 crc kubenswrapper[4766]: I1002 12:40:59.502105 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knc28\" (UniqueName: \"kubernetes.io/projected/60696ba0-e879-4b95-b020-9c3a81a922af-kube-api-access-knc28\") pod \"manila-75c6-account-create-hgfjc\" (UID: \"60696ba0-e879-4b95-b020-9c3a81a922af\") " pod="openstack/manila-75c6-account-create-hgfjc" Oct 02 12:40:59 crc kubenswrapper[4766]: I1002 12:40:59.522887 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knc28\" (UniqueName: \"kubernetes.io/projected/60696ba0-e879-4b95-b020-9c3a81a922af-kube-api-access-knc28\") pod \"manila-75c6-account-create-hgfjc\" (UID: \"60696ba0-e879-4b95-b020-9c3a81a922af\") " pod="openstack/manila-75c6-account-create-hgfjc" Oct 02 12:40:59 crc kubenswrapper[4766]: I1002 12:40:59.652319 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-75c6-account-create-hgfjc" Oct 02 12:41:00 crc kubenswrapper[4766]: I1002 12:41:00.148818 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-75c6-account-create-hgfjc"] Oct 02 12:41:00 crc kubenswrapper[4766]: W1002 12:41:00.153925 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60696ba0_e879_4b95_b020_9c3a81a922af.slice/crio-99ebccd7a35c44a3b842bc089c6c515f4260614ecb55db743c8ef95186398f72 WatchSource:0}: Error finding container 99ebccd7a35c44a3b842bc089c6c515f4260614ecb55db743c8ef95186398f72: Status 404 returned error can't find the container with id 99ebccd7a35c44a3b842bc089c6c515f4260614ecb55db743c8ef95186398f72 Oct 02 12:41:00 crc kubenswrapper[4766]: I1002 12:41:00.927395 4766 generic.go:334] "Generic (PLEG): container finished" podID="60696ba0-e879-4b95-b020-9c3a81a922af" containerID="a5de29348ba2324c852fb46cfa523213b8b0e248a6271ceb4c8b8e0764f33d33" exitCode=0 Oct 02 12:41:00 crc kubenswrapper[4766]: I1002 12:41:00.927522 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-75c6-account-create-hgfjc" event={"ID":"60696ba0-e879-4b95-b020-9c3a81a922af","Type":"ContainerDied","Data":"a5de29348ba2324c852fb46cfa523213b8b0e248a6271ceb4c8b8e0764f33d33"} Oct 02 12:41:00 crc kubenswrapper[4766]: I1002 12:41:00.928215 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-75c6-account-create-hgfjc" event={"ID":"60696ba0-e879-4b95-b020-9c3a81a922af","Type":"ContainerStarted","Data":"99ebccd7a35c44a3b842bc089c6c515f4260614ecb55db743c8ef95186398f72"} Oct 02 12:41:02 crc kubenswrapper[4766]: I1002 12:41:02.336472 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-75c6-account-create-hgfjc" Oct 02 12:41:02 crc kubenswrapper[4766]: I1002 12:41:02.392833 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knc28\" (UniqueName: \"kubernetes.io/projected/60696ba0-e879-4b95-b020-9c3a81a922af-kube-api-access-knc28\") pod \"60696ba0-e879-4b95-b020-9c3a81a922af\" (UID: \"60696ba0-e879-4b95-b020-9c3a81a922af\") " Oct 02 12:41:02 crc kubenswrapper[4766]: I1002 12:41:02.407080 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60696ba0-e879-4b95-b020-9c3a81a922af-kube-api-access-knc28" (OuterVolumeSpecName: "kube-api-access-knc28") pod "60696ba0-e879-4b95-b020-9c3a81a922af" (UID: "60696ba0-e879-4b95-b020-9c3a81a922af"). InnerVolumeSpecName "kube-api-access-knc28". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:41:02 crc kubenswrapper[4766]: I1002 12:41:02.496572 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knc28\" (UniqueName: \"kubernetes.io/projected/60696ba0-e879-4b95-b020-9c3a81a922af-kube-api-access-knc28\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:02 crc kubenswrapper[4766]: I1002 12:41:02.955609 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-75c6-account-create-hgfjc" event={"ID":"60696ba0-e879-4b95-b020-9c3a81a922af","Type":"ContainerDied","Data":"99ebccd7a35c44a3b842bc089c6c515f4260614ecb55db743c8ef95186398f72"} Oct 02 12:41:02 crc kubenswrapper[4766]: I1002 12:41:02.955682 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99ebccd7a35c44a3b842bc089c6c515f4260614ecb55db743c8ef95186398f72" Oct 02 12:41:02 crc kubenswrapper[4766]: I1002 12:41:02.955752 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-75c6-account-create-hgfjc" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.060366 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.648091 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-dcvnv"] Oct 02 12:41:04 crc kubenswrapper[4766]: E1002 12:41:04.648766 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60696ba0-e879-4b95-b020-9c3a81a922af" containerName="mariadb-account-create" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.648790 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="60696ba0-e879-4b95-b020-9c3a81a922af" containerName="mariadb-account-create" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.651158 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="60696ba0-e879-4b95-b020-9c3a81a922af" containerName="mariadb-account-create" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.652210 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-dcvnv" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.655872 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.658629 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-9v86h" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.664085 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-dcvnv"] Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.751696 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-combined-ca-bundle\") pod \"manila-db-sync-dcvnv\" (UID: \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\") " pod="openstack/manila-db-sync-dcvnv" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.751812 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24km9\" (UniqueName: \"kubernetes.io/projected/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-kube-api-access-24km9\") pod \"manila-db-sync-dcvnv\" (UID: \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\") " pod="openstack/manila-db-sync-dcvnv" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.751833 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-job-config-data\") pod \"manila-db-sync-dcvnv\" (UID: \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\") " pod="openstack/manila-db-sync-dcvnv" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.751862 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-config-data\") pod \"manila-db-sync-dcvnv\" (UID: \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\") " pod="openstack/manila-db-sync-dcvnv" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.854434 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24km9\" (UniqueName: \"kubernetes.io/projected/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-kube-api-access-24km9\") pod \"manila-db-sync-dcvnv\" (UID: \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\") " pod="openstack/manila-db-sync-dcvnv" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.854495 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-job-config-data\") pod \"manila-db-sync-dcvnv\" (UID: \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\") " pod="openstack/manila-db-sync-dcvnv" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.854541 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-config-data\") pod \"manila-db-sync-dcvnv\" (UID: \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\") " pod="openstack/manila-db-sync-dcvnv" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.854692 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-combined-ca-bundle\") pod \"manila-db-sync-dcvnv\" (UID: 
\"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\") " pod="openstack/manila-db-sync-dcvnv" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.862302 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-combined-ca-bundle\") pod \"manila-db-sync-dcvnv\" (UID: \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\") " pod="openstack/manila-db-sync-dcvnv" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.865089 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-job-config-data\") pod \"manila-db-sync-dcvnv\" (UID: \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\") " pod="openstack/manila-db-sync-dcvnv" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.866327 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-config-data\") pod \"manila-db-sync-dcvnv\" (UID: \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\") " pod="openstack/manila-db-sync-dcvnv" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.874455 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24km9\" (UniqueName: \"kubernetes.io/projected/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-kube-api-access-24km9\") pod \"manila-db-sync-dcvnv\" (UID: \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\") " pod="openstack/manila-db-sync-dcvnv" Oct 02 12:41:04 crc kubenswrapper[4766]: I1002 12:41:04.975435 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-dcvnv" Oct 02 12:41:06 crc kubenswrapper[4766]: I1002 12:41:06.035233 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-dcvnv"] Oct 02 12:41:07 crc kubenswrapper[4766]: I1002 12:41:07.001667 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-dcvnv" event={"ID":"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4","Type":"ContainerStarted","Data":"d655e8bceb7a246e6c10f6e74ec7bdcd5eaf707b8cbfe7497fcf5b0e91f224b1"} Oct 02 12:41:08 crc kubenswrapper[4766]: I1002 12:41:08.885995 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:41:08 crc kubenswrapper[4766]: E1002 12:41:08.887076 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:41:12 crc kubenswrapper[4766]: I1002 12:41:12.058641 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-dcvnv" event={"ID":"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4","Type":"ContainerStarted","Data":"8490be9a62e7164e63066e7caf031d83817a6ebbc07f351ea4c97617a0c4ef0f"} Oct 02 12:41:12 crc kubenswrapper[4766]: I1002 12:41:12.098386 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-dcvnv" podStartSLOduration=2.786321606 podStartE2EDuration="8.098355321s" podCreationTimestamp="2025-10-02 12:41:04 +0000 UTC" firstStartedPulling="2025-10-02 12:41:06.052208681 +0000 UTC m=+6580.995079625" 
lastFinishedPulling="2025-10-02 12:41:11.364242396 +0000 UTC m=+6586.307113340" observedRunningTime="2025-10-02 12:41:12.084124815 +0000 UTC m=+6587.026995799" watchObservedRunningTime="2025-10-02 12:41:12.098355321 +0000 UTC m=+6587.041226265" Oct 02 12:41:14 crc kubenswrapper[4766]: I1002 12:41:14.120036 4766 generic.go:334] "Generic (PLEG): container finished" podID="24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4" containerID="8490be9a62e7164e63066e7caf031d83817a6ebbc07f351ea4c97617a0c4ef0f" exitCode=0 Oct 02 12:41:14 crc kubenswrapper[4766]: I1002 12:41:14.120146 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-dcvnv" event={"ID":"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4","Type":"ContainerDied","Data":"8490be9a62e7164e63066e7caf031d83817a6ebbc07f351ea4c97617a0c4ef0f"} Oct 02 12:41:15 crc kubenswrapper[4766]: I1002 12:41:15.745091 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-dcvnv" Oct 02 12:41:15 crc kubenswrapper[4766]: I1002 12:41:15.878088 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24km9\" (UniqueName: \"kubernetes.io/projected/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-kube-api-access-24km9\") pod \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\" (UID: \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\") " Oct 02 12:41:15 crc kubenswrapper[4766]: I1002 12:41:15.878172 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-job-config-data\") pod \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\" (UID: \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\") " Oct 02 12:41:15 crc kubenswrapper[4766]: I1002 12:41:15.878354 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-config-data\") pod \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\" (UID: \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\") " Oct 02 12:41:15 crc kubenswrapper[4766]: I1002 12:41:15.878709 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-combined-ca-bundle\") pod \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\" (UID: \"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4\") " Oct 02 12:41:15 crc kubenswrapper[4766]: I1002 12:41:15.889456 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4" (UID: "24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:41:15 crc kubenswrapper[4766]: I1002 12:41:15.889617 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-kube-api-access-24km9" (OuterVolumeSpecName: "kube-api-access-24km9") pod "24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4" (UID: "24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4"). InnerVolumeSpecName "kube-api-access-24km9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:41:15 crc kubenswrapper[4766]: I1002 12:41:15.892673 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-config-data" (OuterVolumeSpecName: "config-data") pod "24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4" (UID: "24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:41:15 crc kubenswrapper[4766]: I1002 12:41:15.923707 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4" (UID: "24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:41:15 crc kubenswrapper[4766]: I1002 12:41:15.982194 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:15 crc kubenswrapper[4766]: I1002 12:41:15.982240 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24km9\" (UniqueName: \"kubernetes.io/projected/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-kube-api-access-24km9\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:15 crc kubenswrapper[4766]: I1002 12:41:15.982257 4766 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-job-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:15 crc kubenswrapper[4766]: I1002 12:41:15.982270 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.154153 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-dcvnv" event={"ID":"24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4","Type":"ContainerDied","Data":"d655e8bceb7a246e6c10f6e74ec7bdcd5eaf707b8cbfe7497fcf5b0e91f224b1"} Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.154242 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-dcvnv" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.154253 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d655e8bceb7a246e6c10f6e74ec7bdcd5eaf707b8cbfe7497fcf5b0e91f224b1" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.644845 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 12:41:16 crc kubenswrapper[4766]: E1002 12:41:16.645645 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4" containerName="manila-db-sync" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.645676 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4" containerName="manila-db-sync" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.645969 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4" containerName="manila-db-sync" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.647668 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.650628 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-9v86h" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.650727 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.650638 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.651811 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.658727 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.661868 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.664898 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.680658 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.697111 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.725296 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59b859dfd5-srz5p"] Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.727878 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.788150 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59b859dfd5-srz5p"] Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.802582 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/48612c1a-5be8-48a6-bed8-26f26d78ef8e-ceph\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.802638 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48612c1a-5be8-48a6-bed8-26f26d78ef8e-scripts\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.802671 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.802758 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.802800 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48612c1a-5be8-48a6-bed8-26f26d78ef8e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.802830 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f78ck\" (UniqueName: \"kubernetes.io/projected/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-kube-api-access-f78ck\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.802952 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48612c1a-5be8-48a6-bed8-26f26d78ef8e-config-data\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.803004 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/48612c1a-5be8-48a6-bed8-26f26d78ef8e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.803112 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-scripts\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.803187 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-config-data\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.803249 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48612c1a-5be8-48a6-bed8-26f26d78ef8e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.803285 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48612c1a-5be8-48a6-bed8-26f26d78ef8e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.803316 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.803362 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbbbv\" (UniqueName: \"kubernetes.io/projected/48612c1a-5be8-48a6-bed8-26f26d78ef8e-kube-api-access-bbbbv\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.908007 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-scripts\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.908081 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-config-data\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.908119 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48612c1a-5be8-48a6-bed8-26f26d78ef8e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.908140 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48612c1a-5be8-48a6-bed8-26f26d78ef8e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: 
\"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.908157 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.908184 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbbbv\" (UniqueName: \"kubernetes.io/projected/48612c1a-5be8-48a6-bed8-26f26d78ef8e-kube-api-access-bbbbv\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.908259 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48612c1a-5be8-48a6-bed8-26f26d78ef8e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.908661 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.908715 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-config\") pod \"dnsmasq-dns-59b859dfd5-srz5p\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.908874 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r52q\" (UniqueName: \"kubernetes.io/projected/50942d8a-167e-49a6-beeb-6f18aca8fa94-kube-api-access-5r52q\") pod \"dnsmasq-dns-59b859dfd5-srz5p\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.909003 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/48612c1a-5be8-48a6-bed8-26f26d78ef8e-ceph\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.909154 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48612c1a-5be8-48a6-bed8-26f26d78ef8e-scripts\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.909286 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-ovsdbserver-nb\") pod \"dnsmasq-dns-59b859dfd5-srz5p\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 
12:41:16.909413 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.909594 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.909725 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48612c1a-5be8-48a6-bed8-26f26d78ef8e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.909844 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f78ck\" (UniqueName: \"kubernetes.io/projected/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-kube-api-access-f78ck\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.909947 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-dns-svc\") pod \"dnsmasq-dns-59b859dfd5-srz5p\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.910102 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48612c1a-5be8-48a6-bed8-26f26d78ef8e-config-data\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.910208 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/48612c1a-5be8-48a6-bed8-26f26d78ef8e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.910426 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-ovsdbserver-sb\") pod \"dnsmasq-dns-59b859dfd5-srz5p\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.911848 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/48612c1a-5be8-48a6-bed8-26f26d78ef8e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.920862 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.921381 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48612c1a-5be8-48a6-bed8-26f26d78ef8e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.927994 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48612c1a-5be8-48a6-bed8-26f26d78ef8e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.930972 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-scripts\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.931145 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-config-data\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.933878 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48612c1a-5be8-48a6-bed8-26f26d78ef8e-scripts\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.934582 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.935292 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48612c1a-5be8-48a6-bed8-26f26d78ef8e-config-data\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.937640 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/48612c1a-5be8-48a6-bed8-26f26d78ef8e-ceph\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.938050 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbbbv\" (UniqueName: \"kubernetes.io/projected/48612c1a-5be8-48a6-bed8-26f26d78ef8e-kube-api-access-bbbbv\") pod \"manila-share-share1-0\" (UID: \"48612c1a-5be8-48a6-bed8-26f26d78ef8e\") " pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.938296 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-f78ck\" (UniqueName: \"kubernetes.io/projected/fb671f48-8819-4b2f-b52e-5bb8d5161e4c-kube-api-access-f78ck\") pod \"manila-scheduler-0\" (UID: \"fb671f48-8819-4b2f-b52e-5bb8d5161e4c\") " pod="openstack/manila-scheduler-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.975439 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Oct 02 12:41:16 crc kubenswrapper[4766]: I1002 12:41:16.991393 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.005888 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.024249 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-config\") pod \"dnsmasq-dns-59b859dfd5-srz5p\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.024640 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r52q\" (UniqueName: \"kubernetes.io/projected/50942d8a-167e-49a6-beeb-6f18aca8fa94-kube-api-access-5r52q\") pod \"dnsmasq-dns-59b859dfd5-srz5p\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.024875 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-ovsdbserver-nb\") pod \"dnsmasq-dns-59b859dfd5-srz5p\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.025143 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-dns-svc\") pod \"dnsmasq-dns-59b859dfd5-srz5p\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.025488 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-ovsdbserver-sb\") pod \"dnsmasq-dns-59b859dfd5-srz5p\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.029303 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-dns-svc\") pod \"dnsmasq-dns-59b859dfd5-srz5p\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.031713 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-ovsdbserver-sb\") pod \"dnsmasq-dns-59b859dfd5-srz5p\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.033331 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-config\") pod \"dnsmasq-dns-59b859dfd5-srz5p\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.037524 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-ovsdbserver-nb\") pod \"dnsmasq-dns-59b859dfd5-srz5p\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.070767 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r52q\" (UniqueName: \"kubernetes.io/projected/50942d8a-167e-49a6-beeb-6f18aca8fa94-kube-api-access-5r52q\") pod \"dnsmasq-dns-59b859dfd5-srz5p\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.105768 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.105934 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.113320 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.236391 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0573958-ab76-414b-9eb8-b0ae73580f5a-logs\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.236966 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0573958-ab76-414b-9eb8-b0ae73580f5a-scripts\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.237686 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvwzg\" (UniqueName: \"kubernetes.io/projected/a0573958-ab76-414b-9eb8-b0ae73580f5a-kube-api-access-tvwzg\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.237777 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0573958-ab76-414b-9eb8-b0ae73580f5a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.237865 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0573958-ab76-414b-9eb8-b0ae73580f5a-etc-machine-id\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.237929 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a0573958-ab76-414b-9eb8-b0ae73580f5a-config-data\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.238094 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0573958-ab76-414b-9eb8-b0ae73580f5a-config-data-custom\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.355351 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.355578 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0573958-ab76-414b-9eb8-b0ae73580f5a-scripts\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.355689 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvwzg\" (UniqueName: \"kubernetes.io/projected/a0573958-ab76-414b-9eb8-b0ae73580f5a-kube-api-access-tvwzg\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.355757 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0573958-ab76-414b-9eb8-b0ae73580f5a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.355812 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0573958-ab76-414b-9eb8-b0ae73580f5a-etc-machine-id\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.355856 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0573958-ab76-414b-9eb8-b0ae73580f5a-config-data\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.355926 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0573958-ab76-414b-9eb8-b0ae73580f5a-config-data-custom\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.355963 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0573958-ab76-414b-9eb8-b0ae73580f5a-logs\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.356080 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a0573958-ab76-414b-9eb8-b0ae73580f5a-etc-machine-id\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " 
pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.356456 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0573958-ab76-414b-9eb8-b0ae73580f5a-logs\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.363567 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0573958-ab76-414b-9eb8-b0ae73580f5a-scripts\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.363899 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a0573958-ab76-414b-9eb8-b0ae73580f5a-config-data-custom\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.364089 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0573958-ab76-414b-9eb8-b0ae73580f5a-config-data\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.365960 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0573958-ab76-414b-9eb8-b0ae73580f5a-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.378830 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvwzg\" (UniqueName: \"kubernetes.io/projected/a0573958-ab76-414b-9eb8-b0ae73580f5a-kube-api-access-tvwzg\") pod \"manila-api-0\" (UID: \"a0573958-ab76-414b-9eb8-b0ae73580f5a\") " pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.547298 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.629146 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Oct 02 12:41:17 crc kubenswrapper[4766]: W1002 12:41:17.639951 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb671f48_8819_4b2f_b52e_5bb8d5161e4c.slice/crio-69d8efd7d4c1b155161dbc7cafef50e9fe18e533e54e7df6625902b3643ecb18 WatchSource:0}: Error finding container 69d8efd7d4c1b155161dbc7cafef50e9fe18e533e54e7df6625902b3643ecb18: Status 404 returned error can't find the container with id 69d8efd7d4c1b155161dbc7cafef50e9fe18e533e54e7df6625902b3643ecb18 Oct 02 12:41:17 crc kubenswrapper[4766]: I1002 12:41:17.759119 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Oct 02 12:41:18 crc kubenswrapper[4766]: I1002 12:41:18.044251 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59b859dfd5-srz5p"] Oct 02 12:41:18 crc kubenswrapper[4766]: W1002 12:41:18.053667 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50942d8a_167e_49a6_beeb_6f18aca8fa94.slice/crio-ed3e81fd542e4ce0c31366bc62277fcc78d4f89ed8ee5842b25c982b875e2e04 WatchSource:0}: Error finding container ed3e81fd542e4ce0c31366bc62277fcc78d4f89ed8ee5842b25c982b875e2e04: Status 404 returned error can't find the container with id ed3e81fd542e4ce0c31366bc62277fcc78d4f89ed8ee5842b25c982b875e2e04 Oct 02 12:41:18 crc kubenswrapper[4766]: I1002 12:41:18.221586 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"fb671f48-8819-4b2f-b52e-5bb8d5161e4c","Type":"ContainerStarted","Data":"69d8efd7d4c1b155161dbc7cafef50e9fe18e533e54e7df6625902b3643ecb18"} Oct 02 12:41:18 crc kubenswrapper[4766]: I1002 12:41:18.227813 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"48612c1a-5be8-48a6-bed8-26f26d78ef8e","Type":"ContainerStarted","Data":"4c408ffe69f305f8fd250d5d698ba76284566d6032ec100167644f7f31624a20"} Oct 02 12:41:18 crc kubenswrapper[4766]: I1002 12:41:18.235060 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" event={"ID":"50942d8a-167e-49a6-beeb-6f18aca8fa94","Type":"ContainerStarted","Data":"ed3e81fd542e4ce0c31366bc62277fcc78d4f89ed8ee5842b25c982b875e2e04"} Oct 02 12:41:18 crc kubenswrapper[4766]: I1002 12:41:18.297174 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Oct 02 12:41:19 crc kubenswrapper[4766]: I1002 12:41:19.287589 4766 generic.go:334] "Generic (PLEG): container finished" podID="50942d8a-167e-49a6-beeb-6f18aca8fa94" containerID="df66ecc13aa9d11c6487a44cd0165794ea23318ec1d4afadca44745a5bd3cd52" exitCode=0 Oct 02 12:41:19 crc kubenswrapper[4766]: I1002 12:41:19.288362 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" event={"ID":"50942d8a-167e-49a6-beeb-6f18aca8fa94","Type":"ContainerDied","Data":"df66ecc13aa9d11c6487a44cd0165794ea23318ec1d4afadca44745a5bd3cd52"} Oct 02 12:41:19 crc kubenswrapper[4766]: I1002 12:41:19.299640 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"fb671f48-8819-4b2f-b52e-5bb8d5161e4c","Type":"ContainerStarted","Data":"b69e55cb6357ee090657632b4f968ee25aa7246584adfa5e9f47f10ffd69cd94"} Oct 02 12:41:19 crc kubenswrapper[4766]: I1002 12:41:19.304856 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a0573958-ab76-414b-9eb8-b0ae73580f5a","Type":"ContainerStarted","Data":"8c9fefcc6b88d21e3c7402eb5b36bf5b35b4f138552c829b61a7d82f16a4eded"} Oct 02 12:41:19 crc kubenswrapper[4766]: I1002 12:41:19.304901 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a0573958-ab76-414b-9eb8-b0ae73580f5a","Type":"ContainerStarted","Data":"259c855910252c8be2f85cf7002768bd8819003552eeba6067b57671252d220e"} Oct 02 12:41:20 crc kubenswrapper[4766]: I1002 12:41:20.319321 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"a0573958-ab76-414b-9eb8-b0ae73580f5a","Type":"ContainerStarted","Data":"c263c1bf86584709b89093453c66489a3324b4a53e4048536ffb844ad154195b"} Oct 02 12:41:20 crc kubenswrapper[4766]: I1002 12:41:20.320120 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Oct 02 12:41:20 crc kubenswrapper[4766]: I1002 12:41:20.332363 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" event={"ID":"50942d8a-167e-49a6-beeb-6f18aca8fa94","Type":"ContainerStarted","Data":"82e2a0c435d41f32ee1e02c6795bde69fffa18ff44c754f4cc9ae3809bf5c084"} Oct 02 12:41:20 crc kubenswrapper[4766]: I1002 12:41:20.332583 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:20 crc kubenswrapper[4766]: I1002 12:41:20.339691 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"fb671f48-8819-4b2f-b52e-5bb8d5161e4c","Type":"ContainerStarted","Data":"86b21086b67d09089a7b20d34e3125948d0153c28207949eaac3216e171609b5"} Oct 02 12:41:20 crc kubenswrapper[4766]: I1002 12:41:20.365352 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.365322758 podStartE2EDuration="4.365322758s" podCreationTimestamp="2025-10-02 12:41:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:41:20.349987276 +0000 UTC m=+6595.292858220" watchObservedRunningTime="2025-10-02 12:41:20.365322758 +0000 UTC m=+6595.308193702" Oct 02 12:41:20 crc kubenswrapper[4766]: I1002 12:41:20.439199 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.5756844340000002 podStartE2EDuration="4.439171152s" podCreationTimestamp="2025-10-02 12:41:16 +0000 UTC" firstStartedPulling="2025-10-02 12:41:17.647228642 +0000 UTC m=+6592.590099576" lastFinishedPulling="2025-10-02 12:41:18.51071535 +0000 UTC m=+6593.453586294" observedRunningTime="2025-10-02 12:41:20.390887437 +0000 UTC m=+6595.333758381" watchObservedRunningTime="2025-10-02 12:41:20.439171152 +0000 UTC m=+6595.382042106" Oct 02 12:41:20 crc kubenswrapper[4766]: I1002 12:41:20.450888 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" podStartSLOduration=4.450864757 podStartE2EDuration="4.450864757s" podCreationTimestamp="2025-10-02 12:41:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:41:20.428335135 +0000 UTC m=+6595.371206079" watchObservedRunningTime="2025-10-02 12:41:20.450864757 +0000 UTC m=+6595.393735701" Oct 02 12:41:22 crc kubenswrapper[4766]: I1002 12:41:22.882780 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:41:22 crc kubenswrapper[4766]: E1002 12:41:22.884309 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:41:26 crc kubenswrapper[4766]: I1002 12:41:26.427317 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"48612c1a-5be8-48a6-bed8-26f26d78ef8e","Type":"ContainerStarted","Data":"3e2d16634f809068a3379e8df34b8913e9276cda4ce112372592785aeab9324d"} Oct 02 12:41:26 crc kubenswrapper[4766]: I1002 12:41:26.993007 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Oct 02 12:41:27 crc kubenswrapper[4766]: I1002 12:41:27.357708 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:41:27 crc kubenswrapper[4766]: I1002 12:41:27.433405 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c7d7b9bc7-27r4j"] Oct 02 12:41:27 crc kubenswrapper[4766]: I1002 12:41:27.433752 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" podUID="1ef60cf9-8ee9-452b-b8c5-87e84783901d" containerName="dnsmasq-dns" containerID="cri-o://cca9cf9e180da68992d03a262ee7adbd5fe6bdfda996d2fa195b8546edb0f76d" gracePeriod=10 Oct 02 12:41:27 crc kubenswrapper[4766]: I1002 12:41:27.475129 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"48612c1a-5be8-48a6-bed8-26f26d78ef8e","Type":"ContainerStarted","Data":"cb953bef105b1e87535f79c86f72a2bf2c64295ba5995c39af2239b460f81a1d"} Oct 02 12:41:27 crc kubenswrapper[4766]: I1002 12:41:27.516371 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.458221902 podStartE2EDuration="11.516344858s" podCreationTimestamp="2025-10-02 12:41:16 +0000 UTC" firstStartedPulling="2025-10-02 12:41:17.761242384 +0000 UTC m=+6592.704113328" lastFinishedPulling="2025-10-02 12:41:25.81936532 +0000 UTC m=+6600.762236284" observedRunningTime="2025-10-02 12:41:27.511059349 +0000 UTC m=+6602.453930293" watchObservedRunningTime="2025-10-02 12:41:27.516344858 +0000 UTC m=+6602.459215802" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.124338 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.311014 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-dns-svc\") pod \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.311297 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt4wx\" (UniqueName: \"kubernetes.io/projected/1ef60cf9-8ee9-452b-b8c5-87e84783901d-kube-api-access-tt4wx\") pod \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.311330 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-ovsdbserver-nb\") pod \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.311411 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-config\") pod \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.311479 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-ovsdbserver-sb\") pod \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\" (UID: \"1ef60cf9-8ee9-452b-b8c5-87e84783901d\") " Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.321693 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef60cf9-8ee9-452b-b8c5-87e84783901d-kube-api-access-tt4wx" (OuterVolumeSpecName: "kube-api-access-tt4wx") pod "1ef60cf9-8ee9-452b-b8c5-87e84783901d" (UID: "1ef60cf9-8ee9-452b-b8c5-87e84783901d"). InnerVolumeSpecName "kube-api-access-tt4wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.373386 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1ef60cf9-8ee9-452b-b8c5-87e84783901d" (UID: "1ef60cf9-8ee9-452b-b8c5-87e84783901d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.389230 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1ef60cf9-8ee9-452b-b8c5-87e84783901d" (UID: "1ef60cf9-8ee9-452b-b8c5-87e84783901d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.389525 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1ef60cf9-8ee9-452b-b8c5-87e84783901d" (UID: "1ef60cf9-8ee9-452b-b8c5-87e84783901d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.411271 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-config" (OuterVolumeSpecName: "config") pod "1ef60cf9-8ee9-452b-b8c5-87e84783901d" (UID: "1ef60cf9-8ee9-452b-b8c5-87e84783901d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.415782 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.415827 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt4wx\" (UniqueName: \"kubernetes.io/projected/1ef60cf9-8ee9-452b-b8c5-87e84783901d-kube-api-access-tt4wx\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.415842 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.415852 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.415861 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1ef60cf9-8ee9-452b-b8c5-87e84783901d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.486595 4766 generic.go:334] "Generic (PLEG): container finished" podID="1ef60cf9-8ee9-452b-b8c5-87e84783901d" containerID="cca9cf9e180da68992d03a262ee7adbd5fe6bdfda996d2fa195b8546edb0f76d" exitCode=0 Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.486675 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" event={"ID":"1ef60cf9-8ee9-452b-b8c5-87e84783901d","Type":"ContainerDied","Data":"cca9cf9e180da68992d03a262ee7adbd5fe6bdfda996d2fa195b8546edb0f76d"} Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.486731 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" event={"ID":"1ef60cf9-8ee9-452b-b8c5-87e84783901d","Type":"ContainerDied","Data":"1c8bfb820ab13a6aab658b3be51117c85b9c91bf250be37b3380325dee17cbde"} Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.486725 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c7d7b9bc7-27r4j" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.486755 4766 scope.go:117] "RemoveContainer" containerID="cca9cf9e180da68992d03a262ee7adbd5fe6bdfda996d2fa195b8546edb0f76d" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.536959 4766 scope.go:117] "RemoveContainer" containerID="06520c87df7275dbd01fedb073480d0eb6a019a46f0914474f56e655dc00f70a" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.538042 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c7d7b9bc7-27r4j"] Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.552524 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c7d7b9bc7-27r4j"] Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.562200 4766 scope.go:117] "RemoveContainer" containerID="cca9cf9e180da68992d03a262ee7adbd5fe6bdfda996d2fa195b8546edb0f76d" Oct 02 12:41:28 crc kubenswrapper[4766]: E1002 12:41:28.562704 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cca9cf9e180da68992d03a262ee7adbd5fe6bdfda996d2fa195b8546edb0f76d\": container with ID starting with cca9cf9e180da68992d03a262ee7adbd5fe6bdfda996d2fa195b8546edb0f76d not found: ID does not exist" containerID="cca9cf9e180da68992d03a262ee7adbd5fe6bdfda996d2fa195b8546edb0f76d" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.562740 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cca9cf9e180da68992d03a262ee7adbd5fe6bdfda996d2fa195b8546edb0f76d"} err="failed to get container status \"cca9cf9e180da68992d03a262ee7adbd5fe6bdfda996d2fa195b8546edb0f76d\": rpc error: code = NotFound desc = could not find container \"cca9cf9e180da68992d03a262ee7adbd5fe6bdfda996d2fa195b8546edb0f76d\": container with ID starting with cca9cf9e180da68992d03a262ee7adbd5fe6bdfda996d2fa195b8546edb0f76d not found: ID does not exist" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.562765 4766 scope.go:117] "RemoveContainer" containerID="06520c87df7275dbd01fedb073480d0eb6a019a46f0914474f56e655dc00f70a" Oct 02 12:41:28 crc kubenswrapper[4766]: E1002 12:41:28.563041 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06520c87df7275dbd01fedb073480d0eb6a019a46f0914474f56e655dc00f70a\": container with ID starting with 06520c87df7275dbd01fedb073480d0eb6a019a46f0914474f56e655dc00f70a not found: ID does not exist" containerID="06520c87df7275dbd01fedb073480d0eb6a019a46f0914474f56e655dc00f70a" Oct 02 12:41:28 crc kubenswrapper[4766]: I1002 12:41:28.563063 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06520c87df7275dbd01fedb073480d0eb6a019a46f0914474f56e655dc00f70a"} err="failed to get container status \"06520c87df7275dbd01fedb073480d0eb6a019a46f0914474f56e655dc00f70a\": rpc error: code = NotFound desc = could not find container \"06520c87df7275dbd01fedb073480d0eb6a019a46f0914474f56e655dc00f70a\": container with ID starting with 06520c87df7275dbd01fedb073480d0eb6a019a46f0914474f56e655dc00f70a not found: ID does not exist" Oct 02 12:41:29 crc kubenswrapper[4766]: I1002 12:41:29.898271 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef60cf9-8ee9-452b-b8c5-87e84783901d" path="/var/lib/kubelet/pods/1ef60cf9-8ee9-452b-b8c5-87e84783901d/volumes" Oct 02 12:41:30 crc kubenswrapper[4766]: I1002 12:41:30.570071 4766 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:41:30 crc kubenswrapper[4766]: I1002 12:41:30.570862 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerName="ceilometer-central-agent" containerID="cri-o://cc97f7e56aa01abb4575ec15c6e94b99bf609c839ecfc25cfe8e748f1b489532" gracePeriod=30 Oct 02 12:41:30 crc kubenswrapper[4766]: I1002 12:41:30.570982 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerName="proxy-httpd" containerID="cri-o://cb9ac8f9a48cfe3c637d3564c511fc0eab3c76c917a6e4dad2034823a12e4f63" gracePeriod=30 Oct 02 12:41:30 crc kubenswrapper[4766]: I1002 12:41:30.571007 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerName="sg-core" containerID="cri-o://73a04b4d6ca244c5535fa45f4c43cbbf4e3f69539073998964bbc4e13c6e00ce" gracePeriod=30 Oct 02 12:41:30 crc kubenswrapper[4766]: I1002 12:41:30.571049 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerName="ceilometer-notification-agent" containerID="cri-o://c64039966b6fd62b0951f7b67f2d582de446a70d20abb24895fe0eeb5a1f0380" gracePeriod=30 Oct 02 12:41:31 crc kubenswrapper[4766]: I1002 12:41:31.526393 4766 generic.go:334] "Generic (PLEG): container finished" podID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerID="cb9ac8f9a48cfe3c637d3564c511fc0eab3c76c917a6e4dad2034823a12e4f63" exitCode=0 Oct 02 12:41:31 crc kubenswrapper[4766]: I1002 12:41:31.526435 4766 generic.go:334] "Generic (PLEG): container finished" podID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerID="73a04b4d6ca244c5535fa45f4c43cbbf4e3f69539073998964bbc4e13c6e00ce" exitCode=2 Oct 02 12:41:31 crc kubenswrapper[4766]: I1002 12:41:31.526447 4766 generic.go:334] "Generic (PLEG): container finished" podID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerID="cc97f7e56aa01abb4575ec15c6e94b99bf609c839ecfc25cfe8e748f1b489532" exitCode=0 Oct 02 12:41:31 crc kubenswrapper[4766]: I1002 12:41:31.526478 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0","Type":"ContainerDied","Data":"cb9ac8f9a48cfe3c637d3564c511fc0eab3c76c917a6e4dad2034823a12e4f63"} Oct 02 12:41:31 crc kubenswrapper[4766]: I1002 12:41:31.526530 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0","Type":"ContainerDied","Data":"73a04b4d6ca244c5535fa45f4c43cbbf4e3f69539073998964bbc4e13c6e00ce"} Oct 02 12:41:31 crc kubenswrapper[4766]: I1002 12:41:31.526550 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0","Type":"ContainerDied","Data":"cc97f7e56aa01abb4575ec15c6e94b99bf609c839ecfc25cfe8e748f1b489532"} Oct 02 12:41:33 crc kubenswrapper[4766]: I1002 12:41:33.594460 4766 generic.go:334] "Generic (PLEG): container finished" podID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerID="c64039966b6fd62b0951f7b67f2d582de446a70d20abb24895fe0eeb5a1f0380" exitCode=0 Oct 02 12:41:33 crc kubenswrapper[4766]: I1002 12:41:33.595538 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0","Type":"ContainerDied","Data":"c64039966b6fd62b0951f7b67f2d582de446a70d20abb24895fe0eeb5a1f0380"} Oct 02 12:41:33 crc kubenswrapper[4766]: I1002 12:41:33.882062 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:41:33 crc kubenswrapper[4766]: E1002 12:41:33.882818 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:41:33 crc kubenswrapper[4766]: I1002 12:41:33.965905 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.086432 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-run-httpd\") pod \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.086925 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" (UID: "b88eb5b5-0ef6-485d-b95e-e37f96ee16f0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.090698 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-scripts\") pod \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.090966 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-combined-ca-bundle\") pod \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.091067 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-sg-core-conf-yaml\") pod \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.091290 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-log-httpd\") pod \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.091344 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28m46\" (UniqueName: \"kubernetes.io/projected/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-kube-api-access-28m46\") pod \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " Oct 02 12:41:34 crc 
kubenswrapper[4766]: I1002 12:41:34.091369 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-config-data\") pod \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\" (UID: \"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0\") " Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.092660 4766 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.096826 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" (UID: "b88eb5b5-0ef6-485d-b95e-e37f96ee16f0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.102294 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-kube-api-access-28m46" (OuterVolumeSpecName: "kube-api-access-28m46") pod "b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" (UID: "b88eb5b5-0ef6-485d-b95e-e37f96ee16f0"). InnerVolumeSpecName "kube-api-access-28m46". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.108722 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-scripts" (OuterVolumeSpecName: "scripts") pod "b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" (UID: "b88eb5b5-0ef6-485d-b95e-e37f96ee16f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.131744 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" (UID: "b88eb5b5-0ef6-485d-b95e-e37f96ee16f0"). InnerVolumeSpecName "sg-core-conf-yaml". 
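The `operationExecutor.UnmountVolume started` / `UnmountVolume.TearDown succeeded` / `Volume detached` triplets above (and the mirror-image VerifyControllerAttachedVolume / MountVolume.SetUp lines for the replacement pod further down) are one reconcile pass: volumes still present in the actual state but absent from the desired state after the pod DELETE are torn down, and only a successful TearDown is reported as detached. A compact sketch of that loop, with illustrative types rather than kubelet's volumemanager interfaces:

```go
// Desired-vs-actual reconcile pass for pod volumes; names and types here are
// illustrative, not the kubelet's volumemanager API.
package main

import "fmt"

type volume struct{ name string }

func (v volume) TearDown() error { return nil } // empty-dir teardown: trivial

func reconcile(desired map[string]bool, actual map[string]volume) {
	for name, vol := range actual {
		if desired[name] {
			continue // pod still needs this volume; leave it mounted
		}
		fmt.Printf("UnmountVolume started for volume %q\n", name)
		if err := vol.TearDown(); err != nil {
			fmt.Printf("TearDown failed for %q: %v (retry next pass)\n", name, err)
			continue
		}
		delete(actual, name) // only now reported as "Volume detached"
		fmt.Printf("Volume detached for volume %q\n", name)
	}
}

func main() {
	// After the ceilometer-0 DELETE, the desired state for its volumes is empty.
	actual := map[string]volume{
		"run-httpd": {"run-httpd"},
		"log-httpd": {"log-httpd"},
	}
	reconcile(map[string]bool{}, actual)
}
```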
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.196404 4766 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.196452 4766 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.196467 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28m46\" (UniqueName: \"kubernetes.io/projected/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-kube-api-access-28m46\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.196482 4766 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-scripts\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.218250 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" (UID: "b88eb5b5-0ef6-485d-b95e-e37f96ee16f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.223868 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-config-data" (OuterVolumeSpecName: "config-data") pod "b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" (UID: "b88eb5b5-0ef6-485d-b95e-e37f96ee16f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.299847 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.299900 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.609818 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88eb5b5-0ef6-485d-b95e-e37f96ee16f0","Type":"ContainerDied","Data":"f590679caf92013e6909f532ca4a9de56609c39941666682a6d6c74f16ab292d"} Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.609877 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.611146 4766 scope.go:117] "RemoveContainer" containerID="cb9ac8f9a48cfe3c637d3564c511fc0eab3c76c917a6e4dad2034823a12e4f63" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.655035 4766 scope.go:117] "RemoveContainer" containerID="73a04b4d6ca244c5535fa45f4c43cbbf4e3f69539073998964bbc4e13c6e00ce" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.666070 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.694921 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.703729 4766 scope.go:117] "RemoveContainer" containerID="c64039966b6fd62b0951f7b67f2d582de446a70d20abb24895fe0eeb5a1f0380" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.709475 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:41:34 crc kubenswrapper[4766]: E1002 12:41:34.710226 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerName="proxy-httpd" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.710248 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerName="proxy-httpd" Oct 02 12:41:34 crc kubenswrapper[4766]: E1002 12:41:34.710279 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef60cf9-8ee9-452b-b8c5-87e84783901d" containerName="init" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.710288 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef60cf9-8ee9-452b-b8c5-87e84783901d" containerName="init" Oct 02 12:41:34 crc kubenswrapper[4766]: E1002 12:41:34.710306 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerName="ceilometer-central-agent" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.710316 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerName="ceilometer-central-agent" Oct 02 12:41:34 crc kubenswrapper[4766]: E1002 12:41:34.710340 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerName="ceilometer-notification-agent" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.710350 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerName="ceilometer-notification-agent" Oct 02 12:41:34 crc kubenswrapper[4766]: E1002 12:41:34.710374 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef60cf9-8ee9-452b-b8c5-87e84783901d" containerName="dnsmasq-dns" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.710386 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef60cf9-8ee9-452b-b8c5-87e84783901d" containerName="dnsmasq-dns" Oct 02 12:41:34 crc kubenswrapper[4766]: E1002 12:41:34.710412 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerName="sg-core" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.710422 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerName="sg-core" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.710727 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerName="ceilometer-central-agent" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.710747 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef60cf9-8ee9-452b-b8c5-87e84783901d" containerName="dnsmasq-dns" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.711359 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerName="ceilometer-notification-agent" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.711375 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerName="sg-core" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.711393 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" containerName="proxy-httpd" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.723337 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.727385 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.728293 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.728438 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.791916 4766 scope.go:117] "RemoveContainer" containerID="cc97f7e56aa01abb4575ec15c6e94b99bf609c839ecfc25cfe8e748f1b489532" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.816907 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e07a5a4-b364-4459-84cf-badcf5cccab9-scripts\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.817009 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bt2w\" (UniqueName: \"kubernetes.io/projected/3e07a5a4-b364-4459-84cf-badcf5cccab9-kube-api-access-6bt2w\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.817050 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e07a5a4-b364-4459-84cf-badcf5cccab9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.817454 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e07a5a4-b364-4459-84cf-badcf5cccab9-config-data\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.817564 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e07a5a4-b364-4459-84cf-badcf5cccab9-log-httpd\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " 
pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.817660 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e07a5a4-b364-4459-84cf-badcf5cccab9-run-httpd\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.817993 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e07a5a4-b364-4459-84cf-badcf5cccab9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.921407 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e07a5a4-b364-4459-84cf-badcf5cccab9-scripts\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.921569 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bt2w\" (UniqueName: \"kubernetes.io/projected/3e07a5a4-b364-4459-84cf-badcf5cccab9-kube-api-access-6bt2w\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.921638 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e07a5a4-b364-4459-84cf-badcf5cccab9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.922856 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e07a5a4-b364-4459-84cf-badcf5cccab9-config-data\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.923019 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e07a5a4-b364-4459-84cf-badcf5cccab9-log-httpd\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.923219 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e07a5a4-b364-4459-84cf-badcf5cccab9-run-httpd\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.923340 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e07a5a4-b364-4459-84cf-badcf5cccab9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.923686 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e07a5a4-b364-4459-84cf-badcf5cccab9-log-httpd\") pod \"ceilometer-0\" (UID: 
\"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.923768 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3e07a5a4-b364-4459-84cf-badcf5cccab9-run-httpd\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.926896 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3e07a5a4-b364-4459-84cf-badcf5cccab9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.927302 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e07a5a4-b364-4459-84cf-badcf5cccab9-scripts\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.928375 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e07a5a4-b364-4459-84cf-badcf5cccab9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.929702 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e07a5a4-b364-4459-84cf-badcf5cccab9-config-data\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:34 crc kubenswrapper[4766]: I1002 12:41:34.940805 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bt2w\" (UniqueName: \"kubernetes.io/projected/3e07a5a4-b364-4459-84cf-badcf5cccab9-kube-api-access-6bt2w\") pod \"ceilometer-0\" (UID: \"3e07a5a4-b364-4459-84cf-badcf5cccab9\") " pod="openstack/ceilometer-0" Oct 02 12:41:35 crc kubenswrapper[4766]: I1002 12:41:35.077963 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 02 12:41:35 crc kubenswrapper[4766]: I1002 12:41:35.604043 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 02 12:41:35 crc kubenswrapper[4766]: I1002 12:41:35.635932 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e07a5a4-b364-4459-84cf-badcf5cccab9","Type":"ContainerStarted","Data":"e8acaafda318f85510a948303d5579a32626634a5a6fdd8666d4aba60514bab3"} Oct 02 12:41:35 crc kubenswrapper[4766]: I1002 12:41:35.906485 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b88eb5b5-0ef6-485d-b95e-e37f96ee16f0" path="/var/lib/kubelet/pods/b88eb5b5-0ef6-485d-b95e-e37f96ee16f0/volumes" Oct 02 12:41:36 crc kubenswrapper[4766]: I1002 12:41:36.652673 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e07a5a4-b364-4459-84cf-badcf5cccab9","Type":"ContainerStarted","Data":"caf6a2ec49cbed5fb37bf2b2ed59a2fe93324bdebc95f17ad3bfcf9903d1da9c"} Oct 02 12:41:36 crc kubenswrapper[4766]: I1002 12:41:36.975831 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Oct 02 12:41:37 crc kubenswrapper[4766]: I1002 12:41:37.672800 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e07a5a4-b364-4459-84cf-badcf5cccab9","Type":"ContainerStarted","Data":"aa5061c6906a349ce8b2b166d9d5eb3ec9dfbab9645e08e160a7ba59b6fe8262"} Oct 02 12:41:38 crc kubenswrapper[4766]: I1002 12:41:38.688107 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e07a5a4-b364-4459-84cf-badcf5cccab9","Type":"ContainerStarted","Data":"1c852ec1c38560fcf613862a9c9611fc708f99cf2f19ab59413a9fd5af19938e"} Oct 02 12:41:38 crc kubenswrapper[4766]: I1002 12:41:38.811348 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Oct 02 12:41:39 crc kubenswrapper[4766]: I1002 12:41:39.045377 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Oct 02 12:41:39 crc kubenswrapper[4766]: I1002 12:41:39.433330 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Oct 02 12:41:40 crc kubenswrapper[4766]: I1002 12:41:40.033872 4766 scope.go:117] "RemoveContainer" containerID="c64bf4995641dab91c56fda7ae96d849253e4ef453c3f5da7ca6e68cb9aa52be" Oct 02 12:41:40 crc kubenswrapper[4766]: I1002 12:41:40.229459 4766 scope.go:117] "RemoveContainer" containerID="b81881333fddbc48e419a202f15a835cf46d38d3517e10c56e69fb16145956ef" Oct 02 12:41:40 crc kubenswrapper[4766]: I1002 12:41:40.711633 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3e07a5a4-b364-4459-84cf-badcf5cccab9","Type":"ContainerStarted","Data":"d391680ff1eedacaa58a0dd7e78ca8b271b41ce0e6475b26124263417cb6835c"} Oct 02 12:41:40 crc kubenswrapper[4766]: I1002 12:41:40.711826 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 02 12:41:48 crc kubenswrapper[4766]: I1002 12:41:48.881553 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:41:48 crc kubenswrapper[4766]: E1002 12:41:48.883184 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:42:01 crc kubenswrapper[4766]: I1002 12:42:01.881880 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:42:01 crc kubenswrapper[4766]: E1002 12:42:01.883161 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:42:05 crc kubenswrapper[4766]: I1002 12:42:05.085454 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 02 12:42:05 crc kubenswrapper[4766]: I1002 12:42:05.116281 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=26.91181947 podStartE2EDuration="31.116252816s" podCreationTimestamp="2025-10-02 12:41:34 +0000 UTC" firstStartedPulling="2025-10-02 12:41:35.610976935 +0000 UTC m=+6610.553847879" lastFinishedPulling="2025-10-02 12:41:39.815410281 +0000 UTC m=+6614.758281225" observedRunningTime="2025-10-02 12:41:40.73757704 +0000 UTC m=+6615.680447984" watchObservedRunningTime="2025-10-02 12:42:05.116252816 +0000 UTC m=+6640.059123760" Oct 02 12:42:13 crc kubenswrapper[4766]: I1002 12:42:13.883693 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:42:13 crc kubenswrapper[4766]: E1002 12:42:13.884717 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:42:27 crc kubenswrapper[4766]: I1002 12:42:27.914799 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c79f85559-czpz9"] Oct 02 12:42:27 crc kubenswrapper[4766]: I1002 12:42:27.917155 4766 util.go:30] "No sandbox for pod can be found. 
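The `Observed pod startup duration` record for ceilometer-0 above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that E2E figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). Recomputing from the log's own timestamps (the trailing `m=+...` monotonic readings are dropped):

```go
// Verifying ceilometer-0's startup-latency record:
//   e2e  = watchObservedRunningTime - podCreationTimestamp -> 31.116252816s
//   pull = lastFinishedPulling - firstStartedPulling       -> 4.204433346s
//   slo  = e2e - pull                                      -> 26.91181947s
package main

import (
	"fmt"
	"time"
)

// The log prints times in Go's default time.Time format.
func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-10-02 12:41:34 +0000 UTC")
	observed := mustParse("2025-10-02 12:42:05.116252816 +0000 UTC")
	firstPull := mustParse("2025-10-02 12:41:35.610976935 +0000 UTC")
	lastPull := mustParse("2025-10-02 12:41:39.815410281 +0000 UTC")

	e2e := observed.Sub(created)
	pull := lastPull.Sub(firstPull)
	fmt.Println(e2e, pull, e2e-pull) // 31.116252816s 4.204433346s 26.91181947s
}
```

For the two dnsmasq-dns pods further down, firstStartedPulling and lastFinishedPulling are the zero time (no image pull was needed), which is why their podStartSLOduration equals their podStartE2EDuration exactly.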
Need to start a new one" pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:27 crc kubenswrapper[4766]: I1002 12:42:27.927786 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Oct 02 12:42:27 crc kubenswrapper[4766]: I1002 12:42:27.943131 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c79f85559-czpz9"] Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.017405 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7r9n\" (UniqueName: \"kubernetes.io/projected/ee1674b6-7c4a-49c6-8ac0-99188be60a90-kube-api-access-d7r9n\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.017819 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-ovsdbserver-nb\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.017942 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-dns-svc\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.018195 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-config\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.018371 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-ovsdbserver-sb\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.018453 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-openstack-cell1\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.120983 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-ovsdbserver-nb\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.121049 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-dns-svc\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " 
pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.121115 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-config\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.121169 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-ovsdbserver-sb\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.121201 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-openstack-cell1\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.121242 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7r9n\" (UniqueName: \"kubernetes.io/projected/ee1674b6-7c4a-49c6-8ac0-99188be60a90-kube-api-access-d7r9n\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.122533 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-ovsdbserver-nb\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.123119 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-dns-svc\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.123813 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-config\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.124486 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-ovsdbserver-sb\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.127432 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-openstack-cell1\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.159974 4766 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7r9n\" (UniqueName: \"kubernetes.io/projected/ee1674b6-7c4a-49c6-8ac0-99188be60a90-kube-api-access-d7r9n\") pod \"dnsmasq-dns-7c79f85559-czpz9\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.303192 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:28 crc kubenswrapper[4766]: I1002 12:42:28.882263 4766 scope.go:117] "RemoveContainer" containerID="63489761a3da41ebb10bdabbaa2e3e5a808a358a1d7c8217184a0263f9d5b03b" Oct 02 12:42:29 crc kubenswrapper[4766]: I1002 12:42:29.038041 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c79f85559-czpz9"] Oct 02 12:42:29 crc kubenswrapper[4766]: I1002 12:42:29.282536 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c79f85559-czpz9" event={"ID":"ee1674b6-7c4a-49c6-8ac0-99188be60a90","Type":"ContainerStarted","Data":"e240c9a44a2f62626a32a4199476651b7f72630972e1c4d810d6a4a2437be984"} Oct 02 12:42:29 crc kubenswrapper[4766]: I1002 12:42:29.288125 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"c2b4353ffed00a874e54b842c2e985395b749e0bf31632b418df66f7c022195d"} Oct 02 12:42:30 crc kubenswrapper[4766]: I1002 12:42:30.314755 4766 generic.go:334] "Generic (PLEG): container finished" podID="ee1674b6-7c4a-49c6-8ac0-99188be60a90" containerID="7f3f51e47d61d0bc1a05c9d25457c141502cddd7834c9bde7c442c13c514322b" exitCode=0 Oct 02 12:42:30 crc kubenswrapper[4766]: I1002 12:42:30.314842 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c79f85559-czpz9" event={"ID":"ee1674b6-7c4a-49c6-8ac0-99188be60a90","Type":"ContainerDied","Data":"7f3f51e47d61d0bc1a05c9d25457c141502cddd7834c9bde7c442c13c514322b"} Oct 02 12:42:31 crc kubenswrapper[4766]: I1002 12:42:31.328391 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c79f85559-czpz9" event={"ID":"ee1674b6-7c4a-49c6-8ac0-99188be60a90","Type":"ContainerStarted","Data":"03147055038d6f63bcdf96621276523e2bcd9772e9f1d47346b0269127c3a8e0"} Oct 02 12:42:31 crc kubenswrapper[4766]: I1002 12:42:31.330586 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:31 crc kubenswrapper[4766]: I1002 12:42:31.353102 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c79f85559-czpz9" podStartSLOduration=4.353072291 podStartE2EDuration="4.353072291s" podCreationTimestamp="2025-10-02 12:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:42:31.348872467 +0000 UTC m=+6666.291743421" watchObservedRunningTime="2025-10-02 12:42:31.353072291 +0000 UTC m=+6666.295943235" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.304722 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.371762 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59b859dfd5-srz5p"] Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.372056 4766 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" podUID="50942d8a-167e-49a6-beeb-6f18aca8fa94" containerName="dnsmasq-dns" containerID="cri-o://82e2a0c435d41f32ee1e02c6795bde69fffa18ff44c754f4cc9ae3809bf5c084" gracePeriod=10 Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.554486 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bf98f57-9h5jh"] Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.558035 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.570890 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf98f57-9h5jh"] Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.602019 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/675d2c19-820c-4e5e-b461-b44b4afe9d41-ovsdbserver-sb\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.602076 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/675d2c19-820c-4e5e-b461-b44b4afe9d41-dns-svc\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.602150 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/675d2c19-820c-4e5e-b461-b44b4afe9d41-openstack-cell1\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.602193 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/675d2c19-820c-4e5e-b461-b44b4afe9d41-ovsdbserver-nb\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.602380 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/675d2c19-820c-4e5e-b461-b44b4afe9d41-config\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.602725 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9s54\" (UniqueName: \"kubernetes.io/projected/675d2c19-820c-4e5e-b461-b44b4afe9d41-kube-api-access-c9s54\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.706449 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/675d2c19-820c-4e5e-b461-b44b4afe9d41-dns-svc\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " 
pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.706561 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/675d2c19-820c-4e5e-b461-b44b4afe9d41-openstack-cell1\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.706602 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/675d2c19-820c-4e5e-b461-b44b4afe9d41-ovsdbserver-nb\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.706631 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/675d2c19-820c-4e5e-b461-b44b4afe9d41-config\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.706688 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9s54\" (UniqueName: \"kubernetes.io/projected/675d2c19-820c-4e5e-b461-b44b4afe9d41-kube-api-access-c9s54\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.706748 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/675d2c19-820c-4e5e-b461-b44b4afe9d41-ovsdbserver-sb\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.707908 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/675d2c19-820c-4e5e-b461-b44b4afe9d41-ovsdbserver-sb\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.708138 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/675d2c19-820c-4e5e-b461-b44b4afe9d41-dns-svc\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.708431 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/675d2c19-820c-4e5e-b461-b44b4afe9d41-openstack-cell1\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.708567 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/675d2c19-820c-4e5e-b461-b44b4afe9d41-config\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.710036 4766 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/675d2c19-820c-4e5e-b461-b44b4afe9d41-ovsdbserver-nb\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.732415 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9s54\" (UniqueName: \"kubernetes.io/projected/675d2c19-820c-4e5e-b461-b44b4afe9d41-kube-api-access-c9s54\") pod \"dnsmasq-dns-bf98f57-9h5jh\" (UID: \"675d2c19-820c-4e5e-b461-b44b4afe9d41\") " pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:38 crc kubenswrapper[4766]: I1002 12:42:38.933881 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.046399 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.116773 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-dns-svc\") pod \"50942d8a-167e-49a6-beeb-6f18aca8fa94\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.117112 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-ovsdbserver-nb\") pod \"50942d8a-167e-49a6-beeb-6f18aca8fa94\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.117247 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r52q\" (UniqueName: \"kubernetes.io/projected/50942d8a-167e-49a6-beeb-6f18aca8fa94-kube-api-access-5r52q\") pod \"50942d8a-167e-49a6-beeb-6f18aca8fa94\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.117280 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-config\") pod \"50942d8a-167e-49a6-beeb-6f18aca8fa94\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.117986 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-ovsdbserver-sb\") pod \"50942d8a-167e-49a6-beeb-6f18aca8fa94\" (UID: \"50942d8a-167e-49a6-beeb-6f18aca8fa94\") " Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.127057 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50942d8a-167e-49a6-beeb-6f18aca8fa94-kube-api-access-5r52q" (OuterVolumeSpecName: "kube-api-access-5r52q") pod "50942d8a-167e-49a6-beeb-6f18aca8fa94" (UID: "50942d8a-167e-49a6-beeb-6f18aca8fa94"). InnerVolumeSpecName "kube-api-access-5r52q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.209963 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "50942d8a-167e-49a6-beeb-6f18aca8fa94" (UID: "50942d8a-167e-49a6-beeb-6f18aca8fa94"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.218178 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "50942d8a-167e-49a6-beeb-6f18aca8fa94" (UID: "50942d8a-167e-49a6-beeb-6f18aca8fa94"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.228941 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "50942d8a-167e-49a6-beeb-6f18aca8fa94" (UID: "50942d8a-167e-49a6-beeb-6f18aca8fa94"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.229042 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r52q\" (UniqueName: \"kubernetes.io/projected/50942d8a-167e-49a6-beeb-6f18aca8fa94-kube-api-access-5r52q\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.229066 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.229078 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.309787 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-config" (OuterVolumeSpecName: "config") pod "50942d8a-167e-49a6-beeb-6f18aca8fa94" (UID: "50942d8a-167e-49a6-beeb-6f18aca8fa94"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.334466 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.334497 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50942d8a-167e-49a6-beeb-6f18aca8fa94-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.453956 4766 generic.go:334] "Generic (PLEG): container finished" podID="50942d8a-167e-49a6-beeb-6f18aca8fa94" containerID="82e2a0c435d41f32ee1e02c6795bde69fffa18ff44c754f4cc9ae3809bf5c084" exitCode=0 Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.454038 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" event={"ID":"50942d8a-167e-49a6-beeb-6f18aca8fa94","Type":"ContainerDied","Data":"82e2a0c435d41f32ee1e02c6795bde69fffa18ff44c754f4cc9ae3809bf5c084"} Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.454092 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" event={"ID":"50942d8a-167e-49a6-beeb-6f18aca8fa94","Type":"ContainerDied","Data":"ed3e81fd542e4ce0c31366bc62277fcc78d4f89ed8ee5842b25c982b875e2e04"} Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.454114 4766 scope.go:117] "RemoveContainer" containerID="82e2a0c435d41f32ee1e02c6795bde69fffa18ff44c754f4cc9ae3809bf5c084" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.454364 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59b859dfd5-srz5p" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.519347 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59b859dfd5-srz5p"] Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.522917 4766 scope.go:117] "RemoveContainer" containerID="df66ecc13aa9d11c6487a44cd0165794ea23318ec1d4afadca44745a5bd3cd52" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.533221 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59b859dfd5-srz5p"] Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.599087 4766 scope.go:117] "RemoveContainer" containerID="82e2a0c435d41f32ee1e02c6795bde69fffa18ff44c754f4cc9ae3809bf5c084" Oct 02 12:42:39 crc kubenswrapper[4766]: E1002 12:42:39.600781 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82e2a0c435d41f32ee1e02c6795bde69fffa18ff44c754f4cc9ae3809bf5c084\": container with ID starting with 82e2a0c435d41f32ee1e02c6795bde69fffa18ff44c754f4cc9ae3809bf5c084 not found: ID does not exist" containerID="82e2a0c435d41f32ee1e02c6795bde69fffa18ff44c754f4cc9ae3809bf5c084" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.600828 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e2a0c435d41f32ee1e02c6795bde69fffa18ff44c754f4cc9ae3809bf5c084"} err="failed to get container status \"82e2a0c435d41f32ee1e02c6795bde69fffa18ff44c754f4cc9ae3809bf5c084\": rpc error: code = NotFound desc = could not find container \"82e2a0c435d41f32ee1e02c6795bde69fffa18ff44c754f4cc9ae3809bf5c084\": container with ID starting with 82e2a0c435d41f32ee1e02c6795bde69fffa18ff44c754f4cc9ae3809bf5c084 not found: ID 
does not exist" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.600856 4766 scope.go:117] "RemoveContainer" containerID="df66ecc13aa9d11c6487a44cd0165794ea23318ec1d4afadca44745a5bd3cd52" Oct 02 12:42:39 crc kubenswrapper[4766]: E1002 12:42:39.602177 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df66ecc13aa9d11c6487a44cd0165794ea23318ec1d4afadca44745a5bd3cd52\": container with ID starting with df66ecc13aa9d11c6487a44cd0165794ea23318ec1d4afadca44745a5bd3cd52 not found: ID does not exist" containerID="df66ecc13aa9d11c6487a44cd0165794ea23318ec1d4afadca44745a5bd3cd52" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.602200 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df66ecc13aa9d11c6487a44cd0165794ea23318ec1d4afadca44745a5bd3cd52"} err="failed to get container status \"df66ecc13aa9d11c6487a44cd0165794ea23318ec1d4afadca44745a5bd3cd52\": rpc error: code = NotFound desc = could not find container \"df66ecc13aa9d11c6487a44cd0165794ea23318ec1d4afadca44745a5bd3cd52\": container with ID starting with df66ecc13aa9d11c6487a44cd0165794ea23318ec1d4afadca44745a5bd3cd52 not found: ID does not exist" Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.663959 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bf98f57-9h5jh"] Oct 02 12:42:39 crc kubenswrapper[4766]: I1002 12:42:39.911466 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50942d8a-167e-49a6-beeb-6f18aca8fa94" path="/var/lib/kubelet/pods/50942d8a-167e-49a6-beeb-6f18aca8fa94/volumes" Oct 02 12:42:40 crc kubenswrapper[4766]: I1002 12:42:40.466912 4766 generic.go:334] "Generic (PLEG): container finished" podID="675d2c19-820c-4e5e-b461-b44b4afe9d41" containerID="d28cb72ae8458ab84dca56d2f146d5f8cf3f9556117bd81f87a9638468434920" exitCode=0 Oct 02 12:42:40 crc kubenswrapper[4766]: I1002 12:42:40.466963 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf98f57-9h5jh" event={"ID":"675d2c19-820c-4e5e-b461-b44b4afe9d41","Type":"ContainerDied","Data":"d28cb72ae8458ab84dca56d2f146d5f8cf3f9556117bd81f87a9638468434920"} Oct 02 12:42:40 crc kubenswrapper[4766]: I1002 12:42:40.467280 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf98f57-9h5jh" event={"ID":"675d2c19-820c-4e5e-b461-b44b4afe9d41","Type":"ContainerStarted","Data":"83d37ac926c1a5933907bf862115804b2de903ee02644520fe3b05472a3f4e17"} Oct 02 12:42:41 crc kubenswrapper[4766]: I1002 12:42:41.477362 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bf98f57-9h5jh" event={"ID":"675d2c19-820c-4e5e-b461-b44b4afe9d41","Type":"ContainerStarted","Data":"68992a6fc8d31e26a775c81b42cb3a31ec82a37e48cd1b70508688ce8d26ba12"} Oct 02 12:42:41 crc kubenswrapper[4766]: I1002 12:42:41.477838 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:41 crc kubenswrapper[4766]: I1002 12:42:41.528535 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bf98f57-9h5jh" podStartSLOduration=3.52851536 podStartE2EDuration="3.52851536s" podCreationTimestamp="2025-10-02 12:42:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:42:41.521827377 +0000 UTC m=+6676.464698321" watchObservedRunningTime="2025-10-02 
12:42:41.52851536 +0000 UTC m=+6676.471386304" Oct 02 12:42:48 crc kubenswrapper[4766]: I1002 12:42:48.935768 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bf98f57-9h5jh" Oct 02 12:42:48 crc kubenswrapper[4766]: I1002 12:42:48.997778 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c79f85559-czpz9"] Oct 02 12:42:48 crc kubenswrapper[4766]: I1002 12:42:48.998033 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c79f85559-czpz9" podUID="ee1674b6-7c4a-49c6-8ac0-99188be60a90" containerName="dnsmasq-dns" containerID="cri-o://03147055038d6f63bcdf96621276523e2bcd9772e9f1d47346b0269127c3a8e0" gracePeriod=10 Oct 02 12:42:49 crc kubenswrapper[4766]: I1002 12:42:49.568416 4766 generic.go:334] "Generic (PLEG): container finished" podID="ee1674b6-7c4a-49c6-8ac0-99188be60a90" containerID="03147055038d6f63bcdf96621276523e2bcd9772e9f1d47346b0269127c3a8e0" exitCode=0 Oct 02 12:42:49 crc kubenswrapper[4766]: I1002 12:42:49.568514 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c79f85559-czpz9" event={"ID":"ee1674b6-7c4a-49c6-8ac0-99188be60a90","Type":"ContainerDied","Data":"03147055038d6f63bcdf96621276523e2bcd9772e9f1d47346b0269127c3a8e0"} Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.160742 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.319957 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-config\") pod \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.320410 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-dns-svc\") pod \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.321028 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-openstack-cell1\") pod \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.321187 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7r9n\" (UniqueName: \"kubernetes.io/projected/ee1674b6-7c4a-49c6-8ac0-99188be60a90-kube-api-access-d7r9n\") pod \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.321207 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-ovsdbserver-nb\") pod \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.321233 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-ovsdbserver-sb\") pod 
\"ee1674b6-7c4a-49c6-8ac0-99188be60a90\" (UID: \"ee1674b6-7c4a-49c6-8ac0-99188be60a90\") " Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.328849 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee1674b6-7c4a-49c6-8ac0-99188be60a90-kube-api-access-d7r9n" (OuterVolumeSpecName: "kube-api-access-d7r9n") pod "ee1674b6-7c4a-49c6-8ac0-99188be60a90" (UID: "ee1674b6-7c4a-49c6-8ac0-99188be60a90"). InnerVolumeSpecName "kube-api-access-d7r9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.380994 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "ee1674b6-7c4a-49c6-8ac0-99188be60a90" (UID: "ee1674b6-7c4a-49c6-8ac0-99188be60a90"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.387594 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee1674b6-7c4a-49c6-8ac0-99188be60a90" (UID: "ee1674b6-7c4a-49c6-8ac0-99188be60a90"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.389412 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee1674b6-7c4a-49c6-8ac0-99188be60a90" (UID: "ee1674b6-7c4a-49c6-8ac0-99188be60a90"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.390116 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-config" (OuterVolumeSpecName: "config") pod "ee1674b6-7c4a-49c6-8ac0-99188be60a90" (UID: "ee1674b6-7c4a-49c6-8ac0-99188be60a90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.402335 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee1674b6-7c4a-49c6-8ac0-99188be60a90" (UID: "ee1674b6-7c4a-49c6-8ac0-99188be60a90"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.423769 4766 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-config\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.423809 4766 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.423820 4766 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.423832 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.423845 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7r9n\" (UniqueName: \"kubernetes.io/projected/ee1674b6-7c4a-49c6-8ac0-99188be60a90-kube-api-access-d7r9n\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.423857 4766 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee1674b6-7c4a-49c6-8ac0-99188be60a90-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.579733 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c79f85559-czpz9" event={"ID":"ee1674b6-7c4a-49c6-8ac0-99188be60a90","Type":"ContainerDied","Data":"e240c9a44a2f62626a32a4199476651b7f72630972e1c4d810d6a4a2437be984"} Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.579780 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c79f85559-czpz9" Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.579789 4766 scope.go:117] "RemoveContainer" containerID="03147055038d6f63bcdf96621276523e2bcd9772e9f1d47346b0269127c3a8e0" Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.616213 4766 scope.go:117] "RemoveContainer" containerID="7f3f51e47d61d0bc1a05c9d25457c141502cddd7834c9bde7c442c13c514322b" Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.620816 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c79f85559-czpz9"] Oct 02 12:42:50 crc kubenswrapper[4766]: I1002 12:42:50.637583 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c79f85559-czpz9"] Oct 02 12:42:51 crc kubenswrapper[4766]: I1002 12:42:51.895916 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee1674b6-7c4a-49c6-8ac0-99188be60a90" path="/var/lib/kubelet/pods/ee1674b6-7c4a-49c6-8ac0-99188be60a90/volumes" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.051020 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-rh557"] Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.063793 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-rh557"] Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.697661 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm"] Oct 02 12:42:59 crc kubenswrapper[4766]: E1002 12:42:59.698332 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1674b6-7c4a-49c6-8ac0-99188be60a90" containerName="init" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.698350 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1674b6-7c4a-49c6-8ac0-99188be60a90" containerName="init" Oct 02 12:42:59 crc kubenswrapper[4766]: E1002 12:42:59.698390 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50942d8a-167e-49a6-beeb-6f18aca8fa94" containerName="dnsmasq-dns" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.698399 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="50942d8a-167e-49a6-beeb-6f18aca8fa94" containerName="dnsmasq-dns" Oct 02 12:42:59 crc kubenswrapper[4766]: E1002 12:42:59.698426 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50942d8a-167e-49a6-beeb-6f18aca8fa94" containerName="init" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.698434 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="50942d8a-167e-49a6-beeb-6f18aca8fa94" containerName="init" Oct 02 12:42:59 crc kubenswrapper[4766]: E1002 12:42:59.698467 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1674b6-7c4a-49c6-8ac0-99188be60a90" containerName="dnsmasq-dns" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.698474 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1674b6-7c4a-49c6-8ac0-99188be60a90" containerName="dnsmasq-dns" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.698760 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="50942d8a-167e-49a6-beeb-6f18aca8fa94" containerName="dnsmasq-dns" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.698783 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1674b6-7c4a-49c6-8ac0-99188be60a90" containerName="dnsmasq-dns" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.699997 4766 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.705264 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.705678 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.705842 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.706003 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.768420 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm"] Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.837935 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.838018 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.838086 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.838162 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfvkh\" (UniqueName: \"kubernetes.io/projected/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-kube-api-access-qfvkh\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.838299 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.896186 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="37995d46-e73a-436f-8eae-da2c72de6a66" path="/var/lib/kubelet/pods/37995d46-e73a-436f-8eae-da2c72de6a66/volumes" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.940391 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.940553 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfvkh\" (UniqueName: \"kubernetes.io/projected/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-kube-api-access-qfvkh\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.940672 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.940915 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.940955 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.947480 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.947475 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.947697 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.948921 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:42:59 crc kubenswrapper[4766]: I1002 12:42:59.964742 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfvkh\" (UniqueName: \"kubernetes.io/projected/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-kube-api-access-qfvkh\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:43:00 crc kubenswrapper[4766]: I1002 12:43:00.030672 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:43:00 crc kubenswrapper[4766]: W1002 12:43:00.665995 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d25c6d1_9077_4c8d_b356_44d4a0abb5fa.slice/crio-f7586084f3c801d93a335d8e08c795e49abb6125f95ae4d3bfd9effda6bae5c4 WatchSource:0}: Error finding container f7586084f3c801d93a335d8e08c795e49abb6125f95ae4d3bfd9effda6bae5c4: Status 404 returned error can't find the container with id f7586084f3c801d93a335d8e08c795e49abb6125f95ae4d3bfd9effda6bae5c4 Oct 02 12:43:00 crc kubenswrapper[4766]: I1002 12:43:00.668210 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm"] Oct 02 12:43:00 crc kubenswrapper[4766]: I1002 12:43:00.725036 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" event={"ID":"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa","Type":"ContainerStarted","Data":"f7586084f3c801d93a335d8e08c795e49abb6125f95ae4d3bfd9effda6bae5c4"} Oct 02 12:43:11 crc kubenswrapper[4766]: I1002 12:43:11.040181 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-97ae-account-create-4bbqr"] Oct 02 12:43:11 crc kubenswrapper[4766]: I1002 12:43:11.050181 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-97ae-account-create-4bbqr"] Oct 02 12:43:11 crc kubenswrapper[4766]: I1002 12:43:11.896507 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ac43509-36b7-4ec3-b5f4-814dfe7990c3" path="/var/lib/kubelet/pods/8ac43509-36b7-4ec3-b5f4-814dfe7990c3/volumes" Oct 02 12:43:16 crc kubenswrapper[4766]: I1002 12:43:16.941304 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" event={"ID":"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa","Type":"ContainerStarted","Data":"acc0be79d3eb4f7bcba1d5944490d52b747ba0821e05e78abc4ecfbf558a630f"} Oct 02 12:43:16 crc kubenswrapper[4766]: I1002 12:43:16.959216 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" podStartSLOduration=1.99006537 podStartE2EDuration="17.959192351s" podCreationTimestamp="2025-10-02 12:42:59 +0000 UTC" firstStartedPulling="2025-10-02 12:43:00.668735267 +0000 UTC m=+6695.611606201" lastFinishedPulling="2025-10-02 12:43:16.637862238 +0000 UTC m=+6711.580733182" observedRunningTime="2025-10-02 12:43:16.957171616 +0000 UTC m=+6711.900042570" watchObservedRunningTime="2025-10-02 12:43:16.959192351 +0000 UTC m=+6711.902063295" Oct 02 12:43:18 crc kubenswrapper[4766]: I1002 12:43:18.057272 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-4n6rv"] Oct 02 12:43:18 crc kubenswrapper[4766]: I1002 12:43:18.071380 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-4n6rv"] Oct 02 12:43:19 crc kubenswrapper[4766]: I1002 12:43:19.902842 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e6ee14-5258-4176-8127-49abee572c04" path="/var/lib/kubelet/pods/d6e6ee14-5258-4176-8127-49abee572c04/volumes" Oct 02 12:43:29 crc kubenswrapper[4766]: I1002 12:43:29.038876 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-d15f-account-create-8k9hq"] Oct 02 12:43:29 crc kubenswrapper[4766]: I1002 12:43:29.049369 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-d15f-account-create-8k9hq"] Oct 02 12:43:29 crc kubenswrapper[4766]: I1002 12:43:29.900351 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a98e8533-c85c-4959-bf17-1165d6b90c8a" path="/var/lib/kubelet/pods/a98e8533-c85c-4959-bf17-1165d6b90c8a/volumes" Oct 02 12:43:31 crc kubenswrapper[4766]: I1002 12:43:31.095595 4766 generic.go:334] "Generic (PLEG): container finished" podID="2d25c6d1-9077-4c8d-b356-44d4a0abb5fa" containerID="acc0be79d3eb4f7bcba1d5944490d52b747ba0821e05e78abc4ecfbf558a630f" exitCode=0 Oct 02 12:43:31 crc kubenswrapper[4766]: I1002 12:43:31.095654 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" event={"ID":"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa","Type":"ContainerDied","Data":"acc0be79d3eb4f7bcba1d5944490d52b747ba0821e05e78abc4ecfbf558a630f"} Oct 02 12:43:32 crc kubenswrapper[4766]: I1002 12:43:32.624974 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:43:32 crc kubenswrapper[4766]: I1002 12:43:32.718289 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-pre-adoption-validation-combined-ca-bundle\") pod \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " Oct 02 12:43:32 crc kubenswrapper[4766]: I1002 12:43:32.718554 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfvkh\" (UniqueName: \"kubernetes.io/projected/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-kube-api-access-qfvkh\") pod \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " Oct 02 12:43:32 crc kubenswrapper[4766]: I1002 12:43:32.718614 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-ceph\") pod \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " Oct 02 12:43:32 crc kubenswrapper[4766]: I1002 12:43:32.718639 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-inventory\") pod \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " Oct 02 12:43:32 crc kubenswrapper[4766]: I1002 12:43:32.718777 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-ssh-key\") pod \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\" (UID: \"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa\") " Oct 02 12:43:32 crc kubenswrapper[4766]: I1002 12:43:32.725648 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-ceph" (OuterVolumeSpecName: "ceph") pod "2d25c6d1-9077-4c8d-b356-44d4a0abb5fa" (UID: "2d25c6d1-9077-4c8d-b356-44d4a0abb5fa"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:43:32 crc kubenswrapper[4766]: I1002 12:43:32.725847 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-kube-api-access-qfvkh" (OuterVolumeSpecName: "kube-api-access-qfvkh") pod "2d25c6d1-9077-4c8d-b356-44d4a0abb5fa" (UID: "2d25c6d1-9077-4c8d-b356-44d4a0abb5fa"). InnerVolumeSpecName "kube-api-access-qfvkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:43:32 crc kubenswrapper[4766]: I1002 12:43:32.733757 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "2d25c6d1-9077-4c8d-b356-44d4a0abb5fa" (UID: "2d25c6d1-9077-4c8d-b356-44d4a0abb5fa"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:43:32 crc kubenswrapper[4766]: I1002 12:43:32.755626 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-inventory" (OuterVolumeSpecName: "inventory") pod "2d25c6d1-9077-4c8d-b356-44d4a0abb5fa" (UID: "2d25c6d1-9077-4c8d-b356-44d4a0abb5fa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:43:32 crc kubenswrapper[4766]: I1002 12:43:32.762281 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2d25c6d1-9077-4c8d-b356-44d4a0abb5fa" (UID: "2d25c6d1-9077-4c8d-b356-44d4a0abb5fa"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:43:32 crc kubenswrapper[4766]: I1002 12:43:32.822649 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:32 crc kubenswrapper[4766]: I1002 12:43:32.822692 4766 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:32 crc kubenswrapper[4766]: I1002 12:43:32.822704 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfvkh\" (UniqueName: \"kubernetes.io/projected/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-kube-api-access-qfvkh\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:32 crc kubenswrapper[4766]: I1002 12:43:32.822716 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:32 crc kubenswrapper[4766]: I1002 12:43:32.822731 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d25c6d1-9077-4c8d-b356-44d4a0abb5fa-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 12:43:33 crc kubenswrapper[4766]: I1002 12:43:33.124840 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" event={"ID":"2d25c6d1-9077-4c8d-b356-44d4a0abb5fa","Type":"ContainerDied","Data":"f7586084f3c801d93a335d8e08c795e49abb6125f95ae4d3bfd9effda6bae5c4"} Oct 02 12:43:33 crc kubenswrapper[4766]: I1002 12:43:33.124894 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7586084f3c801d93a335d8e08c795e49abb6125f95ae4d3bfd9effda6bae5c4" Oct 02 12:43:33 crc kubenswrapper[4766]: I1002 12:43:33.124947 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm" Oct 02 12:43:40 crc kubenswrapper[4766]: I1002 12:43:40.360138 4766 scope.go:117] "RemoveContainer" containerID="51e66b40b6f8ce6661eda5149f4d3812bcd4a39b4ace1cfd5a0c2cc0bc95789b" Oct 02 12:43:40 crc kubenswrapper[4766]: I1002 12:43:40.398960 4766 scope.go:117] "RemoveContainer" containerID="d383351a3ea48b912726f76dd6360dedee50ae871d77eefa0844628306f87093" Oct 02 12:43:40 crc kubenswrapper[4766]: I1002 12:43:40.481380 4766 scope.go:117] "RemoveContainer" containerID="04bf335cb0a12d07614f8fc9c5970b17ac1c372dfcbaf5429b39e15443313c21" Oct 02 12:43:40 crc kubenswrapper[4766]: I1002 12:43:40.540713 4766 scope.go:117] "RemoveContainer" containerID="20a7a021c4471352b2cfdb347a2a41a4cfe27de6b6bfe57c9a1e5ef4a32d2ce7" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.581747 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58"] Oct 02 12:43:42 crc kubenswrapper[4766]: E1002 12:43:42.582969 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d25c6d1-9077-4c8d-b356-44d4a0abb5fa" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.582990 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d25c6d1-9077-4c8d-b356-44d4a0abb5fa" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.583227 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d25c6d1-9077-4c8d-b356-44d4a0abb5fa" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.584199 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.587164 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.587322 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.587566 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.589594 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.600151 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58"] Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.680828 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.681081 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.681271 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.681397 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr6lf\" (UniqueName: \"kubernetes.io/projected/88d78077-1bd0-416c-979a-b52075152089-kube-api-access-rr6lf\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.681460 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.783105 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.783213 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr6lf\" (UniqueName: \"kubernetes.io/projected/88d78077-1bd0-416c-979a-b52075152089-kube-api-access-rr6lf\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.783295 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.783482 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.783553 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.789985 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.791240 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.791987 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.792832 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-ceph\") pod 
\"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.815917 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr6lf\" (UniqueName: \"kubernetes.io/projected/88d78077-1bd0-416c-979a-b52075152089-kube-api-access-rr6lf\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:42 crc kubenswrapper[4766]: I1002 12:43:42.921457 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:43:43 crc kubenswrapper[4766]: I1002 12:43:43.488363 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58"] Oct 02 12:43:43 crc kubenswrapper[4766]: W1002 12:43:43.491176 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88d78077_1bd0_416c_979a_b52075152089.slice/crio-9e0af97fe6b3119398d24affbf448f98e4ab854051e2bde610b083afaa2c8fa2 WatchSource:0}: Error finding container 9e0af97fe6b3119398d24affbf448f98e4ab854051e2bde610b083afaa2c8fa2: Status 404 returned error can't find the container with id 9e0af97fe6b3119398d24affbf448f98e4ab854051e2bde610b083afaa2c8fa2 Oct 02 12:43:44 crc kubenswrapper[4766]: I1002 12:43:44.288378 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" event={"ID":"88d78077-1bd0-416c-979a-b52075152089","Type":"ContainerStarted","Data":"73da9b9a999a7121a2d0d5056892db8bf0436940fd18829de76f9afbe06c983e"} Oct 02 12:43:44 crc kubenswrapper[4766]: I1002 12:43:44.288684 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" event={"ID":"88d78077-1bd0-416c-979a-b52075152089","Type":"ContainerStarted","Data":"9e0af97fe6b3119398d24affbf448f98e4ab854051e2bde610b083afaa2c8fa2"} Oct 02 12:43:44 crc kubenswrapper[4766]: I1002 12:43:44.318423 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" podStartSLOduration=2.147644609 podStartE2EDuration="2.318391618s" podCreationTimestamp="2025-10-02 12:43:42 +0000 UTC" firstStartedPulling="2025-10-02 12:43:43.494231918 +0000 UTC m=+6738.437102862" lastFinishedPulling="2025-10-02 12:43:43.664978927 +0000 UTC m=+6738.607849871" observedRunningTime="2025-10-02 12:43:44.305405392 +0000 UTC m=+6739.248276346" watchObservedRunningTime="2025-10-02 12:43:44.318391618 +0000 UTC m=+6739.261262562" Oct 02 12:44:07 crc kubenswrapper[4766]: I1002 12:44:07.064963 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-8pj2m"] Oct 02 12:44:07 crc kubenswrapper[4766]: I1002 12:44:07.082620 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-8pj2m"] Oct 02 12:44:07 crc kubenswrapper[4766]: I1002 12:44:07.896607 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4" path="/var/lib/kubelet/pods/ff13d18f-5c99-46f5-88ba-c6b3ff4ad9c4/volumes" Oct 02 12:44:40 crc kubenswrapper[4766]: I1002 12:44:40.707953 4766 scope.go:117] 
"RemoveContainer" containerID="01bcca35808068d85f7c0b71d608222e9ab4e1a7b1cf02d06f450e137eaa3f2d" Oct 02 12:44:40 crc kubenswrapper[4766]: I1002 12:44:40.751142 4766 scope.go:117] "RemoveContainer" containerID="a6325c657d10489588d6f282b23902a2dd33c7cba6b68c4663c0b406718e617f" Oct 02 12:44:54 crc kubenswrapper[4766]: I1002 12:44:54.432654 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:44:54 crc kubenswrapper[4766]: I1002 12:44:54.433265 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:45:00 crc kubenswrapper[4766]: I1002 12:45:00.184025 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b"] Oct 02 12:45:00 crc kubenswrapper[4766]: I1002 12:45:00.190891 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" Oct 02 12:45:00 crc kubenswrapper[4766]: I1002 12:45:00.197052 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 12:45:00 crc kubenswrapper[4766]: I1002 12:45:00.197376 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 12:45:00 crc kubenswrapper[4766]: I1002 12:45:00.200004 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b"] Oct 02 12:45:00 crc kubenswrapper[4766]: I1002 12:45:00.358732 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eb675bb-c55b-4030-9540-06e38ef714e4-config-volume\") pod \"collect-profiles-29323485-jbz4b\" (UID: \"3eb675bb-c55b-4030-9540-06e38ef714e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" Oct 02 12:45:00 crc kubenswrapper[4766]: I1002 12:45:00.359343 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eb675bb-c55b-4030-9540-06e38ef714e4-secret-volume\") pod \"collect-profiles-29323485-jbz4b\" (UID: \"3eb675bb-c55b-4030-9540-06e38ef714e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" Oct 02 12:45:00 crc kubenswrapper[4766]: I1002 12:45:00.359470 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbzhc\" (UniqueName: \"kubernetes.io/projected/3eb675bb-c55b-4030-9540-06e38ef714e4-kube-api-access-sbzhc\") pod \"collect-profiles-29323485-jbz4b\" (UID: \"3eb675bb-c55b-4030-9540-06e38ef714e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" Oct 02 12:45:00 crc kubenswrapper[4766]: I1002 12:45:00.460816 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3eb675bb-c55b-4030-9540-06e38ef714e4-secret-volume\") pod \"collect-profiles-29323485-jbz4b\" (UID: \"3eb675bb-c55b-4030-9540-06e38ef714e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" Oct 02 12:45:00 crc kubenswrapper[4766]: I1002 12:45:00.460878 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbzhc\" (UniqueName: \"kubernetes.io/projected/3eb675bb-c55b-4030-9540-06e38ef714e4-kube-api-access-sbzhc\") pod \"collect-profiles-29323485-jbz4b\" (UID: \"3eb675bb-c55b-4030-9540-06e38ef714e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" Oct 02 12:45:00 crc kubenswrapper[4766]: I1002 12:45:00.460911 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eb675bb-c55b-4030-9540-06e38ef714e4-config-volume\") pod \"collect-profiles-29323485-jbz4b\" (UID: \"3eb675bb-c55b-4030-9540-06e38ef714e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" Oct 02 12:45:00 crc kubenswrapper[4766]: I1002 12:45:00.461885 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eb675bb-c55b-4030-9540-06e38ef714e4-config-volume\") pod \"collect-profiles-29323485-jbz4b\" (UID: \"3eb675bb-c55b-4030-9540-06e38ef714e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" Oct 02 12:45:00 crc kubenswrapper[4766]: I1002 12:45:00.467861 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eb675bb-c55b-4030-9540-06e38ef714e4-secret-volume\") pod \"collect-profiles-29323485-jbz4b\" (UID: \"3eb675bb-c55b-4030-9540-06e38ef714e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" Oct 02 12:45:00 crc kubenswrapper[4766]: I1002 12:45:00.477628 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbzhc\" (UniqueName: \"kubernetes.io/projected/3eb675bb-c55b-4030-9540-06e38ef714e4-kube-api-access-sbzhc\") pod \"collect-profiles-29323485-jbz4b\" (UID: \"3eb675bb-c55b-4030-9540-06e38ef714e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" Oct 02 12:45:00 crc kubenswrapper[4766]: I1002 12:45:00.530072 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" Oct 02 12:45:00 crc kubenswrapper[4766]: I1002 12:45:00.995222 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b"] Oct 02 12:45:01 crc kubenswrapper[4766]: I1002 12:45:01.236686 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" event={"ID":"3eb675bb-c55b-4030-9540-06e38ef714e4","Type":"ContainerStarted","Data":"3b7d2ff88e13fb551bb5b8396d19888a8483d9d17a4be31488d24e004209f7d8"} Oct 02 12:45:02 crc kubenswrapper[4766]: I1002 12:45:02.264021 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" event={"ID":"3eb675bb-c55b-4030-9540-06e38ef714e4","Type":"ContainerStarted","Data":"c1cb41b85a1671773e16e79691210a93389da0e44caa4ba9eec1413b98bb1786"} Oct 02 12:45:02 crc kubenswrapper[4766]: I1002 12:45:02.289631 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" podStartSLOduration=2.28960433 podStartE2EDuration="2.28960433s" podCreationTimestamp="2025-10-02 12:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 12:45:02.278436463 +0000 UTC m=+6817.221307407" watchObservedRunningTime="2025-10-02 12:45:02.28960433 +0000 UTC m=+6817.232475274" Oct 02 12:45:03 crc kubenswrapper[4766]: I1002 12:45:03.281048 4766 generic.go:334] "Generic (PLEG): container finished" podID="3eb675bb-c55b-4030-9540-06e38ef714e4" containerID="c1cb41b85a1671773e16e79691210a93389da0e44caa4ba9eec1413b98bb1786" exitCode=0 Oct 02 12:45:03 crc kubenswrapper[4766]: I1002 12:45:03.281201 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" event={"ID":"3eb675bb-c55b-4030-9540-06e38ef714e4","Type":"ContainerDied","Data":"c1cb41b85a1671773e16e79691210a93389da0e44caa4ba9eec1413b98bb1786"} Oct 02 12:45:04 crc kubenswrapper[4766]: I1002 12:45:04.706101 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" Oct 02 12:45:04 crc kubenswrapper[4766]: I1002 12:45:04.866166 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eb675bb-c55b-4030-9540-06e38ef714e4-config-volume\") pod \"3eb675bb-c55b-4030-9540-06e38ef714e4\" (UID: \"3eb675bb-c55b-4030-9540-06e38ef714e4\") " Oct 02 12:45:04 crc kubenswrapper[4766]: I1002 12:45:04.866237 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eb675bb-c55b-4030-9540-06e38ef714e4-secret-volume\") pod \"3eb675bb-c55b-4030-9540-06e38ef714e4\" (UID: \"3eb675bb-c55b-4030-9540-06e38ef714e4\") " Oct 02 12:45:04 crc kubenswrapper[4766]: I1002 12:45:04.866749 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbzhc\" (UniqueName: \"kubernetes.io/projected/3eb675bb-c55b-4030-9540-06e38ef714e4-kube-api-access-sbzhc\") pod \"3eb675bb-c55b-4030-9540-06e38ef714e4\" (UID: \"3eb675bb-c55b-4030-9540-06e38ef714e4\") " Oct 02 12:45:04 crc kubenswrapper[4766]: I1002 12:45:04.867296 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3eb675bb-c55b-4030-9540-06e38ef714e4-config-volume" (OuterVolumeSpecName: "config-volume") pod "3eb675bb-c55b-4030-9540-06e38ef714e4" (UID: "3eb675bb-c55b-4030-9540-06e38ef714e4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 12:45:04 crc kubenswrapper[4766]: I1002 12:45:04.873250 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb675bb-c55b-4030-9540-06e38ef714e4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3eb675bb-c55b-4030-9540-06e38ef714e4" (UID: "3eb675bb-c55b-4030-9540-06e38ef714e4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:45:04 crc kubenswrapper[4766]: I1002 12:45:04.875963 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb675bb-c55b-4030-9540-06e38ef714e4-kube-api-access-sbzhc" (OuterVolumeSpecName: "kube-api-access-sbzhc") pod "3eb675bb-c55b-4030-9540-06e38ef714e4" (UID: "3eb675bb-c55b-4030-9540-06e38ef714e4"). InnerVolumeSpecName "kube-api-access-sbzhc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:45:04 crc kubenswrapper[4766]: I1002 12:45:04.969597 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbzhc\" (UniqueName: \"kubernetes.io/projected/3eb675bb-c55b-4030-9540-06e38ef714e4-kube-api-access-sbzhc\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:04 crc kubenswrapper[4766]: I1002 12:45:04.969646 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3eb675bb-c55b-4030-9540-06e38ef714e4-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:04 crc kubenswrapper[4766]: I1002 12:45:04.969664 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3eb675bb-c55b-4030-9540-06e38ef714e4-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 12:45:05 crc kubenswrapper[4766]: I1002 12:45:05.307965 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" event={"ID":"3eb675bb-c55b-4030-9540-06e38ef714e4","Type":"ContainerDied","Data":"3b7d2ff88e13fb551bb5b8396d19888a8483d9d17a4be31488d24e004209f7d8"} Oct 02 12:45:05 crc kubenswrapper[4766]: I1002 12:45:05.308380 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b7d2ff88e13fb551bb5b8396d19888a8483d9d17a4be31488d24e004209f7d8" Oct 02 12:45:05 crc kubenswrapper[4766]: I1002 12:45:05.308051 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323485-jbz4b" Oct 02 12:45:05 crc kubenswrapper[4766]: I1002 12:45:05.370703 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"] Oct 02 12:45:05 crc kubenswrapper[4766]: I1002 12:45:05.379018 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323440-8bn8n"] Oct 02 12:45:05 crc kubenswrapper[4766]: I1002 12:45:05.902033 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50cd8c1e-373f-4ca6-b413-678459c490f1" path="/var/lib/kubelet/pods/50cd8c1e-373f-4ca6-b413-678459c490f1/volumes" Oct 02 12:45:24 crc kubenswrapper[4766]: I1002 12:45:24.432482 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:45:24 crc kubenswrapper[4766]: I1002 12:45:24.433135 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:45:40 crc kubenswrapper[4766]: I1002 12:45:40.928728 4766 scope.go:117] "RemoveContainer" containerID="5e4b4196a9a0f10f8ef7aea7e40b04b02466fa4eafd1f40b8f1c86e33154c90f" Oct 02 12:45:48 crc kubenswrapper[4766]: I1002 12:45:48.557277 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-84p2g"] Oct 02 12:45:48 crc kubenswrapper[4766]: E1002 12:45:48.558309 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb675bb-c55b-4030-9540-06e38ef714e4" 
containerName="collect-profiles" Oct 02 12:45:48 crc kubenswrapper[4766]: I1002 12:45:48.558323 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb675bb-c55b-4030-9540-06e38ef714e4" containerName="collect-profiles" Oct 02 12:45:48 crc kubenswrapper[4766]: I1002 12:45:48.558597 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb675bb-c55b-4030-9540-06e38ef714e4" containerName="collect-profiles" Oct 02 12:45:48 crc kubenswrapper[4766]: I1002 12:45:48.560604 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84p2g" Oct 02 12:45:48 crc kubenswrapper[4766]: I1002 12:45:48.574402 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-84p2g"] Oct 02 12:45:48 crc kubenswrapper[4766]: I1002 12:45:48.744041 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-utilities\") pod \"certified-operators-84p2g\" (UID: \"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d\") " pod="openshift-marketplace/certified-operators-84p2g" Oct 02 12:45:48 crc kubenswrapper[4766]: I1002 12:45:48.744361 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b2qz\" (UniqueName: \"kubernetes.io/projected/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-kube-api-access-2b2qz\") pod \"certified-operators-84p2g\" (UID: \"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d\") " pod="openshift-marketplace/certified-operators-84p2g" Oct 02 12:45:48 crc kubenswrapper[4766]: I1002 12:45:48.744441 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-catalog-content\") pod \"certified-operators-84p2g\" (UID: \"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d\") " pod="openshift-marketplace/certified-operators-84p2g" Oct 02 12:45:48 crc kubenswrapper[4766]: I1002 12:45:48.846425 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-utilities\") pod \"certified-operators-84p2g\" (UID: \"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d\") " pod="openshift-marketplace/certified-operators-84p2g" Oct 02 12:45:48 crc kubenswrapper[4766]: I1002 12:45:48.846996 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-utilities\") pod \"certified-operators-84p2g\" (UID: \"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d\") " pod="openshift-marketplace/certified-operators-84p2g" Oct 02 12:45:48 crc kubenswrapper[4766]: I1002 12:45:48.847673 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b2qz\" (UniqueName: \"kubernetes.io/projected/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-kube-api-access-2b2qz\") pod \"certified-operators-84p2g\" (UID: \"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d\") " pod="openshift-marketplace/certified-operators-84p2g" Oct 02 12:45:48 crc kubenswrapper[4766]: I1002 12:45:48.847840 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-catalog-content\") pod \"certified-operators-84p2g\" (UID: 
\"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d\") " pod="openshift-marketplace/certified-operators-84p2g" Oct 02 12:45:48 crc kubenswrapper[4766]: I1002 12:45:48.848453 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-catalog-content\") pod \"certified-operators-84p2g\" (UID: \"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d\") " pod="openshift-marketplace/certified-operators-84p2g" Oct 02 12:45:48 crc kubenswrapper[4766]: I1002 12:45:48.878070 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b2qz\" (UniqueName: \"kubernetes.io/projected/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-kube-api-access-2b2qz\") pod \"certified-operators-84p2g\" (UID: \"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d\") " pod="openshift-marketplace/certified-operators-84p2g" Oct 02 12:45:48 crc kubenswrapper[4766]: I1002 12:45:48.896032 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-84p2g" Oct 02 12:45:49 crc kubenswrapper[4766]: I1002 12:45:49.368030 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-84p2g"] Oct 02 12:45:49 crc kubenswrapper[4766]: W1002 12:45:49.377305 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7de7cd3d_8db6_4728_89cf_7b099aaaaf8d.slice/crio-73139570976a1da305889d3b625cac1f51d91b7f5f219815c252a00b9f4e2ad5 WatchSource:0}: Error finding container 73139570976a1da305889d3b625cac1f51d91b7f5f219815c252a00b9f4e2ad5: Status 404 returned error can't find the container with id 73139570976a1da305889d3b625cac1f51d91b7f5f219815c252a00b9f4e2ad5 Oct 02 12:45:49 crc kubenswrapper[4766]: I1002 12:45:49.895980 4766 generic.go:334] "Generic (PLEG): container finished" podID="7de7cd3d-8db6-4728-89cf-7b099aaaaf8d" containerID="334db39b232bd45cefab3f620b0b073928a5ebce0d9ef7f7b79ad7c257192fa5" exitCode=0 Oct 02 12:45:49 crc kubenswrapper[4766]: I1002 12:45:49.896041 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84p2g" event={"ID":"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d","Type":"ContainerDied","Data":"334db39b232bd45cefab3f620b0b073928a5ebce0d9ef7f7b79ad7c257192fa5"} Oct 02 12:45:49 crc kubenswrapper[4766]: I1002 12:45:49.896351 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84p2g" event={"ID":"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d","Type":"ContainerStarted","Data":"73139570976a1da305889d3b625cac1f51d91b7f5f219815c252a00b9f4e2ad5"} Oct 02 12:45:49 crc kubenswrapper[4766]: I1002 12:45:49.898354 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:45:50 crc kubenswrapper[4766]: I1002 12:45:50.910594 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84p2g" event={"ID":"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d","Type":"ContainerStarted","Data":"9e0c5f5d4f08c8d2cbe25c3c7bd801a3071b61bfe812dce5df8a084a2b262687"} Oct 02 12:45:51 crc kubenswrapper[4766]: I1002 12:45:51.922210 4766 generic.go:334] "Generic (PLEG): container finished" podID="7de7cd3d-8db6-4728-89cf-7b099aaaaf8d" containerID="9e0c5f5d4f08c8d2cbe25c3c7bd801a3071b61bfe812dce5df8a084a2b262687" exitCode=0 Oct 02 12:45:51 crc kubenswrapper[4766]: I1002 12:45:51.922354 4766 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-84p2g" event={"ID":"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d","Type":"ContainerDied","Data":"9e0c5f5d4f08c8d2cbe25c3c7bd801a3071b61bfe812dce5df8a084a2b262687"} Oct 02 12:45:52 crc kubenswrapper[4766]: I1002 12:45:52.938761 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84p2g" event={"ID":"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d","Type":"ContainerStarted","Data":"ae168a96fa965819800969fd94252b5ca99b6000d54e8c5f872d44c58c7875c5"} Oct 02 12:45:52 crc kubenswrapper[4766]: I1002 12:45:52.964572 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-84p2g" podStartSLOduration=2.500552558 podStartE2EDuration="4.964554336s" podCreationTimestamp="2025-10-02 12:45:48 +0000 UTC" firstStartedPulling="2025-10-02 12:45:49.89806969 +0000 UTC m=+6864.840940634" lastFinishedPulling="2025-10-02 12:45:52.362071468 +0000 UTC m=+6867.304942412" observedRunningTime="2025-10-02 12:45:52.958842903 +0000 UTC m=+6867.901713907" watchObservedRunningTime="2025-10-02 12:45:52.964554336 +0000 UTC m=+6867.907425280" Oct 02 12:45:54 crc kubenswrapper[4766]: I1002 12:45:54.432446 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:45:54 crc kubenswrapper[4766]: I1002 12:45:54.432815 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:45:54 crc kubenswrapper[4766]: I1002 12:45:54.432954 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 12:45:54 crc kubenswrapper[4766]: I1002 12:45:54.433846 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2b4353ffed00a874e54b842c2e985395b749e0bf31632b418df66f7c022195d"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:45:54 crc kubenswrapper[4766]: I1002 12:45:54.433908 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://c2b4353ffed00a874e54b842c2e985395b749e0bf31632b418df66f7c022195d" gracePeriod=600 Oct 02 12:45:54 crc kubenswrapper[4766]: I1002 12:45:54.960599 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="c2b4353ffed00a874e54b842c2e985395b749e0bf31632b418df66f7c022195d" exitCode=0 Oct 02 12:45:54 crc kubenswrapper[4766]: I1002 12:45:54.960684 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"c2b4353ffed00a874e54b842c2e985395b749e0bf31632b418df66f7c022195d"} Oct 02 12:45:54 
Oct 02 12:45:58 crc kubenswrapper[4766]: I1002 12:45:58.896591 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-84p2g"
Oct 02 12:45:58 crc kubenswrapper[4766]: I1002 12:45:58.897294 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-84p2g"
Oct 02 12:45:59 crc kubenswrapper[4766]: I1002 12:45:59.946437 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-84p2g" podUID="7de7cd3d-8db6-4728-89cf-7b099aaaaf8d" containerName="registry-server" probeResult="failure" output=<
Oct 02 12:45:59 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s
Oct 02 12:45:59 crc kubenswrapper[4766]: >
Oct 02 12:46:08 crc kubenswrapper[4766]: I1002 12:46:08.954066 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-84p2g"
Oct 02 12:46:09 crc kubenswrapper[4766]: I1002 12:46:09.005771 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-84p2g"
Oct 02 12:46:09 crc kubenswrapper[4766]: I1002 12:46:09.194317 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-84p2g"]
Oct 02 12:46:10 crc kubenswrapper[4766]: I1002 12:46:10.148769 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-84p2g" podUID="7de7cd3d-8db6-4728-89cf-7b099aaaaf8d" containerName="registry-server" containerID="cri-o://ae168a96fa965819800969fd94252b5ca99b6000d54e8c5f872d44c58c7875c5" gracePeriod=2
Oct 02 12:46:10 crc kubenswrapper[4766]: I1002 12:46:10.710711 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-84p2g" Oct 02 12:46:10 crc kubenswrapper[4766]: I1002 12:46:10.827310 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b2qz\" (UniqueName: \"kubernetes.io/projected/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-kube-api-access-2b2qz\") pod \"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d\" (UID: \"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d\") " Oct 02 12:46:10 crc kubenswrapper[4766]: I1002 12:46:10.827600 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-utilities\") pod \"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d\" (UID: \"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d\") " Oct 02 12:46:10 crc kubenswrapper[4766]: I1002 12:46:10.827678 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-catalog-content\") pod \"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d\" (UID: \"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d\") " Oct 02 12:46:10 crc kubenswrapper[4766]: I1002 12:46:10.828339 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-utilities" (OuterVolumeSpecName: "utilities") pod "7de7cd3d-8db6-4728-89cf-7b099aaaaf8d" (UID: "7de7cd3d-8db6-4728-89cf-7b099aaaaf8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:46:10 crc kubenswrapper[4766]: I1002 12:46:10.828722 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:10 crc kubenswrapper[4766]: I1002 12:46:10.835404 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-kube-api-access-2b2qz" (OuterVolumeSpecName: "kube-api-access-2b2qz") pod "7de7cd3d-8db6-4728-89cf-7b099aaaaf8d" (UID: "7de7cd3d-8db6-4728-89cf-7b099aaaaf8d"). InnerVolumeSpecName "kube-api-access-2b2qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:46:10 crc kubenswrapper[4766]: I1002 12:46:10.924393 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7de7cd3d-8db6-4728-89cf-7b099aaaaf8d" (UID: "7de7cd3d-8db6-4728-89cf-7b099aaaaf8d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:46:10 crc kubenswrapper[4766]: I1002 12:46:10.932616 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b2qz\" (UniqueName: \"kubernetes.io/projected/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-kube-api-access-2b2qz\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:10 crc kubenswrapper[4766]: I1002 12:46:10.932909 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:46:11 crc kubenswrapper[4766]: I1002 12:46:11.163888 4766 generic.go:334] "Generic (PLEG): container finished" podID="7de7cd3d-8db6-4728-89cf-7b099aaaaf8d" containerID="ae168a96fa965819800969fd94252b5ca99b6000d54e8c5f872d44c58c7875c5" exitCode=0 Oct 02 12:46:11 crc kubenswrapper[4766]: I1002 12:46:11.163936 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84p2g" event={"ID":"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d","Type":"ContainerDied","Data":"ae168a96fa965819800969fd94252b5ca99b6000d54e8c5f872d44c58c7875c5"} Oct 02 12:46:11 crc kubenswrapper[4766]: I1002 12:46:11.163968 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-84p2g" event={"ID":"7de7cd3d-8db6-4728-89cf-7b099aaaaf8d","Type":"ContainerDied","Data":"73139570976a1da305889d3b625cac1f51d91b7f5f219815c252a00b9f4e2ad5"} Oct 02 12:46:11 crc kubenswrapper[4766]: I1002 12:46:11.163995 4766 scope.go:117] "RemoveContainer" containerID="ae168a96fa965819800969fd94252b5ca99b6000d54e8c5f872d44c58c7875c5" Oct 02 12:46:11 crc kubenswrapper[4766]: I1002 12:46:11.164147 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-84p2g" Oct 02 12:46:11 crc kubenswrapper[4766]: I1002 12:46:11.214643 4766 scope.go:117] "RemoveContainer" containerID="9e0c5f5d4f08c8d2cbe25c3c7bd801a3071b61bfe812dce5df8a084a2b262687" Oct 02 12:46:11 crc kubenswrapper[4766]: I1002 12:46:11.239137 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-84p2g"] Oct 02 12:46:11 crc kubenswrapper[4766]: I1002 12:46:11.243940 4766 scope.go:117] "RemoveContainer" containerID="334db39b232bd45cefab3f620b0b073928a5ebce0d9ef7f7b79ad7c257192fa5" Oct 02 12:46:11 crc kubenswrapper[4766]: I1002 12:46:11.252618 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-84p2g"] Oct 02 12:46:11 crc kubenswrapper[4766]: I1002 12:46:11.298227 4766 scope.go:117] "RemoveContainer" containerID="ae168a96fa965819800969fd94252b5ca99b6000d54e8c5f872d44c58c7875c5" Oct 02 12:46:11 crc kubenswrapper[4766]: E1002 12:46:11.298949 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae168a96fa965819800969fd94252b5ca99b6000d54e8c5f872d44c58c7875c5\": container with ID starting with ae168a96fa965819800969fd94252b5ca99b6000d54e8c5f872d44c58c7875c5 not found: ID does not exist" containerID="ae168a96fa965819800969fd94252b5ca99b6000d54e8c5f872d44c58c7875c5" Oct 02 12:46:11 crc kubenswrapper[4766]: I1002 12:46:11.298992 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae168a96fa965819800969fd94252b5ca99b6000d54e8c5f872d44c58c7875c5"} err="failed to get container status \"ae168a96fa965819800969fd94252b5ca99b6000d54e8c5f872d44c58c7875c5\": rpc error: code = NotFound desc = could not find container \"ae168a96fa965819800969fd94252b5ca99b6000d54e8c5f872d44c58c7875c5\": container with ID starting with ae168a96fa965819800969fd94252b5ca99b6000d54e8c5f872d44c58c7875c5 not found: ID does not exist" Oct 02 12:46:11 crc kubenswrapper[4766]: I1002 12:46:11.299022 4766 scope.go:117] "RemoveContainer" containerID="9e0c5f5d4f08c8d2cbe25c3c7bd801a3071b61bfe812dce5df8a084a2b262687" Oct 02 12:46:11 crc kubenswrapper[4766]: E1002 12:46:11.299459 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e0c5f5d4f08c8d2cbe25c3c7bd801a3071b61bfe812dce5df8a084a2b262687\": container with ID starting with 9e0c5f5d4f08c8d2cbe25c3c7bd801a3071b61bfe812dce5df8a084a2b262687 not found: ID does not exist" containerID="9e0c5f5d4f08c8d2cbe25c3c7bd801a3071b61bfe812dce5df8a084a2b262687" Oct 02 12:46:11 crc kubenswrapper[4766]: I1002 12:46:11.299484 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0c5f5d4f08c8d2cbe25c3c7bd801a3071b61bfe812dce5df8a084a2b262687"} err="failed to get container status \"9e0c5f5d4f08c8d2cbe25c3c7bd801a3071b61bfe812dce5df8a084a2b262687\": rpc error: code = NotFound desc = could not find container \"9e0c5f5d4f08c8d2cbe25c3c7bd801a3071b61bfe812dce5df8a084a2b262687\": container with ID starting with 9e0c5f5d4f08c8d2cbe25c3c7bd801a3071b61bfe812dce5df8a084a2b262687 not found: ID does not exist" Oct 02 12:46:11 crc kubenswrapper[4766]: I1002 12:46:11.299516 4766 scope.go:117] "RemoveContainer" containerID="334db39b232bd45cefab3f620b0b073928a5ebce0d9ef7f7b79ad7c257192fa5" Oct 02 12:46:11 crc kubenswrapper[4766]: E1002 12:46:11.300855 4766 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"334db39b232bd45cefab3f620b0b073928a5ebce0d9ef7f7b79ad7c257192fa5\": container with ID starting with 334db39b232bd45cefab3f620b0b073928a5ebce0d9ef7f7b79ad7c257192fa5 not found: ID does not exist" containerID="334db39b232bd45cefab3f620b0b073928a5ebce0d9ef7f7b79ad7c257192fa5" Oct 02 12:46:11 crc kubenswrapper[4766]: I1002 12:46:11.300931 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"334db39b232bd45cefab3f620b0b073928a5ebce0d9ef7f7b79ad7c257192fa5"} err="failed to get container status \"334db39b232bd45cefab3f620b0b073928a5ebce0d9ef7f7b79ad7c257192fa5\": rpc error: code = NotFound desc = could not find container \"334db39b232bd45cefab3f620b0b073928a5ebce0d9ef7f7b79ad7c257192fa5\": container with ID starting with 334db39b232bd45cefab3f620b0b073928a5ebce0d9ef7f7b79ad7c257192fa5 not found: ID does not exist" Oct 02 12:46:11 crc kubenswrapper[4766]: I1002 12:46:11.905850 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7de7cd3d-8db6-4728-89cf-7b099aaaaf8d" path="/var/lib/kubelet/pods/7de7cd3d-8db6-4728-89cf-7b099aaaaf8d/volumes" Oct 02 12:47:06 crc kubenswrapper[4766]: I1002 12:47:06.055929 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-zff98"] Oct 02 12:47:06 crc kubenswrapper[4766]: I1002 12:47:06.066794 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-zff98"] Oct 02 12:47:07 crc kubenswrapper[4766]: I1002 12:47:07.900836 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c2cafd-3943-4964-a407-348c81b0b416" path="/var/lib/kubelet/pods/98c2cafd-3943-4964-a407-348c81b0b416/volumes" Oct 02 12:47:16 crc kubenswrapper[4766]: I1002 12:47:16.045769 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-082b-account-create-nshkm"] Oct 02 12:47:16 crc kubenswrapper[4766]: I1002 12:47:16.066591 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-082b-account-create-nshkm"] Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.252419 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tdbb9"] Oct 02 12:47:17 crc kubenswrapper[4766]: E1002 12:47:17.253341 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de7cd3d-8db6-4728-89cf-7b099aaaaf8d" containerName="registry-server" Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.253354 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de7cd3d-8db6-4728-89cf-7b099aaaaf8d" containerName="registry-server" Oct 02 12:47:17 crc kubenswrapper[4766]: E1002 12:47:17.253405 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de7cd3d-8db6-4728-89cf-7b099aaaaf8d" containerName="extract-content" Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.253411 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de7cd3d-8db6-4728-89cf-7b099aaaaf8d" containerName="extract-content" Oct 02 12:47:17 crc kubenswrapper[4766]: E1002 12:47:17.253423 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de7cd3d-8db6-4728-89cf-7b099aaaaf8d" containerName="extract-utilities" Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.253430 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de7cd3d-8db6-4728-89cf-7b099aaaaf8d" containerName="extract-utilities" Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.253643 4766 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7de7cd3d-8db6-4728-89cf-7b099aaaaf8d" containerName="registry-server" Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.255362 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.267550 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdbb9"] Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.406531 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2vxp\" (UniqueName: \"kubernetes.io/projected/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-kube-api-access-f2vxp\") pod \"redhat-marketplace-tdbb9\" (UID: \"0b1392b3-c23f-4c0a-8552-cbad78ae07bb\") " pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.406591 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-utilities\") pod \"redhat-marketplace-tdbb9\" (UID: \"0b1392b3-c23f-4c0a-8552-cbad78ae07bb\") " pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.406615 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-catalog-content\") pod \"redhat-marketplace-tdbb9\" (UID: \"0b1392b3-c23f-4c0a-8552-cbad78ae07bb\") " pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.509409 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2vxp\" (UniqueName: \"kubernetes.io/projected/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-kube-api-access-f2vxp\") pod \"redhat-marketplace-tdbb9\" (UID: \"0b1392b3-c23f-4c0a-8552-cbad78ae07bb\") " pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.509496 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-utilities\") pod \"redhat-marketplace-tdbb9\" (UID: \"0b1392b3-c23f-4c0a-8552-cbad78ae07bb\") " pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.509544 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-catalog-content\") pod \"redhat-marketplace-tdbb9\" (UID: \"0b1392b3-c23f-4c0a-8552-cbad78ae07bb\") " pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.510281 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-catalog-content\") pod \"redhat-marketplace-tdbb9\" (UID: \"0b1392b3-c23f-4c0a-8552-cbad78ae07bb\") " pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.510588 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-utilities\") pod 
\"redhat-marketplace-tdbb9\" (UID: \"0b1392b3-c23f-4c0a-8552-cbad78ae07bb\") " pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.548472 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2vxp\" (UniqueName: \"kubernetes.io/projected/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-kube-api-access-f2vxp\") pod \"redhat-marketplace-tdbb9\" (UID: \"0b1392b3-c23f-4c0a-8552-cbad78ae07bb\") " pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.600625 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:17 crc kubenswrapper[4766]: I1002 12:47:17.901437 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86e9a00c-0b12-4517-8018-8164a05fac41" path="/var/lib/kubelet/pods/86e9a00c-0b12-4517-8018-8164a05fac41/volumes" Oct 02 12:47:18 crc kubenswrapper[4766]: I1002 12:47:18.098212 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdbb9"] Oct 02 12:47:18 crc kubenswrapper[4766]: I1002 12:47:18.976050 4766 generic.go:334] "Generic (PLEG): container finished" podID="0b1392b3-c23f-4c0a-8552-cbad78ae07bb" containerID="e640eb01e272907f04a30ac65d5d437a5536159287e4a13a9ed3f3ca8983f497" exitCode=0 Oct 02 12:47:18 crc kubenswrapper[4766]: I1002 12:47:18.976194 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdbb9" event={"ID":"0b1392b3-c23f-4c0a-8552-cbad78ae07bb","Type":"ContainerDied","Data":"e640eb01e272907f04a30ac65d5d437a5536159287e4a13a9ed3f3ca8983f497"} Oct 02 12:47:18 crc kubenswrapper[4766]: I1002 12:47:18.976608 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdbb9" event={"ID":"0b1392b3-c23f-4c0a-8552-cbad78ae07bb","Type":"ContainerStarted","Data":"2f6fcca5f618e1fc6032479d7463e33b6757477c7289c061eb30de6341072078"} Oct 02 12:47:19 crc kubenswrapper[4766]: I1002 12:47:19.987363 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdbb9" event={"ID":"0b1392b3-c23f-4c0a-8552-cbad78ae07bb","Type":"ContainerStarted","Data":"9bdbfaaa06a19c8432225e0c10bf8d43ff39a76ec1cced1bd7c0da41e01bfd4a"} Oct 02 12:47:20 crc kubenswrapper[4766]: I1002 12:47:20.998763 4766 generic.go:334] "Generic (PLEG): container finished" podID="0b1392b3-c23f-4c0a-8552-cbad78ae07bb" containerID="9bdbfaaa06a19c8432225e0c10bf8d43ff39a76ec1cced1bd7c0da41e01bfd4a" exitCode=0 Oct 02 12:47:20 crc kubenswrapper[4766]: I1002 12:47:20.998860 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdbb9" event={"ID":"0b1392b3-c23f-4c0a-8552-cbad78ae07bb","Type":"ContainerDied","Data":"9bdbfaaa06a19c8432225e0c10bf8d43ff39a76ec1cced1bd7c0da41e01bfd4a"} Oct 02 12:47:22 crc kubenswrapper[4766]: I1002 12:47:22.018067 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdbb9" event={"ID":"0b1392b3-c23f-4c0a-8552-cbad78ae07bb","Type":"ContainerStarted","Data":"3cf7623f35fa4fa6ff159b9335f33cc521875b4f0afce409be78e7fe6bfb9071"} Oct 02 12:47:22 crc kubenswrapper[4766]: I1002 12:47:22.040712 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tdbb9" podStartSLOduration=2.554345278 podStartE2EDuration="5.040689431s" podCreationTimestamp="2025-10-02 
12:47:17 +0000 UTC" firstStartedPulling="2025-10-02 12:47:18.978123101 +0000 UTC m=+6953.920994035" lastFinishedPulling="2025-10-02 12:47:21.464467244 +0000 UTC m=+6956.407338188" observedRunningTime="2025-10-02 12:47:22.037081085 +0000 UTC m=+6956.979952049" watchObservedRunningTime="2025-10-02 12:47:22.040689431 +0000 UTC m=+6956.983560395" Oct 02 12:47:27 crc kubenswrapper[4766]: I1002 12:47:27.602152 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:27 crc kubenswrapper[4766]: I1002 12:47:27.602772 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:27 crc kubenswrapper[4766]: I1002 12:47:27.651864 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:28 crc kubenswrapper[4766]: I1002 12:47:28.148517 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:28 crc kubenswrapper[4766]: I1002 12:47:28.209621 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdbb9"] Oct 02 12:47:30 crc kubenswrapper[4766]: I1002 12:47:30.033516 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-h4jks"] Oct 02 12:47:30 crc kubenswrapper[4766]: I1002 12:47:30.042447 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-h4jks"] Oct 02 12:47:30 crc kubenswrapper[4766]: I1002 12:47:30.105042 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tdbb9" podUID="0b1392b3-c23f-4c0a-8552-cbad78ae07bb" containerName="registry-server" containerID="cri-o://3cf7623f35fa4fa6ff159b9335f33cc521875b4f0afce409be78e7fe6bfb9071" gracePeriod=2 Oct 02 12:47:30 crc kubenswrapper[4766]: I1002 12:47:30.675663 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:30 crc kubenswrapper[4766]: I1002 12:47:30.745928 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-catalog-content\") pod \"0b1392b3-c23f-4c0a-8552-cbad78ae07bb\" (UID: \"0b1392b3-c23f-4c0a-8552-cbad78ae07bb\") " Oct 02 12:47:30 crc kubenswrapper[4766]: I1002 12:47:30.746092 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2vxp\" (UniqueName: \"kubernetes.io/projected/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-kube-api-access-f2vxp\") pod \"0b1392b3-c23f-4c0a-8552-cbad78ae07bb\" (UID: \"0b1392b3-c23f-4c0a-8552-cbad78ae07bb\") " Oct 02 12:47:30 crc kubenswrapper[4766]: I1002 12:47:30.746202 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-utilities\") pod \"0b1392b3-c23f-4c0a-8552-cbad78ae07bb\" (UID: \"0b1392b3-c23f-4c0a-8552-cbad78ae07bb\") " Oct 02 12:47:30 crc kubenswrapper[4766]: I1002 12:47:30.747369 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-utilities" (OuterVolumeSpecName: "utilities") pod "0b1392b3-c23f-4c0a-8552-cbad78ae07bb" (UID: "0b1392b3-c23f-4c0a-8552-cbad78ae07bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:47:30 crc kubenswrapper[4766]: I1002 12:47:30.757800 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-kube-api-access-f2vxp" (OuterVolumeSpecName: "kube-api-access-f2vxp") pod "0b1392b3-c23f-4c0a-8552-cbad78ae07bb" (UID: "0b1392b3-c23f-4c0a-8552-cbad78ae07bb"). InnerVolumeSpecName "kube-api-access-f2vxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:47:30 crc kubenswrapper[4766]: I1002 12:47:30.773474 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b1392b3-c23f-4c0a-8552-cbad78ae07bb" (UID: "0b1392b3-c23f-4c0a-8552-cbad78ae07bb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:47:30 crc kubenswrapper[4766]: I1002 12:47:30.850436 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2vxp\" (UniqueName: \"kubernetes.io/projected/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-kube-api-access-f2vxp\") on node \"crc\" DevicePath \"\"" Oct 02 12:47:30 crc kubenswrapper[4766]: I1002 12:47:30.851028 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:47:30 crc kubenswrapper[4766]: I1002 12:47:30.851139 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b1392b3-c23f-4c0a-8552-cbad78ae07bb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.114665 4766 generic.go:334] "Generic (PLEG): container finished" podID="0b1392b3-c23f-4c0a-8552-cbad78ae07bb" containerID="3cf7623f35fa4fa6ff159b9335f33cc521875b4f0afce409be78e7fe6bfb9071" exitCode=0 Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.114713 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdbb9" event={"ID":"0b1392b3-c23f-4c0a-8552-cbad78ae07bb","Type":"ContainerDied","Data":"3cf7623f35fa4fa6ff159b9335f33cc521875b4f0afce409be78e7fe6bfb9071"} Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.114746 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tdbb9" event={"ID":"0b1392b3-c23f-4c0a-8552-cbad78ae07bb","Type":"ContainerDied","Data":"2f6fcca5f618e1fc6032479d7463e33b6757477c7289c061eb30de6341072078"} Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.114767 4766 scope.go:117] "RemoveContainer" containerID="3cf7623f35fa4fa6ff159b9335f33cc521875b4f0afce409be78e7fe6bfb9071" Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.114933 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tdbb9" Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.139519 4766 scope.go:117] "RemoveContainer" containerID="9bdbfaaa06a19c8432225e0c10bf8d43ff39a76ec1cced1bd7c0da41e01bfd4a" Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.151220 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdbb9"] Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.160787 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tdbb9"] Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.183327 4766 scope.go:117] "RemoveContainer" containerID="e640eb01e272907f04a30ac65d5d437a5536159287e4a13a9ed3f3ca8983f497" Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.242300 4766 scope.go:117] "RemoveContainer" containerID="3cf7623f35fa4fa6ff159b9335f33cc521875b4f0afce409be78e7fe6bfb9071" Oct 02 12:47:31 crc kubenswrapper[4766]: E1002 12:47:31.242947 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cf7623f35fa4fa6ff159b9335f33cc521875b4f0afce409be78e7fe6bfb9071\": container with ID starting with 3cf7623f35fa4fa6ff159b9335f33cc521875b4f0afce409be78e7fe6bfb9071 not found: ID does not exist" containerID="3cf7623f35fa4fa6ff159b9335f33cc521875b4f0afce409be78e7fe6bfb9071" Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.242987 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cf7623f35fa4fa6ff159b9335f33cc521875b4f0afce409be78e7fe6bfb9071"} err="failed to get container status \"3cf7623f35fa4fa6ff159b9335f33cc521875b4f0afce409be78e7fe6bfb9071\": rpc error: code = NotFound desc = could not find container \"3cf7623f35fa4fa6ff159b9335f33cc521875b4f0afce409be78e7fe6bfb9071\": container with ID starting with 3cf7623f35fa4fa6ff159b9335f33cc521875b4f0afce409be78e7fe6bfb9071 not found: ID does not exist" Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.243016 4766 scope.go:117] "RemoveContainer" containerID="9bdbfaaa06a19c8432225e0c10bf8d43ff39a76ec1cced1bd7c0da41e01bfd4a" Oct 02 12:47:31 crc kubenswrapper[4766]: E1002 12:47:31.243552 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bdbfaaa06a19c8432225e0c10bf8d43ff39a76ec1cced1bd7c0da41e01bfd4a\": container with ID starting with 9bdbfaaa06a19c8432225e0c10bf8d43ff39a76ec1cced1bd7c0da41e01bfd4a not found: ID does not exist" containerID="9bdbfaaa06a19c8432225e0c10bf8d43ff39a76ec1cced1bd7c0da41e01bfd4a" Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.243608 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bdbfaaa06a19c8432225e0c10bf8d43ff39a76ec1cced1bd7c0da41e01bfd4a"} err="failed to get container status \"9bdbfaaa06a19c8432225e0c10bf8d43ff39a76ec1cced1bd7c0da41e01bfd4a\": rpc error: code = NotFound desc = could not find container \"9bdbfaaa06a19c8432225e0c10bf8d43ff39a76ec1cced1bd7c0da41e01bfd4a\": container with ID starting with 9bdbfaaa06a19c8432225e0c10bf8d43ff39a76ec1cced1bd7c0da41e01bfd4a not found: ID does not exist" Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.243645 4766 scope.go:117] "RemoveContainer" containerID="e640eb01e272907f04a30ac65d5d437a5536159287e4a13a9ed3f3ca8983f497" Oct 02 12:47:31 crc kubenswrapper[4766]: E1002 12:47:31.244458 4766 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e640eb01e272907f04a30ac65d5d437a5536159287e4a13a9ed3f3ca8983f497\": container with ID starting with e640eb01e272907f04a30ac65d5d437a5536159287e4a13a9ed3f3ca8983f497 not found: ID does not exist" containerID="e640eb01e272907f04a30ac65d5d437a5536159287e4a13a9ed3f3ca8983f497" Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.244708 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e640eb01e272907f04a30ac65d5d437a5536159287e4a13a9ed3f3ca8983f497"} err="failed to get container status \"e640eb01e272907f04a30ac65d5d437a5536159287e4a13a9ed3f3ca8983f497\": rpc error: code = NotFound desc = could not find container \"e640eb01e272907f04a30ac65d5d437a5536159287e4a13a9ed3f3ca8983f497\": container with ID starting with e640eb01e272907f04a30ac65d5d437a5536159287e4a13a9ed3f3ca8983f497 not found: ID does not exist" Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.898857 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b1392b3-c23f-4c0a-8552-cbad78ae07bb" path="/var/lib/kubelet/pods/0b1392b3-c23f-4c0a-8552-cbad78ae07bb/volumes" Oct 02 12:47:31 crc kubenswrapper[4766]: I1002 12:47:31.900247 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46aae5f7-366f-4467-bc67-e662384d164b" path="/var/lib/kubelet/pods/46aae5f7-366f-4467-bc67-e662384d164b/volumes" Oct 02 12:47:41 crc kubenswrapper[4766]: I1002 12:47:41.033167 4766 scope.go:117] "RemoveContainer" containerID="3c6912673af50e25395766b8fd03ba85852936101dd2472604fdbf19b2c441ea" Oct 02 12:47:41 crc kubenswrapper[4766]: I1002 12:47:41.076258 4766 scope.go:117] "RemoveContainer" containerID="ec51ccd2e26d2c1a0aafcb13a06f150dbd55dfa6c7812bacfbe1c4a372a51dd2" Oct 02 12:47:41 crc kubenswrapper[4766]: I1002 12:47:41.142463 4766 scope.go:117] "RemoveContainer" containerID="14845fb6abdb72bc7cf94cb0758b8f225bb4fb3175f2ae8aec5c870ee4adf61e" Oct 02 12:47:54 crc kubenswrapper[4766]: I1002 12:47:54.431960 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:47:54 crc kubenswrapper[4766]: I1002 12:47:54.432677 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:48:24 crc kubenswrapper[4766]: I1002 12:48:24.432335 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:48:24 crc kubenswrapper[4766]: I1002 12:48:24.433127 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.433689 4766 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-96grk"] Oct 02 12:48:37 crc kubenswrapper[4766]: E1002 12:48:37.434720 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b1392b3-c23f-4c0a-8552-cbad78ae07bb" containerName="registry-server" Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.434735 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1392b3-c23f-4c0a-8552-cbad78ae07bb" containerName="registry-server" Oct 02 12:48:37 crc kubenswrapper[4766]: E1002 12:48:37.434750 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b1392b3-c23f-4c0a-8552-cbad78ae07bb" containerName="extract-utilities" Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.434759 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1392b3-c23f-4c0a-8552-cbad78ae07bb" containerName="extract-utilities" Oct 02 12:48:37 crc kubenswrapper[4766]: E1002 12:48:37.434784 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b1392b3-c23f-4c0a-8552-cbad78ae07bb" containerName="extract-content" Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.434793 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1392b3-c23f-4c0a-8552-cbad78ae07bb" containerName="extract-content" Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.435098 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b1392b3-c23f-4c0a-8552-cbad78ae07bb" containerName="registry-server" Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.456736 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.511697 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-96grk"] Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.564602 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b959d503-9548-45b0-a676-efa68d97657b-catalog-content\") pod \"redhat-operators-96grk\" (UID: \"b959d503-9548-45b0-a676-efa68d97657b\") " pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.564788 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w62qf\" (UniqueName: \"kubernetes.io/projected/b959d503-9548-45b0-a676-efa68d97657b-kube-api-access-w62qf\") pod \"redhat-operators-96grk\" (UID: \"b959d503-9548-45b0-a676-efa68d97657b\") " pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.564925 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b959d503-9548-45b0-a676-efa68d97657b-utilities\") pod \"redhat-operators-96grk\" (UID: \"b959d503-9548-45b0-a676-efa68d97657b\") " pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.666980 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b959d503-9548-45b0-a676-efa68d97657b-catalog-content\") pod \"redhat-operators-96grk\" (UID: \"b959d503-9548-45b0-a676-efa68d97657b\") " pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.667129 4766 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w62qf\" (UniqueName: \"kubernetes.io/projected/b959d503-9548-45b0-a676-efa68d97657b-kube-api-access-w62qf\") pod \"redhat-operators-96grk\" (UID: \"b959d503-9548-45b0-a676-efa68d97657b\") " pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.667174 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b959d503-9548-45b0-a676-efa68d97657b-utilities\") pod \"redhat-operators-96grk\" (UID: \"b959d503-9548-45b0-a676-efa68d97657b\") " pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.667625 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b959d503-9548-45b0-a676-efa68d97657b-catalog-content\") pod \"redhat-operators-96grk\" (UID: \"b959d503-9548-45b0-a676-efa68d97657b\") " pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.667654 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b959d503-9548-45b0-a676-efa68d97657b-utilities\") pod \"redhat-operators-96grk\" (UID: \"b959d503-9548-45b0-a676-efa68d97657b\") " pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.700783 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w62qf\" (UniqueName: \"kubernetes.io/projected/b959d503-9548-45b0-a676-efa68d97657b-kube-api-access-w62qf\") pod \"redhat-operators-96grk\" (UID: \"b959d503-9548-45b0-a676-efa68d97657b\") " pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:48:37 crc kubenswrapper[4766]: I1002 12:48:37.791153 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:48:38 crc kubenswrapper[4766]: I1002 12:48:38.307691 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-96grk"] Oct 02 12:48:38 crc kubenswrapper[4766]: I1002 12:48:38.822275 4766 generic.go:334] "Generic (PLEG): container finished" podID="b959d503-9548-45b0-a676-efa68d97657b" containerID="98184696db0e1a4faf31b0c5c2a2cfd100838ffb7ed974d8aa55264142d15c55" exitCode=0 Oct 02 12:48:38 crc kubenswrapper[4766]: I1002 12:48:38.822360 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96grk" event={"ID":"b959d503-9548-45b0-a676-efa68d97657b","Type":"ContainerDied","Data":"98184696db0e1a4faf31b0c5c2a2cfd100838ffb7ed974d8aa55264142d15c55"} Oct 02 12:48:38 crc kubenswrapper[4766]: I1002 12:48:38.823446 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96grk" event={"ID":"b959d503-9548-45b0-a676-efa68d97657b","Type":"ContainerStarted","Data":"b56ba9371506c6cd8bfc1ad90793e86d2dac571c4e7c725212ce2a62e4a74c1b"} Oct 02 12:48:41 crc kubenswrapper[4766]: I1002 12:48:40.852105 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96grk" event={"ID":"b959d503-9548-45b0-a676-efa68d97657b","Type":"ContainerStarted","Data":"f36594f2c1a0b6c633832b026c30d80fcd16197f34b12e2a556af418e31304ab"} Oct 02 12:48:44 crc kubenswrapper[4766]: I1002 12:48:44.901054 4766 generic.go:334] "Generic (PLEG): container finished" podID="b959d503-9548-45b0-a676-efa68d97657b" containerID="f36594f2c1a0b6c633832b026c30d80fcd16197f34b12e2a556af418e31304ab" exitCode=0 Oct 02 12:48:44 crc kubenswrapper[4766]: I1002 12:48:44.901148 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96grk" event={"ID":"b959d503-9548-45b0-a676-efa68d97657b","Type":"ContainerDied","Data":"f36594f2c1a0b6c633832b026c30d80fcd16197f34b12e2a556af418e31304ab"} Oct 02 12:48:45 crc kubenswrapper[4766]: I1002 12:48:45.912532 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96grk" event={"ID":"b959d503-9548-45b0-a676-efa68d97657b","Type":"ContainerStarted","Data":"22c7c1adbeb986b68bc988a64be249f3386ced0051f65ae9bba68d4425230839"} Oct 02 12:48:45 crc kubenswrapper[4766]: I1002 12:48:45.930857 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-96grk" podStartSLOduration=2.224571564 podStartE2EDuration="8.930833709s" podCreationTimestamp="2025-10-02 12:48:37 +0000 UTC" firstStartedPulling="2025-10-02 12:48:38.824734357 +0000 UTC m=+7033.767605301" lastFinishedPulling="2025-10-02 12:48:45.530996502 +0000 UTC m=+7040.473867446" observedRunningTime="2025-10-02 12:48:45.929540997 +0000 UTC m=+7040.872411981" watchObservedRunningTime="2025-10-02 12:48:45.930833709 +0000 UTC m=+7040.873704653" Oct 02 12:48:47 crc kubenswrapper[4766]: I1002 12:48:47.791409 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:48:47 crc kubenswrapper[4766]: I1002 12:48:47.791701 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:48:48 crc kubenswrapper[4766]: I1002 12:48:48.844495 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-96grk" 
podUID="b959d503-9548-45b0-a676-efa68d97657b" containerName="registry-server" probeResult="failure" output=< Oct 02 12:48:48 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Oct 02 12:48:48 crc kubenswrapper[4766]: > Oct 02 12:48:54 crc kubenswrapper[4766]: I1002 12:48:54.432148 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:48:54 crc kubenswrapper[4766]: I1002 12:48:54.432820 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:48:54 crc kubenswrapper[4766]: I1002 12:48:54.432885 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 12:48:54 crc kubenswrapper[4766]: I1002 12:48:54.433930 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 12:48:54 crc kubenswrapper[4766]: I1002 12:48:54.434007 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" gracePeriod=600 Oct 02 12:48:54 crc kubenswrapper[4766]: E1002 12:48:54.557215 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:48:55 crc kubenswrapper[4766]: I1002 12:48:55.005656 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" exitCode=0 Oct 02 12:48:55 crc kubenswrapper[4766]: I1002 12:48:55.005753 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089"} Oct 02 12:48:55 crc kubenswrapper[4766]: I1002 12:48:55.005836 4766 scope.go:117] "RemoveContainer" containerID="c2b4353ffed00a874e54b842c2e985395b749e0bf31632b418df66f7c022195d" Oct 02 12:48:55 crc kubenswrapper[4766]: I1002 12:48:55.007342 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:48:55 crc kubenswrapper[4766]: E1002 12:48:55.007724 4766 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:48:58 crc kubenswrapper[4766]: I1002 12:48:58.844940 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-96grk" podUID="b959d503-9548-45b0-a676-efa68d97657b" containerName="registry-server" probeResult="failure" output=< Oct 02 12:48:58 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Oct 02 12:48:58 crc kubenswrapper[4766]: > Oct 02 12:49:06 crc kubenswrapper[4766]: I1002 12:49:06.882330 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:49:06 crc kubenswrapper[4766]: E1002 12:49:06.884303 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:49:08 crc kubenswrapper[4766]: I1002 12:49:08.853438 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-96grk" podUID="b959d503-9548-45b0-a676-efa68d97657b" containerName="registry-server" probeResult="failure" output=< Oct 02 12:49:08 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Oct 02 12:49:08 crc kubenswrapper[4766]: > Oct 02 12:49:17 crc kubenswrapper[4766]: I1002 12:49:17.848329 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:49:17 crc kubenswrapper[4766]: I1002 12:49:17.924741 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:49:18 crc kubenswrapper[4766]: I1002 12:49:18.091178 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-96grk"] Oct 02 12:49:19 crc kubenswrapper[4766]: I1002 12:49:19.274283 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-96grk" podUID="b959d503-9548-45b0-a676-efa68d97657b" containerName="registry-server" containerID="cri-o://22c7c1adbeb986b68bc988a64be249f3386ced0051f65ae9bba68d4425230839" gracePeriod=2 Oct 02 12:49:19 crc kubenswrapper[4766]: I1002 12:49:19.806726 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:49:19 crc kubenswrapper[4766]: I1002 12:49:19.964596 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b959d503-9548-45b0-a676-efa68d97657b-catalog-content\") pod \"b959d503-9548-45b0-a676-efa68d97657b\" (UID: \"b959d503-9548-45b0-a676-efa68d97657b\") " Oct 02 12:49:19 crc kubenswrapper[4766]: I1002 12:49:19.964701 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b959d503-9548-45b0-a676-efa68d97657b-utilities\") pod \"b959d503-9548-45b0-a676-efa68d97657b\" (UID: \"b959d503-9548-45b0-a676-efa68d97657b\") " Oct 02 12:49:19 crc kubenswrapper[4766]: I1002 12:49:19.964733 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w62qf\" (UniqueName: \"kubernetes.io/projected/b959d503-9548-45b0-a676-efa68d97657b-kube-api-access-w62qf\") pod \"b959d503-9548-45b0-a676-efa68d97657b\" (UID: \"b959d503-9548-45b0-a676-efa68d97657b\") " Oct 02 12:49:19 crc kubenswrapper[4766]: I1002 12:49:19.966723 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b959d503-9548-45b0-a676-efa68d97657b-utilities" (OuterVolumeSpecName: "utilities") pod "b959d503-9548-45b0-a676-efa68d97657b" (UID: "b959d503-9548-45b0-a676-efa68d97657b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:49:19 crc kubenswrapper[4766]: I1002 12:49:19.967573 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b959d503-9548-45b0-a676-efa68d97657b-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:19 crc kubenswrapper[4766]: I1002 12:49:19.972408 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b959d503-9548-45b0-a676-efa68d97657b-kube-api-access-w62qf" (OuterVolumeSpecName: "kube-api-access-w62qf") pod "b959d503-9548-45b0-a676-efa68d97657b" (UID: "b959d503-9548-45b0-a676-efa68d97657b"). InnerVolumeSpecName "kube-api-access-w62qf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.048673 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b959d503-9548-45b0-a676-efa68d97657b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b959d503-9548-45b0-a676-efa68d97657b" (UID: "b959d503-9548-45b0-a676-efa68d97657b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.069334 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b959d503-9548-45b0-a676-efa68d97657b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.069362 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w62qf\" (UniqueName: \"kubernetes.io/projected/b959d503-9548-45b0-a676-efa68d97657b-kube-api-access-w62qf\") on node \"crc\" DevicePath \"\"" Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.290120 4766 generic.go:334] "Generic (PLEG): container finished" podID="b959d503-9548-45b0-a676-efa68d97657b" containerID="22c7c1adbeb986b68bc988a64be249f3386ced0051f65ae9bba68d4425230839" exitCode=0 Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.290184 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96grk" event={"ID":"b959d503-9548-45b0-a676-efa68d97657b","Type":"ContainerDied","Data":"22c7c1adbeb986b68bc988a64be249f3386ced0051f65ae9bba68d4425230839"} Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.290198 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-96grk" Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.290231 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-96grk" event={"ID":"b959d503-9548-45b0-a676-efa68d97657b","Type":"ContainerDied","Data":"b56ba9371506c6cd8bfc1ad90793e86d2dac571c4e7c725212ce2a62e4a74c1b"} Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.290260 4766 scope.go:117] "RemoveContainer" containerID="22c7c1adbeb986b68bc988a64be249f3386ced0051f65ae9bba68d4425230839" Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.325347 4766 scope.go:117] "RemoveContainer" containerID="f36594f2c1a0b6c633832b026c30d80fcd16197f34b12e2a556af418e31304ab" Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.345263 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-96grk"] Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.352782 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-96grk"] Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.361453 4766 scope.go:117] "RemoveContainer" containerID="98184696db0e1a4faf31b0c5c2a2cfd100838ffb7ed974d8aa55264142d15c55" Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.429535 4766 scope.go:117] "RemoveContainer" containerID="22c7c1adbeb986b68bc988a64be249f3386ced0051f65ae9bba68d4425230839" Oct 02 12:49:20 crc kubenswrapper[4766]: E1002 12:49:20.429974 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22c7c1adbeb986b68bc988a64be249f3386ced0051f65ae9bba68d4425230839\": container with ID starting with 22c7c1adbeb986b68bc988a64be249f3386ced0051f65ae9bba68d4425230839 not found: ID does not exist" containerID="22c7c1adbeb986b68bc988a64be249f3386ced0051f65ae9bba68d4425230839" Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.430058 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c7c1adbeb986b68bc988a64be249f3386ced0051f65ae9bba68d4425230839"} err="failed to get container status \"22c7c1adbeb986b68bc988a64be249f3386ced0051f65ae9bba68d4425230839\": 
rpc error: code = NotFound desc = could not find container \"22c7c1adbeb986b68bc988a64be249f3386ced0051f65ae9bba68d4425230839\": container with ID starting with 22c7c1adbeb986b68bc988a64be249f3386ced0051f65ae9bba68d4425230839 not found: ID does not exist" Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.430092 4766 scope.go:117] "RemoveContainer" containerID="f36594f2c1a0b6c633832b026c30d80fcd16197f34b12e2a556af418e31304ab" Oct 02 12:49:20 crc kubenswrapper[4766]: E1002 12:49:20.430448 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f36594f2c1a0b6c633832b026c30d80fcd16197f34b12e2a556af418e31304ab\": container with ID starting with f36594f2c1a0b6c633832b026c30d80fcd16197f34b12e2a556af418e31304ab not found: ID does not exist" containerID="f36594f2c1a0b6c633832b026c30d80fcd16197f34b12e2a556af418e31304ab" Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.430519 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f36594f2c1a0b6c633832b026c30d80fcd16197f34b12e2a556af418e31304ab"} err="failed to get container status \"f36594f2c1a0b6c633832b026c30d80fcd16197f34b12e2a556af418e31304ab\": rpc error: code = NotFound desc = could not find container \"f36594f2c1a0b6c633832b026c30d80fcd16197f34b12e2a556af418e31304ab\": container with ID starting with f36594f2c1a0b6c633832b026c30d80fcd16197f34b12e2a556af418e31304ab not found: ID does not exist" Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.430556 4766 scope.go:117] "RemoveContainer" containerID="98184696db0e1a4faf31b0c5c2a2cfd100838ffb7ed974d8aa55264142d15c55" Oct 02 12:49:20 crc kubenswrapper[4766]: E1002 12:49:20.437101 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98184696db0e1a4faf31b0c5c2a2cfd100838ffb7ed974d8aa55264142d15c55\": container with ID starting with 98184696db0e1a4faf31b0c5c2a2cfd100838ffb7ed974d8aa55264142d15c55 not found: ID does not exist" containerID="98184696db0e1a4faf31b0c5c2a2cfd100838ffb7ed974d8aa55264142d15c55" Oct 02 12:49:20 crc kubenswrapper[4766]: I1002 12:49:20.437167 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98184696db0e1a4faf31b0c5c2a2cfd100838ffb7ed974d8aa55264142d15c55"} err="failed to get container status \"98184696db0e1a4faf31b0c5c2a2cfd100838ffb7ed974d8aa55264142d15c55\": rpc error: code = NotFound desc = could not find container \"98184696db0e1a4faf31b0c5c2a2cfd100838ffb7ed974d8aa55264142d15c55\": container with ID starting with 98184696db0e1a4faf31b0c5c2a2cfd100838ffb7ed974d8aa55264142d15c55 not found: ID does not exist" Oct 02 12:49:21 crc kubenswrapper[4766]: I1002 12:49:21.881888 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:49:21 crc kubenswrapper[4766]: E1002 12:49:21.882199 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:49:21 crc kubenswrapper[4766]: I1002 12:49:21.892614 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="b959d503-9548-45b0-a676-efa68d97657b" path="/var/lib/kubelet/pods/b959d503-9548-45b0-a676-efa68d97657b/volumes" Oct 02 12:49:32 crc kubenswrapper[4766]: I1002 12:49:32.885898 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:49:32 crc kubenswrapper[4766]: E1002 12:49:32.887045 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:49:47 crc kubenswrapper[4766]: I1002 12:49:47.045352 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-22h8g"] Oct 02 12:49:47 crc kubenswrapper[4766]: I1002 12:49:47.057632 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-22h8g"] Oct 02 12:49:47 crc kubenswrapper[4766]: I1002 12:49:47.890442 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:49:47 crc kubenswrapper[4766]: E1002 12:49:47.891234 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:49:47 crc kubenswrapper[4766]: I1002 12:49:47.902112 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c68434a-a559-4f0d-b7b7-ed2490989b58" path="/var/lib/kubelet/pods/9c68434a-a559-4f0d-b7b7-ed2490989b58/volumes" Oct 02 12:49:57 crc kubenswrapper[4766]: I1002 12:49:57.037216 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-3546-account-create-t9dgr"] Oct 02 12:49:57 crc kubenswrapper[4766]: I1002 12:49:57.047224 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-3546-account-create-t9dgr"] Oct 02 12:49:57 crc kubenswrapper[4766]: I1002 12:49:57.899015 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4183ea30-6477-4d5c-bae6-89e3e8e07591" path="/var/lib/kubelet/pods/4183ea30-6477-4d5c-bae6-89e3e8e07591/volumes" Oct 02 12:49:58 crc kubenswrapper[4766]: I1002 12:49:58.881627 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:49:58 crc kubenswrapper[4766]: E1002 12:49:58.882364 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:50:10 crc kubenswrapper[4766]: I1002 12:50:10.881330 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:50:10 crc kubenswrapper[4766]: E1002 12:50:10.882158 4766 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:50:10 crc kubenswrapper[4766]: I1002 12:50:10.988486 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4k6q5"] Oct 02 12:50:10 crc kubenswrapper[4766]: E1002 12:50:10.989096 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b959d503-9548-45b0-a676-efa68d97657b" containerName="extract-utilities" Oct 02 12:50:10 crc kubenswrapper[4766]: I1002 12:50:10.989119 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b959d503-9548-45b0-a676-efa68d97657b" containerName="extract-utilities" Oct 02 12:50:10 crc kubenswrapper[4766]: E1002 12:50:10.989180 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b959d503-9548-45b0-a676-efa68d97657b" containerName="extract-content" Oct 02 12:50:10 crc kubenswrapper[4766]: I1002 12:50:10.989190 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b959d503-9548-45b0-a676-efa68d97657b" containerName="extract-content" Oct 02 12:50:10 crc kubenswrapper[4766]: E1002 12:50:10.989208 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b959d503-9548-45b0-a676-efa68d97657b" containerName="registry-server" Oct 02 12:50:10 crc kubenswrapper[4766]: I1002 12:50:10.989218 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="b959d503-9548-45b0-a676-efa68d97657b" containerName="registry-server" Oct 02 12:50:10 crc kubenswrapper[4766]: I1002 12:50:10.989477 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="b959d503-9548-45b0-a676-efa68d97657b" containerName="registry-server" Oct 02 12:50:10 crc kubenswrapper[4766]: I1002 12:50:10.991711 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:11 crc kubenswrapper[4766]: I1002 12:50:10.999655 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4k6q5"] Oct 02 12:50:11 crc kubenswrapper[4766]: I1002 12:50:11.065722 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ca20d6-d835-475e-9734-0b2791feaeac-utilities\") pod \"community-operators-4k6q5\" (UID: \"33ca20d6-d835-475e-9734-0b2791feaeac\") " pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:11 crc kubenswrapper[4766]: I1002 12:50:11.065826 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frtkk\" (UniqueName: \"kubernetes.io/projected/33ca20d6-d835-475e-9734-0b2791feaeac-kube-api-access-frtkk\") pod \"community-operators-4k6q5\" (UID: \"33ca20d6-d835-475e-9734-0b2791feaeac\") " pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:11 crc kubenswrapper[4766]: I1002 12:50:11.066010 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ca20d6-d835-475e-9734-0b2791feaeac-catalog-content\") pod \"community-operators-4k6q5\" (UID: \"33ca20d6-d835-475e-9734-0b2791feaeac\") " pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:11 crc kubenswrapper[4766]: I1002 12:50:11.167930 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ca20d6-d835-475e-9734-0b2791feaeac-catalog-content\") pod \"community-operators-4k6q5\" (UID: \"33ca20d6-d835-475e-9734-0b2791feaeac\") " pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:11 crc kubenswrapper[4766]: I1002 12:50:11.168074 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ca20d6-d835-475e-9734-0b2791feaeac-utilities\") pod \"community-operators-4k6q5\" (UID: \"33ca20d6-d835-475e-9734-0b2791feaeac\") " pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:11 crc kubenswrapper[4766]: I1002 12:50:11.168141 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frtkk\" (UniqueName: \"kubernetes.io/projected/33ca20d6-d835-475e-9734-0b2791feaeac-kube-api-access-frtkk\") pod \"community-operators-4k6q5\" (UID: \"33ca20d6-d835-475e-9734-0b2791feaeac\") " pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:11 crc kubenswrapper[4766]: I1002 12:50:11.168559 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ca20d6-d835-475e-9734-0b2791feaeac-catalog-content\") pod \"community-operators-4k6q5\" (UID: \"33ca20d6-d835-475e-9734-0b2791feaeac\") " pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:11 crc kubenswrapper[4766]: I1002 12:50:11.168655 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ca20d6-d835-475e-9734-0b2791feaeac-utilities\") pod \"community-operators-4k6q5\" (UID: \"33ca20d6-d835-475e-9734-0b2791feaeac\") " pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:11 crc kubenswrapper[4766]: I1002 12:50:11.194568 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-frtkk\" (UniqueName: \"kubernetes.io/projected/33ca20d6-d835-475e-9734-0b2791feaeac-kube-api-access-frtkk\") pod \"community-operators-4k6q5\" (UID: \"33ca20d6-d835-475e-9734-0b2791feaeac\") " pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:11 crc kubenswrapper[4766]: I1002 12:50:11.324768 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:11 crc kubenswrapper[4766]: I1002 12:50:11.922257 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4k6q5"] Oct 02 12:50:12 crc kubenswrapper[4766]: I1002 12:50:12.866706 4766 generic.go:334] "Generic (PLEG): container finished" podID="33ca20d6-d835-475e-9734-0b2791feaeac" containerID="212ffe9c4330602197150a69bea1face30eddcc974b9c7262d98e9765d8a51b8" exitCode=0 Oct 02 12:50:12 crc kubenswrapper[4766]: I1002 12:50:12.866790 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k6q5" event={"ID":"33ca20d6-d835-475e-9734-0b2791feaeac","Type":"ContainerDied","Data":"212ffe9c4330602197150a69bea1face30eddcc974b9c7262d98e9765d8a51b8"} Oct 02 12:50:12 crc kubenswrapper[4766]: I1002 12:50:12.867024 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k6q5" event={"ID":"33ca20d6-d835-475e-9734-0b2791feaeac","Type":"ContainerStarted","Data":"4ff8c3cbd3d8d476ae031c32dbef667d59a8a5b6e07b4af930c38322787ba604"} Oct 02 12:50:13 crc kubenswrapper[4766]: I1002 12:50:13.877971 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k6q5" event={"ID":"33ca20d6-d835-475e-9734-0b2791feaeac","Type":"ContainerStarted","Data":"3644d1b85954a3253ca98db5bc7d8e99949612da58ceed22874f50c037911ad3"} Oct 02 12:50:15 crc kubenswrapper[4766]: I1002 12:50:15.909257 4766 generic.go:334] "Generic (PLEG): container finished" podID="33ca20d6-d835-475e-9734-0b2791feaeac" containerID="3644d1b85954a3253ca98db5bc7d8e99949612da58ceed22874f50c037911ad3" exitCode=0 Oct 02 12:50:15 crc kubenswrapper[4766]: I1002 12:50:15.909368 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k6q5" event={"ID":"33ca20d6-d835-475e-9734-0b2791feaeac","Type":"ContainerDied","Data":"3644d1b85954a3253ca98db5bc7d8e99949612da58ceed22874f50c037911ad3"} Oct 02 12:50:16 crc kubenswrapper[4766]: I1002 12:50:16.922542 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k6q5" event={"ID":"33ca20d6-d835-475e-9734-0b2791feaeac","Type":"ContainerStarted","Data":"b136d2c4171a2d1b507139fc22d7c0bff1ff728e32e0570ce1eae910a8b6261f"} Oct 02 12:50:21 crc kubenswrapper[4766]: I1002 12:50:21.325195 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:21 crc kubenswrapper[4766]: I1002 12:50:21.326008 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:21 crc kubenswrapper[4766]: I1002 12:50:21.378266 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:21 crc kubenswrapper[4766]: I1002 12:50:21.410736 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-4k6q5" podStartSLOduration=7.697394511 podStartE2EDuration="11.410670925s" podCreationTimestamp="2025-10-02 12:50:10 +0000 UTC" firstStartedPulling="2025-10-02 12:50:12.870456354 +0000 UTC m=+7127.813327298" lastFinishedPulling="2025-10-02 12:50:16.583732768 +0000 UTC m=+7131.526603712" observedRunningTime="2025-10-02 12:50:16.948727169 +0000 UTC m=+7131.891598153" watchObservedRunningTime="2025-10-02 12:50:21.410670925 +0000 UTC m=+7136.353541869" Oct 02 12:50:21 crc kubenswrapper[4766]: I1002 12:50:21.881302 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:50:21 crc kubenswrapper[4766]: E1002 12:50:21.881953 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:50:22 crc kubenswrapper[4766]: I1002 12:50:22.029832 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:22 crc kubenswrapper[4766]: I1002 12:50:22.084570 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4k6q5"] Oct 02 12:50:23 crc kubenswrapper[4766]: I1002 12:50:23.045879 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-gg7hw"] Oct 02 12:50:23 crc kubenswrapper[4766]: I1002 12:50:23.057617 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-gg7hw"] Oct 02 12:50:23 crc kubenswrapper[4766]: I1002 12:50:23.897098 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74f60e52-cef0-4224-83e9-ec914df2bd9d" path="/var/lib/kubelet/pods/74f60e52-cef0-4224-83e9-ec914df2bd9d/volumes" Oct 02 12:50:23 crc kubenswrapper[4766]: I1002 12:50:23.994392 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4k6q5" podUID="33ca20d6-d835-475e-9734-0b2791feaeac" containerName="registry-server" containerID="cri-o://b136d2c4171a2d1b507139fc22d7c0bff1ff728e32e0570ce1eae910a8b6261f" gracePeriod=2 Oct 02 12:50:24 crc kubenswrapper[4766]: I1002 12:50:24.549837 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:24 crc kubenswrapper[4766]: I1002 12:50:24.674017 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ca20d6-d835-475e-9734-0b2791feaeac-catalog-content\") pod \"33ca20d6-d835-475e-9734-0b2791feaeac\" (UID: \"33ca20d6-d835-475e-9734-0b2791feaeac\") " Oct 02 12:50:24 crc kubenswrapper[4766]: I1002 12:50:24.674709 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ca20d6-d835-475e-9734-0b2791feaeac-utilities\") pod \"33ca20d6-d835-475e-9734-0b2791feaeac\" (UID: \"33ca20d6-d835-475e-9734-0b2791feaeac\") " Oct 02 12:50:24 crc kubenswrapper[4766]: I1002 12:50:24.675213 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frtkk\" (UniqueName: \"kubernetes.io/projected/33ca20d6-d835-475e-9734-0b2791feaeac-kube-api-access-frtkk\") pod \"33ca20d6-d835-475e-9734-0b2791feaeac\" (UID: \"33ca20d6-d835-475e-9734-0b2791feaeac\") " Oct 02 12:50:24 crc kubenswrapper[4766]: I1002 12:50:24.675720 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33ca20d6-d835-475e-9734-0b2791feaeac-utilities" (OuterVolumeSpecName: "utilities") pod "33ca20d6-d835-475e-9734-0b2791feaeac" (UID: "33ca20d6-d835-475e-9734-0b2791feaeac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:50:24 crc kubenswrapper[4766]: I1002 12:50:24.676738 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33ca20d6-d835-475e-9734-0b2791feaeac-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:50:24 crc kubenswrapper[4766]: I1002 12:50:24.680605 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ca20d6-d835-475e-9734-0b2791feaeac-kube-api-access-frtkk" (OuterVolumeSpecName: "kube-api-access-frtkk") pod "33ca20d6-d835-475e-9734-0b2791feaeac" (UID: "33ca20d6-d835-475e-9734-0b2791feaeac"). InnerVolumeSpecName "kube-api-access-frtkk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:50:24 crc kubenswrapper[4766]: I1002 12:50:24.779235 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frtkk\" (UniqueName: \"kubernetes.io/projected/33ca20d6-d835-475e-9734-0b2791feaeac-kube-api-access-frtkk\") on node \"crc\" DevicePath \"\"" Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.006132 4766 generic.go:334] "Generic (PLEG): container finished" podID="33ca20d6-d835-475e-9734-0b2791feaeac" containerID="b136d2c4171a2d1b507139fc22d7c0bff1ff728e32e0570ce1eae910a8b6261f" exitCode=0 Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.006184 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k6q5" event={"ID":"33ca20d6-d835-475e-9734-0b2791feaeac","Type":"ContainerDied","Data":"b136d2c4171a2d1b507139fc22d7c0bff1ff728e32e0570ce1eae910a8b6261f"} Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.006215 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4k6q5" event={"ID":"33ca20d6-d835-475e-9734-0b2791feaeac","Type":"ContainerDied","Data":"4ff8c3cbd3d8d476ae031c32dbef667d59a8a5b6e07b4af930c38322787ba604"} Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.006231 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4k6q5" Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.006236 4766 scope.go:117] "RemoveContainer" containerID="b136d2c4171a2d1b507139fc22d7c0bff1ff728e32e0570ce1eae910a8b6261f" Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.030648 4766 scope.go:117] "RemoveContainer" containerID="3644d1b85954a3253ca98db5bc7d8e99949612da58ceed22874f50c037911ad3" Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.063241 4766 scope.go:117] "RemoveContainer" containerID="212ffe9c4330602197150a69bea1face30eddcc974b9c7262d98e9765d8a51b8" Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.138158 4766 scope.go:117] "RemoveContainer" containerID="b136d2c4171a2d1b507139fc22d7c0bff1ff728e32e0570ce1eae910a8b6261f" Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.139407 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33ca20d6-d835-475e-9734-0b2791feaeac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33ca20d6-d835-475e-9734-0b2791feaeac" (UID: "33ca20d6-d835-475e-9734-0b2791feaeac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:50:25 crc kubenswrapper[4766]: E1002 12:50:25.139727 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b136d2c4171a2d1b507139fc22d7c0bff1ff728e32e0570ce1eae910a8b6261f\": container with ID starting with b136d2c4171a2d1b507139fc22d7c0bff1ff728e32e0570ce1eae910a8b6261f not found: ID does not exist" containerID="b136d2c4171a2d1b507139fc22d7c0bff1ff728e32e0570ce1eae910a8b6261f" Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.139806 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b136d2c4171a2d1b507139fc22d7c0bff1ff728e32e0570ce1eae910a8b6261f"} err="failed to get container status \"b136d2c4171a2d1b507139fc22d7c0bff1ff728e32e0570ce1eae910a8b6261f\": rpc error: code = NotFound desc = could not find container \"b136d2c4171a2d1b507139fc22d7c0bff1ff728e32e0570ce1eae910a8b6261f\": container with ID starting with b136d2c4171a2d1b507139fc22d7c0bff1ff728e32e0570ce1eae910a8b6261f not found: ID does not exist" Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.139872 4766 scope.go:117] "RemoveContainer" containerID="3644d1b85954a3253ca98db5bc7d8e99949612da58ceed22874f50c037911ad3" Oct 02 12:50:25 crc kubenswrapper[4766]: E1002 12:50:25.140432 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3644d1b85954a3253ca98db5bc7d8e99949612da58ceed22874f50c037911ad3\": container with ID starting with 3644d1b85954a3253ca98db5bc7d8e99949612da58ceed22874f50c037911ad3 not found: ID does not exist" containerID="3644d1b85954a3253ca98db5bc7d8e99949612da58ceed22874f50c037911ad3" Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.140489 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3644d1b85954a3253ca98db5bc7d8e99949612da58ceed22874f50c037911ad3"} err="failed to get container status \"3644d1b85954a3253ca98db5bc7d8e99949612da58ceed22874f50c037911ad3\": rpc error: code = NotFound desc = could not find container \"3644d1b85954a3253ca98db5bc7d8e99949612da58ceed22874f50c037911ad3\": container with ID starting with 3644d1b85954a3253ca98db5bc7d8e99949612da58ceed22874f50c037911ad3 not found: ID does not exist" Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.140544 4766 scope.go:117] "RemoveContainer" containerID="212ffe9c4330602197150a69bea1face30eddcc974b9c7262d98e9765d8a51b8" Oct 02 12:50:25 crc kubenswrapper[4766]: E1002 12:50:25.141024 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"212ffe9c4330602197150a69bea1face30eddcc974b9c7262d98e9765d8a51b8\": container with ID starting with 212ffe9c4330602197150a69bea1face30eddcc974b9c7262d98e9765d8a51b8 not found: ID does not exist" containerID="212ffe9c4330602197150a69bea1face30eddcc974b9c7262d98e9765d8a51b8" Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.141078 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"212ffe9c4330602197150a69bea1face30eddcc974b9c7262d98e9765d8a51b8"} err="failed to get container status \"212ffe9c4330602197150a69bea1face30eddcc974b9c7262d98e9765d8a51b8\": rpc error: code = NotFound desc = could not find container \"212ffe9c4330602197150a69bea1face30eddcc974b9c7262d98e9765d8a51b8\": container with ID starting with 
212ffe9c4330602197150a69bea1face30eddcc974b9c7262d98e9765d8a51b8 not found: ID does not exist" Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.185459 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33ca20d6-d835-475e-9734-0b2791feaeac-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.339313 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4k6q5"] Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.349070 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4k6q5"] Oct 02 12:50:25 crc kubenswrapper[4766]: I1002 12:50:25.905402 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ca20d6-d835-475e-9734-0b2791feaeac" path="/var/lib/kubelet/pods/33ca20d6-d835-475e-9734-0b2791feaeac/volumes" Oct 02 12:50:36 crc kubenswrapper[4766]: I1002 12:50:36.881074 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:50:36 crc kubenswrapper[4766]: E1002 12:50:36.881920 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:50:41 crc kubenswrapper[4766]: I1002 12:50:41.342593 4766 scope.go:117] "RemoveContainer" containerID="ef2a20319350fa1c2836fa8d8bfce79811592c922bfd6aa9bd23910741460bf0" Oct 02 12:50:41 crc kubenswrapper[4766]: I1002 12:50:41.407334 4766 scope.go:117] "RemoveContainer" containerID="4f5a0273e90ff42f896232eebe95a70aa0fd21390d84e5047438714cb59cc948" Oct 02 12:50:41 crc kubenswrapper[4766]: I1002 12:50:41.441191 4766 scope.go:117] "RemoveContainer" containerID="7dbd00e4dae346409b8d5550010bc9edbfee6d9f2645a8c7d3041a9b94c14c7f" Oct 02 12:50:47 crc kubenswrapper[4766]: I1002 12:50:47.881562 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:50:47 crc kubenswrapper[4766]: E1002 12:50:47.882595 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:50:53 crc kubenswrapper[4766]: I1002 12:50:53.040886 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-wrf46"] Oct 02 12:50:53 crc kubenswrapper[4766]: I1002 12:50:53.055074 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-wrf46"] Oct 02 12:50:53 crc kubenswrapper[4766]: I1002 12:50:53.899947 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aab08b25-08d7-4dd3-837c-24863de3ab01" path="/var/lib/kubelet/pods/aab08b25-08d7-4dd3-837c-24863de3ab01/volumes" Oct 02 12:51:02 crc kubenswrapper[4766]: I1002 12:51:02.882351 4766 scope.go:117] "RemoveContainer" 
containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:51:02 crc kubenswrapper[4766]: E1002 12:51:02.883597 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:51:03 crc kubenswrapper[4766]: I1002 12:51:03.033805 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-75c6-account-create-hgfjc"] Oct 02 12:51:03 crc kubenswrapper[4766]: I1002 12:51:03.045764 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-75c6-account-create-hgfjc"] Oct 02 12:51:03 crc kubenswrapper[4766]: I1002 12:51:03.903303 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60696ba0-e879-4b95-b020-9c3a81a922af" path="/var/lib/kubelet/pods/60696ba0-e879-4b95-b020-9c3a81a922af/volumes" Oct 02 12:51:13 crc kubenswrapper[4766]: I1002 12:51:13.882552 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:51:13 crc kubenswrapper[4766]: E1002 12:51:13.883685 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:51:16 crc kubenswrapper[4766]: I1002 12:51:16.049748 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-dcvnv"] Oct 02 12:51:16 crc kubenswrapper[4766]: I1002 12:51:16.062043 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-dcvnv"] Oct 02 12:51:17 crc kubenswrapper[4766]: I1002 12:51:17.905309 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4" path="/var/lib/kubelet/pods/24f54da4-a1ab-4fb8-86b2-ccc9b20a7ba4/volumes" Oct 02 12:51:25 crc kubenswrapper[4766]: I1002 12:51:25.890313 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:51:25 crc kubenswrapper[4766]: E1002 12:51:25.891880 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:51:36 crc kubenswrapper[4766]: I1002 12:51:36.882477 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:51:36 crc kubenswrapper[4766]: E1002 12:51:36.883479 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:51:41 crc kubenswrapper[4766]: I1002 12:51:41.615451 4766 scope.go:117] "RemoveContainer" containerID="c4bed18926e9e6bb7b83fb8174e4f461e3bee4196997bfc78417bfb54ab04b83" Oct 02 12:51:41 crc kubenswrapper[4766]: I1002 12:51:41.644451 4766 scope.go:117] "RemoveContainer" containerID="a5de29348ba2324c852fb46cfa523213b8b0e248a6271ceb4c8b8e0764f33d33" Oct 02 12:51:41 crc kubenswrapper[4766]: I1002 12:51:41.719692 4766 scope.go:117] "RemoveContainer" containerID="8490be9a62e7164e63066e7caf031d83817a6ebbc07f351ea4c97617a0c4ef0f" Oct 02 12:51:50 crc kubenswrapper[4766]: I1002 12:51:50.881790 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:51:50 crc kubenswrapper[4766]: E1002 12:51:50.882972 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:52:05 crc kubenswrapper[4766]: I1002 12:52:05.887707 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:52:05 crc kubenswrapper[4766]: E1002 12:52:05.888465 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:52:17 crc kubenswrapper[4766]: I1002 12:52:17.885661 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:52:17 crc kubenswrapper[4766]: E1002 12:52:17.891680 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:52:28 crc kubenswrapper[4766]: I1002 12:52:28.881237 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:52:28 crc kubenswrapper[4766]: E1002 12:52:28.882403 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:52:41 crc kubenswrapper[4766]: I1002 12:52:41.882183 4766 scope.go:117] "RemoveContainer" 
containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:52:41 crc kubenswrapper[4766]: E1002 12:52:41.883261 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:52:52 crc kubenswrapper[4766]: I1002 12:52:52.881360 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:52:52 crc kubenswrapper[4766]: E1002 12:52:52.882200 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:53:04 crc kubenswrapper[4766]: I1002 12:53:04.882930 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:53:04 crc kubenswrapper[4766]: E1002 12:53:04.884209 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:53:18 crc kubenswrapper[4766]: I1002 12:53:18.882145 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:53:18 crc kubenswrapper[4766]: E1002 12:53:18.884775 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:53:33 crc kubenswrapper[4766]: I1002 12:53:33.881923 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:53:33 crc kubenswrapper[4766]: E1002 12:53:33.882869 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:53:46 crc kubenswrapper[4766]: I1002 12:53:46.880914 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:53:47 crc kubenswrapper[4766]: E1002 12:53:46.881732 4766 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 12:54:01 crc kubenswrapper[4766]: I1002 12:54:01.882023 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089" Oct 02 12:54:02 crc kubenswrapper[4766]: I1002 12:54:02.332878 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"653354b49be3bc2afc4b9e41e295aa3b09fbf84e205a578a42bfba4c372df755"} Oct 02 12:54:26 crc kubenswrapper[4766]: I1002 12:54:26.603328 4766 generic.go:334] "Generic (PLEG): container finished" podID="88d78077-1bd0-416c-979a-b52075152089" containerID="73da9b9a999a7121a2d0d5056892db8bf0436940fd18829de76f9afbe06c983e" exitCode=0 Oct 02 12:54:26 crc kubenswrapper[4766]: I1002 12:54:26.603451 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" event={"ID":"88d78077-1bd0-416c-979a-b52075152089","Type":"ContainerDied","Data":"73da9b9a999a7121a2d0d5056892db8bf0436940fd18829de76f9afbe06c983e"} Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.096591 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.264734 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr6lf\" (UniqueName: \"kubernetes.io/projected/88d78077-1bd0-416c-979a-b52075152089-kube-api-access-rr6lf\") pod \"88d78077-1bd0-416c-979a-b52075152089\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.264786 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-inventory\") pod \"88d78077-1bd0-416c-979a-b52075152089\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.264969 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-ceph\") pod \"88d78077-1bd0-416c-979a-b52075152089\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.265013 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-tripleo-cleanup-combined-ca-bundle\") pod \"88d78077-1bd0-416c-979a-b52075152089\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.265122 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-ssh-key\") pod \"88d78077-1bd0-416c-979a-b52075152089\" (UID: \"88d78077-1bd0-416c-979a-b52075152089\") " Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.270804 4766 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-ceph" (OuterVolumeSpecName: "ceph") pod "88d78077-1bd0-416c-979a-b52075152089" (UID: "88d78077-1bd0-416c-979a-b52075152089"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.271182 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88d78077-1bd0-416c-979a-b52075152089-kube-api-access-rr6lf" (OuterVolumeSpecName: "kube-api-access-rr6lf") pod "88d78077-1bd0-416c-979a-b52075152089" (UID: "88d78077-1bd0-416c-979a-b52075152089"). InnerVolumeSpecName "kube-api-access-rr6lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.273164 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "88d78077-1bd0-416c-979a-b52075152089" (UID: "88d78077-1bd0-416c-979a-b52075152089"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.300123 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-inventory" (OuterVolumeSpecName: "inventory") pod "88d78077-1bd0-416c-979a-b52075152089" (UID: "88d78077-1bd0-416c-979a-b52075152089"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.305972 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "88d78077-1bd0-416c-979a-b52075152089" (UID: "88d78077-1bd0-416c-979a-b52075152089"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.367794 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.367855 4766 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.367871 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.367885 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr6lf\" (UniqueName: \"kubernetes.io/projected/88d78077-1bd0-416c-979a-b52075152089-kube-api-access-rr6lf\") on node \"crc\" DevicePath \"\"" Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.367898 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/88d78077-1bd0-416c-979a-b52075152089-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.622206 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58" event={"ID":"88d78077-1bd0-416c-979a-b52075152089","Type":"ContainerDied","Data":"9e0af97fe6b3119398d24affbf448f98e4ab854051e2bde610b083afaa2c8fa2"} Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.622335 4766 util.go:48] "No ready sandbox for pod can be found. 
Oct 02 12:54:28 crc kubenswrapper[4766]: I1002 12:54:28.622497 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e0af97fe6b3119398d24affbf448f98e4ab854051e2bde610b083afaa2c8fa2"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.009758 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-6x2jq"]
Oct 02 12:54:31 crc kubenswrapper[4766]: E1002 12:54:31.010562 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ca20d6-d835-475e-9734-0b2791feaeac" containerName="extract-utilities"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.010577 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ca20d6-d835-475e-9734-0b2791feaeac" containerName="extract-utilities"
Oct 02 12:54:31 crc kubenswrapper[4766]: E1002 12:54:31.010594 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ca20d6-d835-475e-9734-0b2791feaeac" containerName="extract-content"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.010600 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ca20d6-d835-475e-9734-0b2791feaeac" containerName="extract-content"
Oct 02 12:54:31 crc kubenswrapper[4766]: E1002 12:54:31.010623 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ca20d6-d835-475e-9734-0b2791feaeac" containerName="registry-server"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.010629 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ca20d6-d835-475e-9734-0b2791feaeac" containerName="registry-server"
Oct 02 12:54:31 crc kubenswrapper[4766]: E1002 12:54:31.010647 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88d78077-1bd0-416c-979a-b52075152089" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.010655 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d78077-1bd0-416c-979a-b52075152089" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.010858 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ca20d6-d835-475e-9734-0b2791feaeac" containerName="registry-server"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.010882 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="88d78077-1bd0-416c-979a-b52075152089" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.011927 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.014988 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.015420 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.015487 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.018007 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.027140 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-6x2jq"]
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.030769 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-6x2jq\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") " pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.030891 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-inventory\") pod \"bootstrap-openstack-openstack-cell1-6x2jq\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") " pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.031216 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-6x2jq\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") " pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.031319 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8crc\" (UniqueName: \"kubernetes.io/projected/829dc872-61b8-4549-a976-404ea823ea25-kube-api-access-g8crc\") pod \"bootstrap-openstack-openstack-cell1-6x2jq\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") " pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.031520 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-ceph\") pod \"bootstrap-openstack-openstack-cell1-6x2jq\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") " pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.133283 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-6x2jq\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") " pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.133388 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-inventory\") pod \"bootstrap-openstack-openstack-cell1-6x2jq\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") " pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.133479 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-6x2jq\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") " pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.133533 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8crc\" (UniqueName: \"kubernetes.io/projected/829dc872-61b8-4549-a976-404ea823ea25-kube-api-access-g8crc\") pod \"bootstrap-openstack-openstack-cell1-6x2jq\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") " pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.133601 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-ceph\") pod \"bootstrap-openstack-openstack-cell1-6x2jq\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") " pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.139296 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-ceph\") pod \"bootstrap-openstack-openstack-cell1-6x2jq\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") " pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.139685 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-6x2jq\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") " pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.140155 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-6x2jq\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") " pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.141294 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-inventory\") pod \"bootstrap-openstack-openstack-cell1-6x2jq\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") " pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.156306 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8crc\" (UniqueName: \"kubernetes.io/projected/829dc872-61b8-4549-a976-404ea823ea25-kube-api-access-g8crc\") pod \"bootstrap-openstack-openstack-cell1-6x2jq\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") " pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.330905 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.944305 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-6x2jq"]
Oct 02 12:54:31 crc kubenswrapper[4766]: W1002 12:54:31.961753 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod829dc872_61b8_4549_a976_404ea823ea25.slice/crio-63175b5f42b8a33910ab88bbd05f844b736a41c9db97db533392f85c645967cf WatchSource:0}: Error finding container 63175b5f42b8a33910ab88bbd05f844b736a41c9db97db533392f85c645967cf: Status 404 returned error can't find the container with id 63175b5f42b8a33910ab88bbd05f844b736a41c9db97db533392f85c645967cf
Oct 02 12:54:31 crc kubenswrapper[4766]: I1002 12:54:31.965086 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 02 12:54:32 crc kubenswrapper[4766]: I1002 12:54:32.666721 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq" event={"ID":"829dc872-61b8-4549-a976-404ea823ea25","Type":"ContainerStarted","Data":"63175b5f42b8a33910ab88bbd05f844b736a41c9db97db533392f85c645967cf"}
Oct 02 12:54:33 crc kubenswrapper[4766]: I1002 12:54:33.676404 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq" event={"ID":"829dc872-61b8-4549-a976-404ea823ea25","Type":"ContainerStarted","Data":"49ab45dadcfe5a6e33a0f64341760a24149c7d3c55a08de02433273644f8af68"}
Oct 02 12:54:33 crc kubenswrapper[4766]: I1002 12:54:33.702175 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq" podStartSLOduration=3.395006996 podStartE2EDuration="3.702147165s" podCreationTimestamp="2025-10-02 12:54:30 +0000 UTC" firstStartedPulling="2025-10-02 12:54:31.964859606 +0000 UTC m=+7386.907730540" lastFinishedPulling="2025-10-02 12:54:32.271999755 +0000 UTC m=+7387.214870709" observedRunningTime="2025-10-02 12:54:33.693128026 +0000 UTC m=+7388.635998980" watchObservedRunningTime="2025-10-02 12:54:33.702147165 +0000 UTC m=+7388.645018119"
Oct 02 12:56:19 crc kubenswrapper[4766]: I1002 12:56:19.090554 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-744hj"]
Oct 02 12:56:19 crc kubenswrapper[4766]: I1002 12:56:19.093374 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:19 crc kubenswrapper[4766]: I1002 12:56:19.108695 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-744hj"]
Oct 02 12:56:19 crc kubenswrapper[4766]: I1002 12:56:19.164563 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3ac9da-81ed-40bf-ac52-d219e9c83189-utilities\") pod \"certified-operators-744hj\" (UID: \"ba3ac9da-81ed-40bf-ac52-d219e9c83189\") " pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:19 crc kubenswrapper[4766]: I1002 12:56:19.164865 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thksv\" (UniqueName: \"kubernetes.io/projected/ba3ac9da-81ed-40bf-ac52-d219e9c83189-kube-api-access-thksv\") pod \"certified-operators-744hj\" (UID: \"ba3ac9da-81ed-40bf-ac52-d219e9c83189\") " pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:19 crc kubenswrapper[4766]: I1002 12:56:19.164919 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3ac9da-81ed-40bf-ac52-d219e9c83189-catalog-content\") pod \"certified-operators-744hj\" (UID: \"ba3ac9da-81ed-40bf-ac52-d219e9c83189\") " pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:19 crc kubenswrapper[4766]: I1002 12:56:19.266741 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thksv\" (UniqueName: \"kubernetes.io/projected/ba3ac9da-81ed-40bf-ac52-d219e9c83189-kube-api-access-thksv\") pod \"certified-operators-744hj\" (UID: \"ba3ac9da-81ed-40bf-ac52-d219e9c83189\") " pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:19 crc kubenswrapper[4766]: I1002 12:56:19.266811 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3ac9da-81ed-40bf-ac52-d219e9c83189-catalog-content\") pod \"certified-operators-744hj\" (UID: \"ba3ac9da-81ed-40bf-ac52-d219e9c83189\") " pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:19 crc kubenswrapper[4766]: I1002 12:56:19.266946 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3ac9da-81ed-40bf-ac52-d219e9c83189-utilities\") pod \"certified-operators-744hj\" (UID: \"ba3ac9da-81ed-40bf-ac52-d219e9c83189\") " pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:19 crc kubenswrapper[4766]: I1002 12:56:19.267466 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3ac9da-81ed-40bf-ac52-d219e9c83189-catalog-content\") pod \"certified-operators-744hj\" (UID: \"ba3ac9da-81ed-40bf-ac52-d219e9c83189\") " pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:19 crc kubenswrapper[4766]: I1002 12:56:19.267625 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3ac9da-81ed-40bf-ac52-d219e9c83189-utilities\") pod \"certified-operators-744hj\" (UID: \"ba3ac9da-81ed-40bf-ac52-d219e9c83189\") " pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:19 crc kubenswrapper[4766]: I1002 12:56:19.290583 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thksv\" (UniqueName: \"kubernetes.io/projected/ba3ac9da-81ed-40bf-ac52-d219e9c83189-kube-api-access-thksv\") pod \"certified-operators-744hj\" (UID: \"ba3ac9da-81ed-40bf-ac52-d219e9c83189\") " pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:19 crc kubenswrapper[4766]: I1002 12:56:19.422600 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:19 crc kubenswrapper[4766]: I1002 12:56:19.972262 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-744hj"]
Oct 02 12:56:20 crc kubenswrapper[4766]: I1002 12:56:20.882552 4766 generic.go:334] "Generic (PLEG): container finished" podID="ba3ac9da-81ed-40bf-ac52-d219e9c83189" containerID="c25c5a932ad97b7109c79c6d5b6bdd7ed05d4b00dc1c6c6a4d19780e175fc0e4" exitCode=0
Oct 02 12:56:20 crc kubenswrapper[4766]: I1002 12:56:20.882650 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-744hj" event={"ID":"ba3ac9da-81ed-40bf-ac52-d219e9c83189","Type":"ContainerDied","Data":"c25c5a932ad97b7109c79c6d5b6bdd7ed05d4b00dc1c6c6a4d19780e175fc0e4"}
Oct 02 12:56:20 crc kubenswrapper[4766]: I1002 12:56:20.883791 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-744hj" event={"ID":"ba3ac9da-81ed-40bf-ac52-d219e9c83189","Type":"ContainerStarted","Data":"8bc167c1b97c6579bc728bf34a190c74241c084231059463f907762948d7ab49"}
Oct 02 12:56:22 crc kubenswrapper[4766]: I1002 12:56:22.902135 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-744hj" event={"ID":"ba3ac9da-81ed-40bf-ac52-d219e9c83189","Type":"ContainerStarted","Data":"72d427342ffb980186ad463d6b889ee0b2b48ce8c7ef58f8e5842bff6ca77a48"}
Oct 02 12:56:23 crc kubenswrapper[4766]: I1002 12:56:23.913124 4766 generic.go:334] "Generic (PLEG): container finished" podID="ba3ac9da-81ed-40bf-ac52-d219e9c83189" containerID="72d427342ffb980186ad463d6b889ee0b2b48ce8c7ef58f8e5842bff6ca77a48" exitCode=0
Oct 02 12:56:23 crc kubenswrapper[4766]: I1002 12:56:23.913219 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-744hj" event={"ID":"ba3ac9da-81ed-40bf-ac52-d219e9c83189","Type":"ContainerDied","Data":"72d427342ffb980186ad463d6b889ee0b2b48ce8c7ef58f8e5842bff6ca77a48"}
Oct 02 12:56:24 crc kubenswrapper[4766]: I1002 12:56:24.434094 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:56:24 crc kubenswrapper[4766]: I1002 12:56:24.434216 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:56:24 crc kubenswrapper[4766]: I1002 12:56:24.927084 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-744hj" event={"ID":"ba3ac9da-81ed-40bf-ac52-d219e9c83189","Type":"ContainerStarted","Data":"7193d9d4eb5c933cf9aae7c9569f69182d04fd3bdd0f33e815d0a77810e831da"}
Oct 02 12:56:24 crc kubenswrapper[4766]: I1002 12:56:24.949625 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-744hj" podStartSLOduration=2.366377608 podStartE2EDuration="5.949601637s" podCreationTimestamp="2025-10-02 12:56:19 +0000 UTC" firstStartedPulling="2025-10-02 12:56:20.884788492 +0000 UTC m=+7495.827659446" lastFinishedPulling="2025-10-02 12:56:24.468012531 +0000 UTC m=+7499.410883475" observedRunningTime="2025-10-02 12:56:24.945384181 +0000 UTC m=+7499.888255125" watchObservedRunningTime="2025-10-02 12:56:24.949601637 +0000 UTC m=+7499.892472581"
Oct 02 12:56:29 crc kubenswrapper[4766]: I1002 12:56:29.423019 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:29 crc kubenswrapper[4766]: I1002 12:56:29.424358 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:29 crc kubenswrapper[4766]: I1002 12:56:29.493106 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:30 crc kubenswrapper[4766]: I1002 12:56:30.041519 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:30 crc kubenswrapper[4766]: I1002 12:56:30.102902 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-744hj"]
Oct 02 12:56:32 crc kubenswrapper[4766]: I1002 12:56:32.008055 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-744hj" podUID="ba3ac9da-81ed-40bf-ac52-d219e9c83189" containerName="registry-server" containerID="cri-o://7193d9d4eb5c933cf9aae7c9569f69182d04fd3bdd0f33e815d0a77810e831da" gracePeriod=2
Oct 02 12:56:32 crc kubenswrapper[4766]: I1002 12:56:32.607942 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:32 crc kubenswrapper[4766]: I1002 12:56:32.713136 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3ac9da-81ed-40bf-ac52-d219e9c83189-catalog-content\") pod \"ba3ac9da-81ed-40bf-ac52-d219e9c83189\" (UID: \"ba3ac9da-81ed-40bf-ac52-d219e9c83189\") "
Oct 02 12:56:32 crc kubenswrapper[4766]: I1002 12:56:32.713232 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thksv\" (UniqueName: \"kubernetes.io/projected/ba3ac9da-81ed-40bf-ac52-d219e9c83189-kube-api-access-thksv\") pod \"ba3ac9da-81ed-40bf-ac52-d219e9c83189\" (UID: \"ba3ac9da-81ed-40bf-ac52-d219e9c83189\") "
Oct 02 12:56:32 crc kubenswrapper[4766]: I1002 12:56:32.713423 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3ac9da-81ed-40bf-ac52-d219e9c83189-utilities\") pod \"ba3ac9da-81ed-40bf-ac52-d219e9c83189\" (UID: \"ba3ac9da-81ed-40bf-ac52-d219e9c83189\") "
Oct 02 12:56:32 crc kubenswrapper[4766]: I1002 12:56:32.714915 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba3ac9da-81ed-40bf-ac52-d219e9c83189-utilities" (OuterVolumeSpecName: "utilities") pod "ba3ac9da-81ed-40bf-ac52-d219e9c83189" (UID: "ba3ac9da-81ed-40bf-ac52-d219e9c83189"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:56:32 crc kubenswrapper[4766]: I1002 12:56:32.718614 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3ac9da-81ed-40bf-ac52-d219e9c83189-kube-api-access-thksv" (OuterVolumeSpecName: "kube-api-access-thksv") pod "ba3ac9da-81ed-40bf-ac52-d219e9c83189" (UID: "ba3ac9da-81ed-40bf-ac52-d219e9c83189"). InnerVolumeSpecName "kube-api-access-thksv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:56:32 crc kubenswrapper[4766]: I1002 12:56:32.777925 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba3ac9da-81ed-40bf-ac52-d219e9c83189-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba3ac9da-81ed-40bf-ac52-d219e9c83189" (UID: "ba3ac9da-81ed-40bf-ac52-d219e9c83189"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 02 12:56:32 crc kubenswrapper[4766]: I1002 12:56:32.816364 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thksv\" (UniqueName: \"kubernetes.io/projected/ba3ac9da-81ed-40bf-ac52-d219e9c83189-kube-api-access-thksv\") on node \"crc\" DevicePath \"\""
Oct 02 12:56:32 crc kubenswrapper[4766]: I1002 12:56:32.816422 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3ac9da-81ed-40bf-ac52-d219e9c83189-utilities\") on node \"crc\" DevicePath \"\""
Oct 02 12:56:32 crc kubenswrapper[4766]: I1002 12:56:32.816442 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3ac9da-81ed-40bf-ac52-d219e9c83189-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 02 12:56:33 crc kubenswrapper[4766]: I1002 12:56:33.022680 4766 generic.go:334] "Generic (PLEG): container finished" podID="ba3ac9da-81ed-40bf-ac52-d219e9c83189" containerID="7193d9d4eb5c933cf9aae7c9569f69182d04fd3bdd0f33e815d0a77810e831da" exitCode=0
Oct 02 12:56:33 crc kubenswrapper[4766]: I1002 12:56:33.022734 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-744hj" event={"ID":"ba3ac9da-81ed-40bf-ac52-d219e9c83189","Type":"ContainerDied","Data":"7193d9d4eb5c933cf9aae7c9569f69182d04fd3bdd0f33e815d0a77810e831da"}
Oct 02 12:56:33 crc kubenswrapper[4766]: I1002 12:56:33.022818 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-744hj"
Oct 02 12:56:33 crc kubenswrapper[4766]: I1002 12:56:33.023619 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-744hj" event={"ID":"ba3ac9da-81ed-40bf-ac52-d219e9c83189","Type":"ContainerDied","Data":"8bc167c1b97c6579bc728bf34a190c74241c084231059463f907762948d7ab49"}
Oct 02 12:56:33 crc kubenswrapper[4766]: I1002 12:56:33.023738 4766 scope.go:117] "RemoveContainer" containerID="7193d9d4eb5c933cf9aae7c9569f69182d04fd3bdd0f33e815d0a77810e831da"
Oct 02 12:56:33 crc kubenswrapper[4766]: I1002 12:56:33.082657 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-744hj"]
Oct 02 12:56:33 crc kubenswrapper[4766]: I1002 12:56:33.083784 4766 scope.go:117] "RemoveContainer" containerID="72d427342ffb980186ad463d6b889ee0b2b48ce8c7ef58f8e5842bff6ca77a48"
Oct 02 12:56:33 crc kubenswrapper[4766]: I1002 12:56:33.100058 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-744hj"]
Oct 02 12:56:33 crc kubenswrapper[4766]: I1002 12:56:33.110605 4766 scope.go:117] "RemoveContainer" containerID="c25c5a932ad97b7109c79c6d5b6bdd7ed05d4b00dc1c6c6a4d19780e175fc0e4"
Oct 02 12:56:33 crc kubenswrapper[4766]: I1002 12:56:33.163124 4766 scope.go:117] "RemoveContainer" containerID="7193d9d4eb5c933cf9aae7c9569f69182d04fd3bdd0f33e815d0a77810e831da"
Oct 02 12:56:33 crc kubenswrapper[4766]: E1002 12:56:33.164331 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7193d9d4eb5c933cf9aae7c9569f69182d04fd3bdd0f33e815d0a77810e831da\": container with ID starting with 7193d9d4eb5c933cf9aae7c9569f69182d04fd3bdd0f33e815d0a77810e831da not found: ID does not exist" containerID="7193d9d4eb5c933cf9aae7c9569f69182d04fd3bdd0f33e815d0a77810e831da"
Oct 02 12:56:33 crc kubenswrapper[4766]: I1002 12:56:33.164447 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7193d9d4eb5c933cf9aae7c9569f69182d04fd3bdd0f33e815d0a77810e831da"} err="failed to get container status \"7193d9d4eb5c933cf9aae7c9569f69182d04fd3bdd0f33e815d0a77810e831da\": rpc error: code = NotFound desc = could not find container \"7193d9d4eb5c933cf9aae7c9569f69182d04fd3bdd0f33e815d0a77810e831da\": container with ID starting with 7193d9d4eb5c933cf9aae7c9569f69182d04fd3bdd0f33e815d0a77810e831da not found: ID does not exist"
Oct 02 12:56:33 crc kubenswrapper[4766]: I1002 12:56:33.164492 4766 scope.go:117] "RemoveContainer" containerID="72d427342ffb980186ad463d6b889ee0b2b48ce8c7ef58f8e5842bff6ca77a48"
Oct 02 12:56:33 crc kubenswrapper[4766]: E1002 12:56:33.165263 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d427342ffb980186ad463d6b889ee0b2b48ce8c7ef58f8e5842bff6ca77a48\": container with ID starting with 72d427342ffb980186ad463d6b889ee0b2b48ce8c7ef58f8e5842bff6ca77a48 not found: ID does not exist" containerID="72d427342ffb980186ad463d6b889ee0b2b48ce8c7ef58f8e5842bff6ca77a48"
Oct 02 12:56:33 crc kubenswrapper[4766]: I1002 12:56:33.165348 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d427342ffb980186ad463d6b889ee0b2b48ce8c7ef58f8e5842bff6ca77a48"} err="failed to get container status \"72d427342ffb980186ad463d6b889ee0b2b48ce8c7ef58f8e5842bff6ca77a48\": rpc error: code = NotFound desc = could not find container \"72d427342ffb980186ad463d6b889ee0b2b48ce8c7ef58f8e5842bff6ca77a48\": container with ID starting with 72d427342ffb980186ad463d6b889ee0b2b48ce8c7ef58f8e5842bff6ca77a48 not found: ID does not exist"
Oct 02 12:56:33 crc kubenswrapper[4766]: I1002 12:56:33.165364 4766 scope.go:117] "RemoveContainer" containerID="c25c5a932ad97b7109c79c6d5b6bdd7ed05d4b00dc1c6c6a4d19780e175fc0e4"
Oct 02 12:56:33 crc kubenswrapper[4766]: E1002 12:56:33.166028 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c25c5a932ad97b7109c79c6d5b6bdd7ed05d4b00dc1c6c6a4d19780e175fc0e4\": container with ID starting with c25c5a932ad97b7109c79c6d5b6bdd7ed05d4b00dc1c6c6a4d19780e175fc0e4 not found: ID does not exist" containerID="c25c5a932ad97b7109c79c6d5b6bdd7ed05d4b00dc1c6c6a4d19780e175fc0e4"
Oct 02 12:56:33 crc kubenswrapper[4766]: I1002 12:56:33.166099 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c25c5a932ad97b7109c79c6d5b6bdd7ed05d4b00dc1c6c6a4d19780e175fc0e4"} err="failed to get container status \"c25c5a932ad97b7109c79c6d5b6bdd7ed05d4b00dc1c6c6a4d19780e175fc0e4\": rpc error: code = NotFound desc = could not find container \"c25c5a932ad97b7109c79c6d5b6bdd7ed05d4b00dc1c6c6a4d19780e175fc0e4\": container with ID starting with c25c5a932ad97b7109c79c6d5b6bdd7ed05d4b00dc1c6c6a4d19780e175fc0e4 not found: ID does not exist"
Oct 02 12:56:33 crc kubenswrapper[4766]: I1002 12:56:33.901021 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba3ac9da-81ed-40bf-ac52-d219e9c83189" path="/var/lib/kubelet/pods/ba3ac9da-81ed-40bf-ac52-d219e9c83189/volumes"
Oct 02 12:56:54 crc kubenswrapper[4766]: I1002 12:56:54.432021 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:56:54 crc kubenswrapper[4766]: I1002 12:56:54.432712 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:57:24 crc kubenswrapper[4766]: I1002 12:57:24.432054 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 02 12:57:24 crc kubenswrapper[4766]: I1002 12:57:24.432761 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 02 12:57:24 crc kubenswrapper[4766]: I1002 12:57:24.432836 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx"
Oct 02 12:57:24 crc kubenswrapper[4766]: I1002 12:57:24.433973 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"653354b49be3bc2afc4b9e41e295aa3b09fbf84e205a578a42bfba4c372df755"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 02 12:57:24 crc kubenswrapper[4766]: I1002 12:57:24.434048 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://653354b49be3bc2afc4b9e41e295aa3b09fbf84e205a578a42bfba4c372df755" gracePeriod=600
Oct 02 12:57:24 crc kubenswrapper[4766]: I1002 12:57:24.632274 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="653354b49be3bc2afc4b9e41e295aa3b09fbf84e205a578a42bfba4c372df755" exitCode=0
Oct 02 12:57:24 crc kubenswrapper[4766]: I1002 12:57:24.632316 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"653354b49be3bc2afc4b9e41e295aa3b09fbf84e205a578a42bfba4c372df755"}
Oct 02 12:57:24 crc kubenswrapper[4766]: I1002 12:57:24.632657 4766 scope.go:117] "RemoveContainer" containerID="239901ea4424bbaf08b0aff6d9024e8db172e331c1efcba9809849e7d2749089"
Oct 02 12:57:25 crc kubenswrapper[4766]: I1002 12:57:25.646572 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf"}
Oct 02 12:57:44 crc kubenswrapper[4766]: I1002 12:57:44.852762 4766 generic.go:334] "Generic (PLEG): container finished" podID="829dc872-61b8-4549-a976-404ea823ea25" containerID="49ab45dadcfe5a6e33a0f64341760a24149c7d3c55a08de02433273644f8af68" exitCode=0
Oct 02 12:57:44 crc kubenswrapper[4766]: I1002 12:57:44.852859 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq" event={"ID":"829dc872-61b8-4549-a976-404ea823ea25","Type":"ContainerDied","Data":"49ab45dadcfe5a6e33a0f64341760a24149c7d3c55a08de02433273644f8af68"}
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.338835 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.489921 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8crc\" (UniqueName: \"kubernetes.io/projected/829dc872-61b8-4549-a976-404ea823ea25-kube-api-access-g8crc\") pod \"829dc872-61b8-4549-a976-404ea823ea25\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") "
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.490970 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-inventory\") pod \"829dc872-61b8-4549-a976-404ea823ea25\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") "
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.491026 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-ssh-key\") pod \"829dc872-61b8-4549-a976-404ea823ea25\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") "
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.491188 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-bootstrap-combined-ca-bundle\") pod \"829dc872-61b8-4549-a976-404ea823ea25\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") "
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.491410 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-ceph\") pod \"829dc872-61b8-4549-a976-404ea823ea25\" (UID: \"829dc872-61b8-4549-a976-404ea823ea25\") "
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.498266 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "829dc872-61b8-4549-a976-404ea823ea25" (UID: "829dc872-61b8-4549-a976-404ea823ea25"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.498626 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/829dc872-61b8-4549-a976-404ea823ea25-kube-api-access-g8crc" (OuterVolumeSpecName: "kube-api-access-g8crc") pod "829dc872-61b8-4549-a976-404ea823ea25" (UID: "829dc872-61b8-4549-a976-404ea823ea25"). InnerVolumeSpecName "kube-api-access-g8crc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.499885 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-ceph" (OuterVolumeSpecName: "ceph") pod "829dc872-61b8-4549-a976-404ea823ea25" (UID: "829dc872-61b8-4549-a976-404ea823ea25"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.524330 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-inventory" (OuterVolumeSpecName: "inventory") pod "829dc872-61b8-4549-a976-404ea823ea25" (UID: "829dc872-61b8-4549-a976-404ea823ea25"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.545906 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "829dc872-61b8-4549-a976-404ea823ea25" (UID: "829dc872-61b8-4549-a976-404ea823ea25"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.594054 4766 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.594392 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-ceph\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.594552 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8crc\" (UniqueName: \"kubernetes.io/projected/829dc872-61b8-4549-a976-404ea823ea25-kube-api-access-g8crc\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.594709 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-inventory\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.594833 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/829dc872-61b8-4549-a976-404ea823ea25-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.875029 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq" event={"ID":"829dc872-61b8-4549-a976-404ea823ea25","Type":"ContainerDied","Data":"63175b5f42b8a33910ab88bbd05f844b736a41c9db97db533392f85c645967cf"}
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.875296 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63175b5f42b8a33910ab88bbd05f844b736a41c9db97db533392f85c645967cf"
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.875388 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-6x2jq"
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.976688 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-whm5p"]
Oct 02 12:57:46 crc kubenswrapper[4766]: E1002 12:57:46.977967 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="829dc872-61b8-4549-a976-404ea823ea25" containerName="bootstrap-openstack-openstack-cell1"
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.977994 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="829dc872-61b8-4549-a976-404ea823ea25" containerName="bootstrap-openstack-openstack-cell1"
Oct 02 12:57:46 crc kubenswrapper[4766]: E1002 12:57:46.978011 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3ac9da-81ed-40bf-ac52-d219e9c83189" containerName="registry-server"
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.978020 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3ac9da-81ed-40bf-ac52-d219e9c83189" containerName="registry-server"
Oct 02 12:57:46 crc kubenswrapper[4766]: E1002 12:57:46.979038 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3ac9da-81ed-40bf-ac52-d219e9c83189" containerName="extract-utilities"
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.979053 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3ac9da-81ed-40bf-ac52-d219e9c83189" containerName="extract-utilities"
Oct 02 12:57:46 crc kubenswrapper[4766]: E1002 12:57:46.979071 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3ac9da-81ed-40bf-ac52-d219e9c83189" containerName="extract-content"
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.979080 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3ac9da-81ed-40bf-ac52-d219e9c83189" containerName="extract-content"
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.979423 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3ac9da-81ed-40bf-ac52-d219e9c83189" containerName="registry-server"
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.979473 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="829dc872-61b8-4549-a976-404ea823ea25" containerName="bootstrap-openstack-openstack-cell1"
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.983688 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-whm5p"
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.987713 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb"
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.988059 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.988141 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.988409 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Oct 02 12:57:46 crc kubenswrapper[4766]: I1002 12:57:46.994304 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-whm5p"]
Oct 02 12:57:47 crc kubenswrapper[4766]: I1002 12:57:47.127626 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-ceph\") pod \"download-cache-openstack-openstack-cell1-whm5p\" (UID: \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\") " pod="openstack/download-cache-openstack-openstack-cell1-whm5p"
Oct 02 12:57:47 crc kubenswrapper[4766]: I1002 12:57:47.127731 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjsqp\" (UniqueName: \"kubernetes.io/projected/c5b90806-4f3e-49ce-a40a-3a51ee20b419-kube-api-access-gjsqp\") pod \"download-cache-openstack-openstack-cell1-whm5p\" (UID: \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\") " pod="openstack/download-cache-openstack-openstack-cell1-whm5p"
Oct 02 12:57:47 crc kubenswrapper[4766]: I1002 12:57:47.127813 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-ssh-key\") pod \"download-cache-openstack-openstack-cell1-whm5p\" (UID: \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\") " pod="openstack/download-cache-openstack-openstack-cell1-whm5p"
Oct 02 12:57:47 crc kubenswrapper[4766]: I1002 12:57:47.127890 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-inventory\") pod \"download-cache-openstack-openstack-cell1-whm5p\" (UID: \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\") " pod="openstack/download-cache-openstack-openstack-cell1-whm5p"
Oct 02 12:57:47 crc kubenswrapper[4766]: I1002 12:57:47.230357 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-ssh-key\") pod \"download-cache-openstack-openstack-cell1-whm5p\" (UID: \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\") " pod="openstack/download-cache-openstack-openstack-cell1-whm5p"
Oct 02 12:57:47 crc kubenswrapper[4766]: I1002 12:57:47.230468 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-inventory\") pod \"download-cache-openstack-openstack-cell1-whm5p\" (UID: \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\") " pod="openstack/download-cache-openstack-openstack-cell1-whm5p"
Oct 02 12:57:47 crc kubenswrapper[4766]: I1002 12:57:47.230609 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-ceph\") pod \"download-cache-openstack-openstack-cell1-whm5p\" (UID: \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\") " pod="openstack/download-cache-openstack-openstack-cell1-whm5p"
Oct 02 12:57:47 crc kubenswrapper[4766]: I1002 12:57:47.230692 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjsqp\" (UniqueName: \"kubernetes.io/projected/c5b90806-4f3e-49ce-a40a-3a51ee20b419-kube-api-access-gjsqp\") pod \"download-cache-openstack-openstack-cell1-whm5p\" (UID: \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\") " pod="openstack/download-cache-openstack-openstack-cell1-whm5p"
Oct 02 12:57:47 crc kubenswrapper[4766]: I1002 12:57:47.236932 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-inventory\") pod \"download-cache-openstack-openstack-cell1-whm5p\" (UID: \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\") " pod="openstack/download-cache-openstack-openstack-cell1-whm5p"
Oct 02 12:57:47 crc kubenswrapper[4766]: I1002 12:57:47.237218 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-ssh-key\") pod \"download-cache-openstack-openstack-cell1-whm5p\" (UID: \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\") " pod="openstack/download-cache-openstack-openstack-cell1-whm5p"
Oct 02 12:57:47 crc kubenswrapper[4766]: I1002 12:57:47.240034 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-ceph\") pod \"download-cache-openstack-openstack-cell1-whm5p\" (UID: \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\") " pod="openstack/download-cache-openstack-openstack-cell1-whm5p"
Oct 02 12:57:47 crc kubenswrapper[4766]: I1002 12:57:47.248274 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjsqp\" (UniqueName: \"kubernetes.io/projected/c5b90806-4f3e-49ce-a40a-3a51ee20b419-kube-api-access-gjsqp\") pod \"download-cache-openstack-openstack-cell1-whm5p\" (UID: \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\") " pod="openstack/download-cache-openstack-openstack-cell1-whm5p"
Oct 02 12:57:47 crc kubenswrapper[4766]: I1002 12:57:47.324739 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-whm5p"
Oct 02 12:57:47 crc kubenswrapper[4766]: I1002 12:57:47.926126 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-whm5p"]
Oct 02 12:57:48 crc kubenswrapper[4766]: I1002 12:57:48.898582 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-whm5p" event={"ID":"c5b90806-4f3e-49ce-a40a-3a51ee20b419","Type":"ContainerStarted","Data":"ba08965b10580715a9ac0c7721ebb55bd90f1b2b2ef0e71b8110c6acbc764538"}
Oct 02 12:57:48 crc kubenswrapper[4766]: I1002 12:57:48.898872 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-whm5p" event={"ID":"c5b90806-4f3e-49ce-a40a-3a51ee20b419","Type":"ContainerStarted","Data":"6ed9fa0ce6aed50bbc75d4d69caa26b0efed9484beb8b1eeaa7aee68719b946d"}
Oct 02 12:58:33 crc kubenswrapper[4766]: I1002 12:58:33.063451 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-whm5p" podStartSLOduration=46.869120291 podStartE2EDuration="47.063424664s" podCreationTimestamp="2025-10-02 12:57:46 +0000 UTC" firstStartedPulling="2025-10-02 12:57:47.936860047 +0000 UTC m=+7582.879730991" lastFinishedPulling="2025-10-02 12:57:48.13116442 +0000 UTC m=+7583.074035364" observedRunningTime="2025-10-02 12:57:48.916248888 +0000 UTC m=+7583.859119832" watchObservedRunningTime="2025-10-02 12:58:33.063424664 +0000 UTC m=+7628.006295608"
Oct 02 12:58:33 crc kubenswrapper[4766]: I1002 12:58:33.069893 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8wzxs"]
Oct 02 12:58:33 crc kubenswrapper[4766]: I1002 12:58:33.072598 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wzxs"
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:33 crc kubenswrapper[4766]: I1002 12:58:33.089437 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wzxs"] Oct 02 12:58:33 crc kubenswrapper[4766]: I1002 12:58:33.129736 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89150068-e97f-4e8e-9c0d-345275d44504-utilities\") pod \"redhat-marketplace-8wzxs\" (UID: \"89150068-e97f-4e8e-9c0d-345275d44504\") " pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:33 crc kubenswrapper[4766]: I1002 12:58:33.130453 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89150068-e97f-4e8e-9c0d-345275d44504-catalog-content\") pod \"redhat-marketplace-8wzxs\" (UID: \"89150068-e97f-4e8e-9c0d-345275d44504\") " pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:33 crc kubenswrapper[4766]: I1002 12:58:33.130492 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7gqn\" (UniqueName: \"kubernetes.io/projected/89150068-e97f-4e8e-9c0d-345275d44504-kube-api-access-c7gqn\") pod \"redhat-marketplace-8wzxs\" (UID: \"89150068-e97f-4e8e-9c0d-345275d44504\") " pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:33 crc kubenswrapper[4766]: I1002 12:58:33.232747 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89150068-e97f-4e8e-9c0d-345275d44504-catalog-content\") pod \"redhat-marketplace-8wzxs\" (UID: \"89150068-e97f-4e8e-9c0d-345275d44504\") " pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:33 crc kubenswrapper[4766]: I1002 12:58:33.232818 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7gqn\" (UniqueName: \"kubernetes.io/projected/89150068-e97f-4e8e-9c0d-345275d44504-kube-api-access-c7gqn\") pod \"redhat-marketplace-8wzxs\" (UID: \"89150068-e97f-4e8e-9c0d-345275d44504\") " pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:33 crc kubenswrapper[4766]: I1002 12:58:33.233082 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89150068-e97f-4e8e-9c0d-345275d44504-utilities\") pod \"redhat-marketplace-8wzxs\" (UID: \"89150068-e97f-4e8e-9c0d-345275d44504\") " pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:33 crc kubenswrapper[4766]: I1002 12:58:33.233314 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89150068-e97f-4e8e-9c0d-345275d44504-catalog-content\") pod \"redhat-marketplace-8wzxs\" (UID: \"89150068-e97f-4e8e-9c0d-345275d44504\") " pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:33 crc kubenswrapper[4766]: I1002 12:58:33.233676 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89150068-e97f-4e8e-9c0d-345275d44504-utilities\") pod \"redhat-marketplace-8wzxs\" (UID: \"89150068-e97f-4e8e-9c0d-345275d44504\") " pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:33 crc kubenswrapper[4766]: I1002 12:58:33.261574 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-c7gqn\" (UniqueName: \"kubernetes.io/projected/89150068-e97f-4e8e-9c0d-345275d44504-kube-api-access-c7gqn\") pod \"redhat-marketplace-8wzxs\" (UID: \"89150068-e97f-4e8e-9c0d-345275d44504\") " pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:33 crc kubenswrapper[4766]: I1002 12:58:33.405086 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:33 crc kubenswrapper[4766]: I1002 12:58:33.904819 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wzxs"] Oct 02 12:58:34 crc kubenswrapper[4766]: I1002 12:58:34.442285 4766 generic.go:334] "Generic (PLEG): container finished" podID="89150068-e97f-4e8e-9c0d-345275d44504" containerID="342021b467f2a955af1e92912722a2c6b29a9f8b5b208f6c731a2b64b007153a" exitCode=0 Oct 02 12:58:34 crc kubenswrapper[4766]: I1002 12:58:34.442349 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wzxs" event={"ID":"89150068-e97f-4e8e-9c0d-345275d44504","Type":"ContainerDied","Data":"342021b467f2a955af1e92912722a2c6b29a9f8b5b208f6c731a2b64b007153a"} Oct 02 12:58:34 crc kubenswrapper[4766]: I1002 12:58:34.442593 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wzxs" event={"ID":"89150068-e97f-4e8e-9c0d-345275d44504","Type":"ContainerStarted","Data":"1f41927c0ad01e8a3644ee82fdabefd4ed2e5195ded04a8b12c22d485a8f9286"} Oct 02 12:58:36 crc kubenswrapper[4766]: I1002 12:58:36.467248 4766 generic.go:334] "Generic (PLEG): container finished" podID="89150068-e97f-4e8e-9c0d-345275d44504" containerID="7f74b81f540708867b33f5936def8811f39933d70872bbd5ae7983ee4b98b3d2" exitCode=0 Oct 02 12:58:36 crc kubenswrapper[4766]: I1002 12:58:36.467319 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wzxs" event={"ID":"89150068-e97f-4e8e-9c0d-345275d44504","Type":"ContainerDied","Data":"7f74b81f540708867b33f5936def8811f39933d70872bbd5ae7983ee4b98b3d2"} Oct 02 12:58:37 crc kubenswrapper[4766]: I1002 12:58:37.487774 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wzxs" event={"ID":"89150068-e97f-4e8e-9c0d-345275d44504","Type":"ContainerStarted","Data":"cc396634b1581cdf5fd6396e710138899278bd6a0184830297fb89a1eb73daef"} Oct 02 12:58:37 crc kubenswrapper[4766]: I1002 12:58:37.510919 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8wzxs" podStartSLOduration=2.07347571 podStartE2EDuration="4.510897375s" podCreationTimestamp="2025-10-02 12:58:33 +0000 UTC" firstStartedPulling="2025-10-02 12:58:34.447831909 +0000 UTC m=+7629.390702873" lastFinishedPulling="2025-10-02 12:58:36.885253594 +0000 UTC m=+7631.828124538" observedRunningTime="2025-10-02 12:58:37.50701337 +0000 UTC m=+7632.449884324" watchObservedRunningTime="2025-10-02 12:58:37.510897375 +0000 UTC m=+7632.453768319" Oct 02 12:58:43 crc kubenswrapper[4766]: I1002 12:58:43.405822 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:43 crc kubenswrapper[4766]: I1002 12:58:43.406336 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:43 crc kubenswrapper[4766]: I1002 12:58:43.475787 4766 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:43 crc kubenswrapper[4766]: I1002 12:58:43.618784 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:46 crc kubenswrapper[4766]: I1002 12:58:46.455716 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wzxs"] Oct 02 12:58:46 crc kubenswrapper[4766]: I1002 12:58:46.456205 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8wzxs" podUID="89150068-e97f-4e8e-9c0d-345275d44504" containerName="registry-server" containerID="cri-o://cc396634b1581cdf5fd6396e710138899278bd6a0184830297fb89a1eb73daef" gracePeriod=2 Oct 02 12:58:46 crc kubenswrapper[4766]: I1002 12:58:46.603397 4766 generic.go:334] "Generic (PLEG): container finished" podID="89150068-e97f-4e8e-9c0d-345275d44504" containerID="cc396634b1581cdf5fd6396e710138899278bd6a0184830297fb89a1eb73daef" exitCode=0 Oct 02 12:58:46 crc kubenswrapper[4766]: I1002 12:58:46.603815 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wzxs" event={"ID":"89150068-e97f-4e8e-9c0d-345275d44504","Type":"ContainerDied","Data":"cc396634b1581cdf5fd6396e710138899278bd6a0184830297fb89a1eb73daef"} Oct 02 12:58:46 crc kubenswrapper[4766]: I1002 12:58:46.966781 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.036628 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7gqn\" (UniqueName: \"kubernetes.io/projected/89150068-e97f-4e8e-9c0d-345275d44504-kube-api-access-c7gqn\") pod \"89150068-e97f-4e8e-9c0d-345275d44504\" (UID: \"89150068-e97f-4e8e-9c0d-345275d44504\") " Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.036872 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89150068-e97f-4e8e-9c0d-345275d44504-utilities\") pod \"89150068-e97f-4e8e-9c0d-345275d44504\" (UID: \"89150068-e97f-4e8e-9c0d-345275d44504\") " Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.036937 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89150068-e97f-4e8e-9c0d-345275d44504-catalog-content\") pod \"89150068-e97f-4e8e-9c0d-345275d44504\" (UID: \"89150068-e97f-4e8e-9c0d-345275d44504\") " Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.039023 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89150068-e97f-4e8e-9c0d-345275d44504-utilities" (OuterVolumeSpecName: "utilities") pod "89150068-e97f-4e8e-9c0d-345275d44504" (UID: "89150068-e97f-4e8e-9c0d-345275d44504"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.059942 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89150068-e97f-4e8e-9c0d-345275d44504-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89150068-e97f-4e8e-9c0d-345275d44504" (UID: "89150068-e97f-4e8e-9c0d-345275d44504"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.064954 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89150068-e97f-4e8e-9c0d-345275d44504-kube-api-access-c7gqn" (OuterVolumeSpecName: "kube-api-access-c7gqn") pod "89150068-e97f-4e8e-9c0d-345275d44504" (UID: "89150068-e97f-4e8e-9c0d-345275d44504"). InnerVolumeSpecName "kube-api-access-c7gqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.139344 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89150068-e97f-4e8e-9c0d-345275d44504-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.139394 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89150068-e97f-4e8e-9c0d-345275d44504-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.139410 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7gqn\" (UniqueName: \"kubernetes.io/projected/89150068-e97f-4e8e-9c0d-345275d44504-kube-api-access-c7gqn\") on node \"crc\" DevicePath \"\"" Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.615900 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8wzxs" event={"ID":"89150068-e97f-4e8e-9c0d-345275d44504","Type":"ContainerDied","Data":"1f41927c0ad01e8a3644ee82fdabefd4ed2e5195ded04a8b12c22d485a8f9286"} Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.615960 4766 scope.go:117] "RemoveContainer" containerID="cc396634b1581cdf5fd6396e710138899278bd6a0184830297fb89a1eb73daef" Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.616055 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8wzxs" Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.642654 4766 scope.go:117] "RemoveContainer" containerID="7f74b81f540708867b33f5936def8811f39933d70872bbd5ae7983ee4b98b3d2" Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.656649 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wzxs"] Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.668261 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8wzxs"] Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.677796 4766 scope.go:117] "RemoveContainer" containerID="342021b467f2a955af1e92912722a2c6b29a9f8b5b208f6c731a2b64b007153a" Oct 02 12:58:47 crc kubenswrapper[4766]: I1002 12:58:47.901331 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89150068-e97f-4e8e-9c0d-345275d44504" path="/var/lib/kubelet/pods/89150068-e97f-4e8e-9c0d-345275d44504/volumes" Oct 02 12:59:20 crc kubenswrapper[4766]: I1002 12:59:20.943040 4766 generic.go:334] "Generic (PLEG): container finished" podID="c5b90806-4f3e-49ce-a40a-3a51ee20b419" containerID="ba08965b10580715a9ac0c7721ebb55bd90f1b2b2ef0e71b8110c6acbc764538" exitCode=0 Oct 02 12:59:20 crc kubenswrapper[4766]: I1002 12:59:20.943461 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-whm5p" event={"ID":"c5b90806-4f3e-49ce-a40a-3a51ee20b419","Type":"ContainerDied","Data":"ba08965b10580715a9ac0c7721ebb55bd90f1b2b2ef0e71b8110c6acbc764538"} Oct 02 12:59:22 crc kubenswrapper[4766]: I1002 12:59:22.419579 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-whm5p" Oct 02 12:59:22 crc kubenswrapper[4766]: I1002 12:59:22.533845 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-inventory\") pod \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\" (UID: \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\") " Oct 02 12:59:22 crc kubenswrapper[4766]: I1002 12:59:22.533964 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-ceph\") pod \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\" (UID: \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\") " Oct 02 12:59:22 crc kubenswrapper[4766]: I1002 12:59:22.534194 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-ssh-key\") pod \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\" (UID: \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\") " Oct 02 12:59:22 crc kubenswrapper[4766]: I1002 12:59:22.534339 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjsqp\" (UniqueName: \"kubernetes.io/projected/c5b90806-4f3e-49ce-a40a-3a51ee20b419-kube-api-access-gjsqp\") pod \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\" (UID: \"c5b90806-4f3e-49ce-a40a-3a51ee20b419\") " Oct 02 12:59:22 crc kubenswrapper[4766]: I1002 12:59:22.539342 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b90806-4f3e-49ce-a40a-3a51ee20b419-kube-api-access-gjsqp" (OuterVolumeSpecName: "kube-api-access-gjsqp") pod "c5b90806-4f3e-49ce-a40a-3a51ee20b419" (UID: 
"c5b90806-4f3e-49ce-a40a-3a51ee20b419"). InnerVolumeSpecName "kube-api-access-gjsqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 12:59:22 crc kubenswrapper[4766]: I1002 12:59:22.539763 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-ceph" (OuterVolumeSpecName: "ceph") pod "c5b90806-4f3e-49ce-a40a-3a51ee20b419" (UID: "c5b90806-4f3e-49ce-a40a-3a51ee20b419"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:22 crc kubenswrapper[4766]: I1002 12:59:22.568818 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-inventory" (OuterVolumeSpecName: "inventory") pod "c5b90806-4f3e-49ce-a40a-3a51ee20b419" (UID: "c5b90806-4f3e-49ce-a40a-3a51ee20b419"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:22 crc kubenswrapper[4766]: I1002 12:59:22.569955 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c5b90806-4f3e-49ce-a40a-3a51ee20b419" (UID: "c5b90806-4f3e-49ce-a40a-3a51ee20b419"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 12:59:22 crc kubenswrapper[4766]: I1002 12:59:22.637167 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:22 crc kubenswrapper[4766]: I1002 12:59:22.637200 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjsqp\" (UniqueName: \"kubernetes.io/projected/c5b90806-4f3e-49ce-a40a-3a51ee20b419-kube-api-access-gjsqp\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:22 crc kubenswrapper[4766]: I1002 12:59:22.637212 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:22 crc kubenswrapper[4766]: I1002 12:59:22.637223 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c5b90806-4f3e-49ce-a40a-3a51ee20b419-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 12:59:22 crc kubenswrapper[4766]: I1002 12:59:22.968736 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-whm5p" event={"ID":"c5b90806-4f3e-49ce-a40a-3a51ee20b419","Type":"ContainerDied","Data":"6ed9fa0ce6aed50bbc75d4d69caa26b0efed9484beb8b1eeaa7aee68719b946d"} Oct 02 12:59:22 crc kubenswrapper[4766]: I1002 12:59:22.968821 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ed9fa0ce6aed50bbc75d4d69caa26b0efed9484beb8b1eeaa7aee68719b946d" Oct 02 12:59:22 crc kubenswrapper[4766]: I1002 12:59:22.968774 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-whm5p" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.069092 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-pzt5t"] Oct 02 12:59:23 crc kubenswrapper[4766]: E1002 12:59:23.069662 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b90806-4f3e-49ce-a40a-3a51ee20b419" containerName="download-cache-openstack-openstack-cell1" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.069686 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b90806-4f3e-49ce-a40a-3a51ee20b419" containerName="download-cache-openstack-openstack-cell1" Oct 02 12:59:23 crc kubenswrapper[4766]: E1002 12:59:23.069726 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89150068-e97f-4e8e-9c0d-345275d44504" containerName="extract-utilities" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.069736 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="89150068-e97f-4e8e-9c0d-345275d44504" containerName="extract-utilities" Oct 02 12:59:23 crc kubenswrapper[4766]: E1002 12:59:23.069769 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89150068-e97f-4e8e-9c0d-345275d44504" containerName="registry-server" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.069779 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="89150068-e97f-4e8e-9c0d-345275d44504" containerName="registry-server" Oct 02 12:59:23 crc kubenswrapper[4766]: E1002 12:59:23.069790 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89150068-e97f-4e8e-9c0d-345275d44504" containerName="extract-content" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.069799 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="89150068-e97f-4e8e-9c0d-345275d44504" containerName="extract-content" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.070139 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b90806-4f3e-49ce-a40a-3a51ee20b419" containerName="download-cache-openstack-openstack-cell1" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.070176 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="89150068-e97f-4e8e-9c0d-345275d44504" containerName="registry-server" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.071172 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.073823 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.074223 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.074323 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.074406 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.098270 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-pzt5t"] Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.148374 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-ceph\") pod \"configure-network-openstack-openstack-cell1-pzt5t\" (UID: \"286eccce-3f88-4796-b896-cd03ccfc3eba\") " pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.148455 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-inventory\") pod \"configure-network-openstack-openstack-cell1-pzt5t\" (UID: \"286eccce-3f88-4796-b896-cd03ccfc3eba\") " pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.148545 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9m8fn\" (UniqueName: \"kubernetes.io/projected/286eccce-3f88-4796-b896-cd03ccfc3eba-kube-api-access-9m8fn\") pod \"configure-network-openstack-openstack-cell1-pzt5t\" (UID: \"286eccce-3f88-4796-b896-cd03ccfc3eba\") " pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.148606 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-ssh-key\") pod \"configure-network-openstack-openstack-cell1-pzt5t\" (UID: \"286eccce-3f88-4796-b896-cd03ccfc3eba\") " pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.250520 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-ceph\") pod \"configure-network-openstack-openstack-cell1-pzt5t\" (UID: \"286eccce-3f88-4796-b896-cd03ccfc3eba\") " pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.250603 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-inventory\") pod \"configure-network-openstack-openstack-cell1-pzt5t\" (UID: \"286eccce-3f88-4796-b896-cd03ccfc3eba\") " 
pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.250666 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9m8fn\" (UniqueName: \"kubernetes.io/projected/286eccce-3f88-4796-b896-cd03ccfc3eba-kube-api-access-9m8fn\") pod \"configure-network-openstack-openstack-cell1-pzt5t\" (UID: \"286eccce-3f88-4796-b896-cd03ccfc3eba\") " pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.250731 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-ssh-key\") pod \"configure-network-openstack-openstack-cell1-pzt5t\" (UID: \"286eccce-3f88-4796-b896-cd03ccfc3eba\") " pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.257167 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-inventory\") pod \"configure-network-openstack-openstack-cell1-pzt5t\" (UID: \"286eccce-3f88-4796-b896-cd03ccfc3eba\") " pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.257444 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-ssh-key\") pod \"configure-network-openstack-openstack-cell1-pzt5t\" (UID: \"286eccce-3f88-4796-b896-cd03ccfc3eba\") " pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.257574 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-ceph\") pod \"configure-network-openstack-openstack-cell1-pzt5t\" (UID: \"286eccce-3f88-4796-b896-cd03ccfc3eba\") " pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.267203 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9m8fn\" (UniqueName: \"kubernetes.io/projected/286eccce-3f88-4796-b896-cd03ccfc3eba-kube-api-access-9m8fn\") pod \"configure-network-openstack-openstack-cell1-pzt5t\" (UID: \"286eccce-3f88-4796-b896-cd03ccfc3eba\") " pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.402148 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" Oct 02 12:59:23 crc kubenswrapper[4766]: I1002 12:59:23.968984 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-pzt5t"] Oct 02 12:59:24 crc kubenswrapper[4766]: I1002 12:59:24.431717 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:59:24 crc kubenswrapper[4766]: I1002 12:59:24.432031 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 12:59:24 crc kubenswrapper[4766]: I1002 12:59:24.995581 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" event={"ID":"286eccce-3f88-4796-b896-cd03ccfc3eba","Type":"ContainerStarted","Data":"9b388bc9dc10e9bcad21940ecb79758935c8dd4b44c760ecd59ddb4155647b5a"} Oct 02 12:59:24 crc kubenswrapper[4766]: I1002 12:59:24.996145 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" event={"ID":"286eccce-3f88-4796-b896-cd03ccfc3eba","Type":"ContainerStarted","Data":"5d3d6f4cb9f0ec58ca04f39913093fd6e613c31ea6110d6794fa91b2b0dd87cd"} Oct 02 12:59:25 crc kubenswrapper[4766]: I1002 12:59:25.025967 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" podStartSLOduration=1.826157212 podStartE2EDuration="2.025941231s" podCreationTimestamp="2025-10-02 12:59:23 +0000 UTC" firstStartedPulling="2025-10-02 12:59:23.97602085 +0000 UTC m=+7678.918891804" lastFinishedPulling="2025-10-02 12:59:24.175804879 +0000 UTC m=+7679.118675823" observedRunningTime="2025-10-02 12:59:25.020850408 +0000 UTC m=+7679.963721422" watchObservedRunningTime="2025-10-02 12:59:25.025941231 +0000 UTC m=+7679.968812215" Oct 02 12:59:41 crc kubenswrapper[4766]: I1002 12:59:41.087936 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-znnhl"] Oct 02 12:59:41 crc kubenswrapper[4766]: I1002 12:59:41.093836 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 12:59:41 crc kubenswrapper[4766]: I1002 12:59:41.104551 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-znnhl"] Oct 02 12:59:41 crc kubenswrapper[4766]: I1002 12:59:41.281681 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x2fq\" (UniqueName: \"kubernetes.io/projected/37a291ba-8396-4ad0-8a38-7941858594ec-kube-api-access-9x2fq\") pod \"redhat-operators-znnhl\" (UID: \"37a291ba-8396-4ad0-8a38-7941858594ec\") " pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 12:59:41 crc kubenswrapper[4766]: I1002 12:59:41.282215 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37a291ba-8396-4ad0-8a38-7941858594ec-utilities\") pod \"redhat-operators-znnhl\" (UID: \"37a291ba-8396-4ad0-8a38-7941858594ec\") " pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 12:59:41 crc kubenswrapper[4766]: I1002 12:59:41.282349 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37a291ba-8396-4ad0-8a38-7941858594ec-catalog-content\") pod \"redhat-operators-znnhl\" (UID: \"37a291ba-8396-4ad0-8a38-7941858594ec\") " pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 12:59:41 crc kubenswrapper[4766]: I1002 12:59:41.384449 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37a291ba-8396-4ad0-8a38-7941858594ec-catalog-content\") pod \"redhat-operators-znnhl\" (UID: \"37a291ba-8396-4ad0-8a38-7941858594ec\") " pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 12:59:41 crc kubenswrapper[4766]: I1002 12:59:41.384594 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x2fq\" (UniqueName: \"kubernetes.io/projected/37a291ba-8396-4ad0-8a38-7941858594ec-kube-api-access-9x2fq\") pod \"redhat-operators-znnhl\" (UID: \"37a291ba-8396-4ad0-8a38-7941858594ec\") " pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 12:59:41 crc kubenswrapper[4766]: I1002 12:59:41.384757 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37a291ba-8396-4ad0-8a38-7941858594ec-utilities\") pod \"redhat-operators-znnhl\" (UID: \"37a291ba-8396-4ad0-8a38-7941858594ec\") " pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 12:59:41 crc kubenswrapper[4766]: I1002 12:59:41.385613 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37a291ba-8396-4ad0-8a38-7941858594ec-utilities\") pod \"redhat-operators-znnhl\" (UID: \"37a291ba-8396-4ad0-8a38-7941858594ec\") " pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 12:59:41 crc kubenswrapper[4766]: I1002 12:59:41.385825 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37a291ba-8396-4ad0-8a38-7941858594ec-catalog-content\") pod \"redhat-operators-znnhl\" (UID: \"37a291ba-8396-4ad0-8a38-7941858594ec\") " pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 12:59:41 crc kubenswrapper[4766]: I1002 12:59:41.406955 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9x2fq\" (UniqueName: \"kubernetes.io/projected/37a291ba-8396-4ad0-8a38-7941858594ec-kube-api-access-9x2fq\") pod \"redhat-operators-znnhl\" (UID: \"37a291ba-8396-4ad0-8a38-7941858594ec\") " pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 12:59:41 crc kubenswrapper[4766]: I1002 12:59:41.431690 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 12:59:41 crc kubenswrapper[4766]: I1002 12:59:41.929323 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-znnhl"] Oct 02 12:59:42 crc kubenswrapper[4766]: I1002 12:59:42.198834 4766 generic.go:334] "Generic (PLEG): container finished" podID="37a291ba-8396-4ad0-8a38-7941858594ec" containerID="93731be35e5139b5a5214e4fb40c3629188cc554e7b509c34e0d651b6c34c000" exitCode=0 Oct 02 12:59:42 crc kubenswrapper[4766]: I1002 12:59:42.199161 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znnhl" event={"ID":"37a291ba-8396-4ad0-8a38-7941858594ec","Type":"ContainerDied","Data":"93731be35e5139b5a5214e4fb40c3629188cc554e7b509c34e0d651b6c34c000"} Oct 02 12:59:42 crc kubenswrapper[4766]: I1002 12:59:42.199315 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znnhl" event={"ID":"37a291ba-8396-4ad0-8a38-7941858594ec","Type":"ContainerStarted","Data":"ef36be2c1bc478699ebbeec1ce962ec12d61aa09656440531d6f807d91257284"} Oct 02 12:59:42 crc kubenswrapper[4766]: I1002 12:59:42.202944 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 12:59:44 crc kubenswrapper[4766]: I1002 12:59:44.228589 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znnhl" event={"ID":"37a291ba-8396-4ad0-8a38-7941858594ec","Type":"ContainerStarted","Data":"fb01238a1a6ad9552c775a9a43ee3cadebcb2368b588f1ece2e19fdeae09b4c6"} Oct 02 12:59:49 crc kubenswrapper[4766]: I1002 12:59:49.288848 4766 generic.go:334] "Generic (PLEG): container finished" podID="37a291ba-8396-4ad0-8a38-7941858594ec" containerID="fb01238a1a6ad9552c775a9a43ee3cadebcb2368b588f1ece2e19fdeae09b4c6" exitCode=0 Oct 02 12:59:49 crc kubenswrapper[4766]: I1002 12:59:49.288869 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znnhl" event={"ID":"37a291ba-8396-4ad0-8a38-7941858594ec","Type":"ContainerDied","Data":"fb01238a1a6ad9552c775a9a43ee3cadebcb2368b588f1ece2e19fdeae09b4c6"} Oct 02 12:59:50 crc kubenswrapper[4766]: I1002 12:59:50.302773 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znnhl" event={"ID":"37a291ba-8396-4ad0-8a38-7941858594ec","Type":"ContainerStarted","Data":"f701f8c074d2bde085ffde3b24e0dff0f46528c6cc41dcf187c72af2ff5bb845"} Oct 02 12:59:50 crc kubenswrapper[4766]: I1002 12:59:50.336908 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-znnhl" podStartSLOduration=1.7237109739999998 podStartE2EDuration="9.336872919s" podCreationTimestamp="2025-10-02 12:59:41 +0000 UTC" firstStartedPulling="2025-10-02 12:59:42.202632543 +0000 UTC m=+7697.145503487" lastFinishedPulling="2025-10-02 12:59:49.815794478 +0000 UTC m=+7704.758665432" observedRunningTime="2025-10-02 12:59:50.323853001 +0000 UTC m=+7705.266723945" watchObservedRunningTime="2025-10-02 12:59:50.336872919 +0000 UTC m=+7705.279743903" Oct 02 12:59:51 crc 
kubenswrapper[4766]: I1002 12:59:51.432626 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 12:59:51 crc kubenswrapper[4766]: I1002 12:59:51.433841 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 12:59:52 crc kubenswrapper[4766]: I1002 12:59:52.485430 4766 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-znnhl" podUID="37a291ba-8396-4ad0-8a38-7941858594ec" containerName="registry-server" probeResult="failure" output=< Oct 02 12:59:52 crc kubenswrapper[4766]: timeout: failed to connect service ":50051" within 1s Oct 02 12:59:52 crc kubenswrapper[4766]: > Oct 02 12:59:54 crc kubenswrapper[4766]: I1002 12:59:54.432276 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 12:59:54 crc kubenswrapper[4766]: I1002 12:59:54.432983 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
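The multi-line startup-probe output above, "timeout: failed to connect service \":50051\" within 1s", reads like an exec-style health probe run against the registry-server's gRPC port; that interpretation is an assumption, since the pod spec is not part of this log. The sketch below approximates only the connection step with a plain TCP dial and exits non-zero on failure, which is how an exec probe signals "unhealthy" to the kubelet:

    // tcp_probe.go: approximate the connection check behind the startup-probe
    // failure above. The real probe is assumed to be an exec probe against the
    // registry-server gRPC port; this only tests TCP reachability.
    package main

    import (
    	"fmt"
    	"net"
    	"os"
    	"time"
    )

    func main() {
    	addr, timeout := "127.0.0.1:50051", time.Second
    	conn, err := net.DialTimeout("tcp", addr, timeout)
    	if err != nil {
    		fmt.Printf("timeout: failed to connect service %q within %s: %v\n", addr, timeout, err)
    		os.Exit(1) // a non-zero exit is what the kubelet records as a failed probe
    	}
    	conn.Close()
    }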
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l" Oct 02 13:00:00 crc kubenswrapper[4766]: I1002 13:00:00.155328 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 13:00:00 crc kubenswrapper[4766]: I1002 13:00:00.156220 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 13:00:00 crc kubenswrapper[4766]: I1002 13:00:00.161755 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l"] Oct 02 13:00:00 crc kubenswrapper[4766]: I1002 13:00:00.270797 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cds8\" (UniqueName: \"kubernetes.io/projected/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-kube-api-access-8cds8\") pod \"collect-profiles-29323500-54n7l\" (UID: \"97c6c516-5f1b-4ec2-99d3-9248472ce8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l" Oct 02 13:00:00 crc kubenswrapper[4766]: I1002 13:00:00.270845 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-secret-volume\") pod \"collect-profiles-29323500-54n7l\" (UID: \"97c6c516-5f1b-4ec2-99d3-9248472ce8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l" Oct 02 13:00:00 crc kubenswrapper[4766]: I1002 13:00:00.270889 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-config-volume\") pod \"collect-profiles-29323500-54n7l\" (UID: \"97c6c516-5f1b-4ec2-99d3-9248472ce8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l" Oct 02 13:00:00 crc kubenswrapper[4766]: I1002 13:00:00.372864 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cds8\" (UniqueName: \"kubernetes.io/projected/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-kube-api-access-8cds8\") pod \"collect-profiles-29323500-54n7l\" (UID: \"97c6c516-5f1b-4ec2-99d3-9248472ce8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l" Oct 02 13:00:00 crc kubenswrapper[4766]: I1002 13:00:00.372932 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-secret-volume\") pod \"collect-profiles-29323500-54n7l\" (UID: \"97c6c516-5f1b-4ec2-99d3-9248472ce8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l" Oct 02 13:00:00 crc kubenswrapper[4766]: I1002 13:00:00.372972 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-config-volume\") pod \"collect-profiles-29323500-54n7l\" (UID: \"97c6c516-5f1b-4ec2-99d3-9248472ce8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l" Oct 02 13:00:00 crc kubenswrapper[4766]: I1002 13:00:00.374093 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-config-volume\") pod 
\"collect-profiles-29323500-54n7l\" (UID: \"97c6c516-5f1b-4ec2-99d3-9248472ce8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l" Oct 02 13:00:00 crc kubenswrapper[4766]: I1002 13:00:00.380863 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-secret-volume\") pod \"collect-profiles-29323500-54n7l\" (UID: \"97c6c516-5f1b-4ec2-99d3-9248472ce8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l" Oct 02 13:00:00 crc kubenswrapper[4766]: I1002 13:00:00.399712 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cds8\" (UniqueName: \"kubernetes.io/projected/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-kube-api-access-8cds8\") pod \"collect-profiles-29323500-54n7l\" (UID: \"97c6c516-5f1b-4ec2-99d3-9248472ce8d2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l" Oct 02 13:00:00 crc kubenswrapper[4766]: I1002 13:00:00.482715 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l" Oct 02 13:00:01 crc kubenswrapper[4766]: I1002 13:00:01.003853 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l"] Oct 02 13:00:01 crc kubenswrapper[4766]: W1002 13:00:01.009857 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97c6c516_5f1b_4ec2_99d3_9248472ce8d2.slice/crio-ee7c10673bff2d2222872ff0989cd4f0b67b515a74e7ae9044dafaf4c9d1a3c8 WatchSource:0}: Error finding container ee7c10673bff2d2222872ff0989cd4f0b67b515a74e7ae9044dafaf4c9d1a3c8: Status 404 returned error can't find the container with id ee7c10673bff2d2222872ff0989cd4f0b67b515a74e7ae9044dafaf4c9d1a3c8 Oct 02 13:00:01 crc kubenswrapper[4766]: I1002 13:00:01.440884 4766 generic.go:334] "Generic (PLEG): container finished" podID="97c6c516-5f1b-4ec2-99d3-9248472ce8d2" containerID="d3d67e0f96e0cb79bfab5fcacfec7ba3d0d46bfa2666c7dc2c793f61e66b1b7e" exitCode=0 Oct 02 13:00:01 crc kubenswrapper[4766]: I1002 13:00:01.441151 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l" event={"ID":"97c6c516-5f1b-4ec2-99d3-9248472ce8d2","Type":"ContainerDied","Data":"d3d67e0f96e0cb79bfab5fcacfec7ba3d0d46bfa2666c7dc2c793f61e66b1b7e"} Oct 02 13:00:01 crc kubenswrapper[4766]: I1002 13:00:01.441177 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l" event={"ID":"97c6c516-5f1b-4ec2-99d3-9248472ce8d2","Type":"ContainerStarted","Data":"ee7c10673bff2d2222872ff0989cd4f0b67b515a74e7ae9044dafaf4c9d1a3c8"} Oct 02 13:00:01 crc kubenswrapper[4766]: I1002 13:00:01.486339 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 13:00:01 crc kubenswrapper[4766]: I1002 13:00:01.540042 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 13:00:01 crc kubenswrapper[4766]: I1002 13:00:01.722099 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-znnhl"] Oct 02 13:00:02 crc kubenswrapper[4766]: I1002 13:00:02.852922 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l" Oct 02 13:00:03 crc kubenswrapper[4766]: I1002 13:00:03.038761 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cds8\" (UniqueName: \"kubernetes.io/projected/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-kube-api-access-8cds8\") pod \"97c6c516-5f1b-4ec2-99d3-9248472ce8d2\" (UID: \"97c6c516-5f1b-4ec2-99d3-9248472ce8d2\") " Oct 02 13:00:03 crc kubenswrapper[4766]: I1002 13:00:03.039300 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-secret-volume\") pod \"97c6c516-5f1b-4ec2-99d3-9248472ce8d2\" (UID: \"97c6c516-5f1b-4ec2-99d3-9248472ce8d2\") " Oct 02 13:00:03 crc kubenswrapper[4766]: I1002 13:00:03.039395 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-config-volume\") pod \"97c6c516-5f1b-4ec2-99d3-9248472ce8d2\" (UID: \"97c6c516-5f1b-4ec2-99d3-9248472ce8d2\") " Oct 02 13:00:03 crc kubenswrapper[4766]: I1002 13:00:03.040241 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-config-volume" (OuterVolumeSpecName: "config-volume") pod "97c6c516-5f1b-4ec2-99d3-9248472ce8d2" (UID: "97c6c516-5f1b-4ec2-99d3-9248472ce8d2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:00:03 crc kubenswrapper[4766]: I1002 13:00:03.041759 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:03 crc kubenswrapper[4766]: I1002 13:00:03.045938 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-kube-api-access-8cds8" (OuterVolumeSpecName: "kube-api-access-8cds8") pod "97c6c516-5f1b-4ec2-99d3-9248472ce8d2" (UID: "97c6c516-5f1b-4ec2-99d3-9248472ce8d2"). InnerVolumeSpecName "kube-api-access-8cds8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:00:03 crc kubenswrapper[4766]: I1002 13:00:03.046449 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "97c6c516-5f1b-4ec2-99d3-9248472ce8d2" (UID: "97c6c516-5f1b-4ec2-99d3-9248472ce8d2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:00:03 crc kubenswrapper[4766]: I1002 13:00:03.144331 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:03 crc kubenswrapper[4766]: I1002 13:00:03.144379 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cds8\" (UniqueName: \"kubernetes.io/projected/97c6c516-5f1b-4ec2-99d3-9248472ce8d2-kube-api-access-8cds8\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:03 crc kubenswrapper[4766]: I1002 13:00:03.463927 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l" event={"ID":"97c6c516-5f1b-4ec2-99d3-9248472ce8d2","Type":"ContainerDied","Data":"ee7c10673bff2d2222872ff0989cd4f0b67b515a74e7ae9044dafaf4c9d1a3c8"} Oct 02 13:00:03 crc kubenswrapper[4766]: I1002 13:00:03.463976 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee7c10673bff2d2222872ff0989cd4f0b67b515a74e7ae9044dafaf4c9d1a3c8" Oct 02 13:00:03 crc kubenswrapper[4766]: I1002 13:00:03.463993 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323500-54n7l" Oct 02 13:00:03 crc kubenswrapper[4766]: I1002 13:00:03.464062 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-znnhl" podUID="37a291ba-8396-4ad0-8a38-7941858594ec" containerName="registry-server" containerID="cri-o://f701f8c074d2bde085ffde3b24e0dff0f46528c6cc41dcf187c72af2ff5bb845" gracePeriod=2 Oct 02 13:00:03 crc kubenswrapper[4766]: I1002 13:00:03.938475 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb"] Oct 02 13:00:03 crc kubenswrapper[4766]: I1002 13:00:03.947425 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323455-69xxb"] Oct 02 13:00:03 crc kubenswrapper[4766]: I1002 13:00:03.989163 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.163605 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37a291ba-8396-4ad0-8a38-7941858594ec-utilities\") pod \"37a291ba-8396-4ad0-8a38-7941858594ec\" (UID: \"37a291ba-8396-4ad0-8a38-7941858594ec\") " Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.163750 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37a291ba-8396-4ad0-8a38-7941858594ec-catalog-content\") pod \"37a291ba-8396-4ad0-8a38-7941858594ec\" (UID: \"37a291ba-8396-4ad0-8a38-7941858594ec\") " Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.163928 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9x2fq\" (UniqueName: \"kubernetes.io/projected/37a291ba-8396-4ad0-8a38-7941858594ec-kube-api-access-9x2fq\") pod \"37a291ba-8396-4ad0-8a38-7941858594ec\" (UID: \"37a291ba-8396-4ad0-8a38-7941858594ec\") " Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.164687 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37a291ba-8396-4ad0-8a38-7941858594ec-utilities" (OuterVolumeSpecName: "utilities") pod "37a291ba-8396-4ad0-8a38-7941858594ec" (UID: "37a291ba-8396-4ad0-8a38-7941858594ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.168093 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a291ba-8396-4ad0-8a38-7941858594ec-kube-api-access-9x2fq" (OuterVolumeSpecName: "kube-api-access-9x2fq") pod "37a291ba-8396-4ad0-8a38-7941858594ec" (UID: "37a291ba-8396-4ad0-8a38-7941858594ec"). InnerVolumeSpecName "kube-api-access-9x2fq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.254776 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37a291ba-8396-4ad0-8a38-7941858594ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37a291ba-8396-4ad0-8a38-7941858594ec" (UID: "37a291ba-8396-4ad0-8a38-7941858594ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.267164 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37a291ba-8396-4ad0-8a38-7941858594ec-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.267207 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9x2fq\" (UniqueName: \"kubernetes.io/projected/37a291ba-8396-4ad0-8a38-7941858594ec-kube-api-access-9x2fq\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.267217 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37a291ba-8396-4ad0-8a38-7941858594ec-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.480617 4766 generic.go:334] "Generic (PLEG): container finished" podID="37a291ba-8396-4ad0-8a38-7941858594ec" containerID="f701f8c074d2bde085ffde3b24e0dff0f46528c6cc41dcf187c72af2ff5bb845" exitCode=0 Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.480687 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-znnhl" Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.480699 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znnhl" event={"ID":"37a291ba-8396-4ad0-8a38-7941858594ec","Type":"ContainerDied","Data":"f701f8c074d2bde085ffde3b24e0dff0f46528c6cc41dcf187c72af2ff5bb845"} Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.481142 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-znnhl" event={"ID":"37a291ba-8396-4ad0-8a38-7941858594ec","Type":"ContainerDied","Data":"ef36be2c1bc478699ebbeec1ce962ec12d61aa09656440531d6f807d91257284"} Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.481164 4766 scope.go:117] "RemoveContainer" containerID="f701f8c074d2bde085ffde3b24e0dff0f46528c6cc41dcf187c72af2ff5bb845" Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.512852 4766 scope.go:117] "RemoveContainer" containerID="fb01238a1a6ad9552c775a9a43ee3cadebcb2368b588f1ece2e19fdeae09b4c6" Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.516050 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-znnhl"] Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.526718 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-znnhl"] Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.544374 4766 scope.go:117] "RemoveContainer" containerID="93731be35e5139b5a5214e4fb40c3629188cc554e7b509c34e0d651b6c34c000" Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.602516 4766 scope.go:117] "RemoveContainer" containerID="f701f8c074d2bde085ffde3b24e0dff0f46528c6cc41dcf187c72af2ff5bb845" Oct 02 13:00:04 crc kubenswrapper[4766]: E1002 13:00:04.603080 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f701f8c074d2bde085ffde3b24e0dff0f46528c6cc41dcf187c72af2ff5bb845\": container with ID starting with f701f8c074d2bde085ffde3b24e0dff0f46528c6cc41dcf187c72af2ff5bb845 not found: ID does not exist" containerID="f701f8c074d2bde085ffde3b24e0dff0f46528c6cc41dcf187c72af2ff5bb845" Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.603116 4766 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f701f8c074d2bde085ffde3b24e0dff0f46528c6cc41dcf187c72af2ff5bb845"} err="failed to get container status \"f701f8c074d2bde085ffde3b24e0dff0f46528c6cc41dcf187c72af2ff5bb845\": rpc error: code = NotFound desc = could not find container \"f701f8c074d2bde085ffde3b24e0dff0f46528c6cc41dcf187c72af2ff5bb845\": container with ID starting with f701f8c074d2bde085ffde3b24e0dff0f46528c6cc41dcf187c72af2ff5bb845 not found: ID does not exist" Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.603141 4766 scope.go:117] "RemoveContainer" containerID="fb01238a1a6ad9552c775a9a43ee3cadebcb2368b588f1ece2e19fdeae09b4c6" Oct 02 13:00:04 crc kubenswrapper[4766]: E1002 13:00:04.603653 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb01238a1a6ad9552c775a9a43ee3cadebcb2368b588f1ece2e19fdeae09b4c6\": container with ID starting with fb01238a1a6ad9552c775a9a43ee3cadebcb2368b588f1ece2e19fdeae09b4c6 not found: ID does not exist" containerID="fb01238a1a6ad9552c775a9a43ee3cadebcb2368b588f1ece2e19fdeae09b4c6" Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.603705 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb01238a1a6ad9552c775a9a43ee3cadebcb2368b588f1ece2e19fdeae09b4c6"} err="failed to get container status \"fb01238a1a6ad9552c775a9a43ee3cadebcb2368b588f1ece2e19fdeae09b4c6\": rpc error: code = NotFound desc = could not find container \"fb01238a1a6ad9552c775a9a43ee3cadebcb2368b588f1ece2e19fdeae09b4c6\": container with ID starting with fb01238a1a6ad9552c775a9a43ee3cadebcb2368b588f1ece2e19fdeae09b4c6 not found: ID does not exist" Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.603742 4766 scope.go:117] "RemoveContainer" containerID="93731be35e5139b5a5214e4fb40c3629188cc554e7b509c34e0d651b6c34c000" Oct 02 13:00:04 crc kubenswrapper[4766]: E1002 13:00:04.604140 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93731be35e5139b5a5214e4fb40c3629188cc554e7b509c34e0d651b6c34c000\": container with ID starting with 93731be35e5139b5a5214e4fb40c3629188cc554e7b509c34e0d651b6c34c000 not found: ID does not exist" containerID="93731be35e5139b5a5214e4fb40c3629188cc554e7b509c34e0d651b6c34c000" Oct 02 13:00:04 crc kubenswrapper[4766]: I1002 13:00:04.604197 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93731be35e5139b5a5214e4fb40c3629188cc554e7b509c34e0d651b6c34c000"} err="failed to get container status \"93731be35e5139b5a5214e4fb40c3629188cc554e7b509c34e0d651b6c34c000\": rpc error: code = NotFound desc = could not find container \"93731be35e5139b5a5214e4fb40c3629188cc554e7b509c34e0d651b6c34c000\": container with ID starting with 93731be35e5139b5a5214e4fb40c3629188cc554e7b509c34e0d651b6c34c000 not found: ID does not exist" Oct 02 13:00:05 crc kubenswrapper[4766]: I1002 13:00:05.913229 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a291ba-8396-4ad0-8a38-7941858594ec" path="/var/lib/kubelet/pods/37a291ba-8396-4ad0-8a38-7941858594ec/volumes" Oct 02 13:00:05 crc kubenswrapper[4766]: I1002 13:00:05.915851 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7baeb498-8c0f-497f-a80e-58c405c1d32f" path="/var/lib/kubelet/pods/7baeb498-8c0f-497f-a80e-58c405c1d32f/volumes" Oct 02 13:00:24 crc kubenswrapper[4766]: I1002 
13:00:24.432646 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:00:24 crc kubenswrapper[4766]: I1002 13:00:24.433090 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:00:24 crc kubenswrapper[4766]: I1002 13:00:24.433140 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 13:00:24 crc kubenswrapper[4766]: I1002 13:00:24.433982 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:00:24 crc kubenswrapper[4766]: I1002 13:00:24.434035 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" gracePeriod=600 Oct 02 13:00:24 crc kubenswrapper[4766]: E1002 13:00:24.559396 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:00:24 crc kubenswrapper[4766]: I1002 13:00:24.709759 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" exitCode=0 Oct 02 13:00:24 crc kubenswrapper[4766]: I1002 13:00:24.709840 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf"} Oct 02 13:00:24 crc kubenswrapper[4766]: I1002 13:00:24.709906 4766 scope.go:117] "RemoveContainer" containerID="653354b49be3bc2afc4b9e41e295aa3b09fbf84e205a578a42bfba4c372df755" Oct 02 13:00:24 crc kubenswrapper[4766]: I1002 13:00:24.711101 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf"
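The repeated "CrashLoopBackOff: back-off 5m0s restarting failed container" errors above show the kubelet's restart throttle at its ceiling: each failed restart roughly doubles the wait before the next attempt, capped at five minutes. Only the 5m0s cap appears in this log; the 10s initial delay and the doubling are the commonly cited kubelet defaults and are an assumption in the sketch below, which just prints the resulting schedule:

    // crashloop_backoff.go: illustrate the delay schedule behind the
    // CrashLoopBackOff records above. 5m cap from the log; 10s start and
    // doubling are assumed defaults, not read from this log.
    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	delay, max := 10*time.Second, 5*time.Minute
    	for restart := 1; restart <= 7; restart++ {
    		fmt.Printf("restart %d: wait %s\n", restart, delay)
    		delay *= 2
    		if delay > max {
    			delay = max // "back-off 5m0s restarting failed container"
    		}
    	}
    }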
pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:00:34 crc kubenswrapper[4766]: I1002 13:00:34.881736 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:00:34 crc kubenswrapper[4766]: E1002 13:00:34.882650 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:00:42 crc kubenswrapper[4766]: I1002 13:00:42.084834 4766 scope.go:117] "RemoveContainer" containerID="0823cb18d3c479a33ba38d82fa830a0c494a8e538aed2aac68f31210bd4ec663" Oct 02 13:00:43 crc kubenswrapper[4766]: I1002 13:00:43.953114 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" event={"ID":"286eccce-3f88-4796-b896-cd03ccfc3eba","Type":"ContainerDied","Data":"9b388bc9dc10e9bcad21940ecb79758935c8dd4b44c760ecd59ddb4155647b5a"} Oct 02 13:00:43 crc kubenswrapper[4766]: I1002 13:00:43.953213 4766 generic.go:334] "Generic (PLEG): container finished" podID="286eccce-3f88-4796-b896-cd03ccfc3eba" containerID="9b388bc9dc10e9bcad21940ecb79758935c8dd4b44c760ecd59ddb4155647b5a" exitCode=0 Oct 02 13:00:45 crc kubenswrapper[4766]: I1002 13:00:45.439923 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" Oct 02 13:00:45 crc kubenswrapper[4766]: I1002 13:00:45.466023 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9m8fn\" (UniqueName: \"kubernetes.io/projected/286eccce-3f88-4796-b896-cd03ccfc3eba-kube-api-access-9m8fn\") pod \"286eccce-3f88-4796-b896-cd03ccfc3eba\" (UID: \"286eccce-3f88-4796-b896-cd03ccfc3eba\") " Oct 02 13:00:45 crc kubenswrapper[4766]: I1002 13:00:45.466135 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-ssh-key\") pod \"286eccce-3f88-4796-b896-cd03ccfc3eba\" (UID: \"286eccce-3f88-4796-b896-cd03ccfc3eba\") " Oct 02 13:00:45 crc kubenswrapper[4766]: I1002 13:00:45.466794 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-inventory\") pod \"286eccce-3f88-4796-b896-cd03ccfc3eba\" (UID: \"286eccce-3f88-4796-b896-cd03ccfc3eba\") " Oct 02 13:00:45 crc kubenswrapper[4766]: I1002 13:00:45.466837 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-ceph\") pod \"286eccce-3f88-4796-b896-cd03ccfc3eba\" (UID: \"286eccce-3f88-4796-b896-cd03ccfc3eba\") " Oct 02 13:00:45 crc kubenswrapper[4766]: I1002 13:00:45.479850 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-ceph" (OuterVolumeSpecName: "ceph") pod "286eccce-3f88-4796-b896-cd03ccfc3eba" (UID: "286eccce-3f88-4796-b896-cd03ccfc3eba"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
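
The records above trace the kubelet's complete liveness-failure path for machine-config-daemon-l99lx: patch_prober logs the refused connection on 127.0.0.1:8798, prober.go reports the failed Liveness probe, the SyncLoop marks the pod unhealthy, kuberuntime kills the container with a 600-second grace period, and pod_workers then rejects every restart attempt for the 5m0s CrashLoopBackOff window. Below is a minimal sketch for tallying these records from a dump like this one; the "kubelet.log" path is hypothetical, and the regexes are keyed to the klog fields exactly as they appear in these records.

```python
import re
from collections import Counter

# Tally liveness-probe failures and CrashLoopBackOff rejections per pod.
# "kubelet.log" is a hypothetical capture of this journal; the patterns
# match the quoted klog fields as printed in the records above.
PROBE = re.compile(r'"Probe failed" probeType="Liveness" pod="([^"]+)"')
BACKOFF = re.compile(r'with CrashLoopBackOff: .*?" pod="([^"]+)"')

probes, backoffs = Counter(), Counter()
with open("kubelet.log", encoding="utf-8") as fh:
    for line in fh:
        probes.update(m.group(1) for m in PROBE.finditer(line))
        backoffs.update(m.group(1) for m in BACKOFF.finditer(line))

for pod, n in backoffs.most_common():
    print(f"{pod}: {n} back-off rejections, {probes[pod]} probe failures")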
Oct 02 13:00:45 crc kubenswrapper[4766]: I1002 13:00:45.480256 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/286eccce-3f88-4796-b896-cd03ccfc3eba-kube-api-access-9m8fn" (OuterVolumeSpecName: "kube-api-access-9m8fn") pod "286eccce-3f88-4796-b896-cd03ccfc3eba" (UID: "286eccce-3f88-4796-b896-cd03ccfc3eba"). InnerVolumeSpecName "kube-api-access-9m8fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:00:45 crc kubenswrapper[4766]: I1002 13:00:45.505284 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-inventory" (OuterVolumeSpecName: "inventory") pod "286eccce-3f88-4796-b896-cd03ccfc3eba" (UID: "286eccce-3f88-4796-b896-cd03ccfc3eba"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:00:45 crc kubenswrapper[4766]: I1002 13:00:45.520862 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "286eccce-3f88-4796-b896-cd03ccfc3eba" (UID: "286eccce-3f88-4796-b896-cd03ccfc3eba"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:00:45 crc kubenswrapper[4766]: I1002 13:00:45.569590 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:45 crc kubenswrapper[4766]: I1002 13:00:45.569642 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:45 crc kubenswrapper[4766]: I1002 13:00:45.569658 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9m8fn\" (UniqueName: \"kubernetes.io/projected/286eccce-3f88-4796-b896-cd03ccfc3eba-kube-api-access-9m8fn\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:45 crc kubenswrapper[4766]: I1002 13:00:45.569671 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/286eccce-3f88-4796-b896-cd03ccfc3eba-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:45 crc kubenswrapper[4766]: I1002 13:00:45.983973 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" event={"ID":"286eccce-3f88-4796-b896-cd03ccfc3eba","Type":"ContainerDied","Data":"5d3d6f4cb9f0ec58ca04f39913093fd6e613c31ea6110d6794fa91b2b0dd87cd"} Oct 02 13:00:45 crc kubenswrapper[4766]: I1002 13:00:45.984030 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d3d6f4cb9f0ec58ca04f39913093fd6e613c31ea6110d6794fa91b2b0dd87cd" Oct 02 13:00:45 crc kubenswrapper[4766]: I1002 13:00:45.984057 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-pzt5t" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.074564 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-46x8b"] Oct 02 13:00:46 crc kubenswrapper[4766]: E1002 13:00:46.075190 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a291ba-8396-4ad0-8a38-7941858594ec" containerName="registry-server" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.075218 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a291ba-8396-4ad0-8a38-7941858594ec" containerName="registry-server" Oct 02 13:00:46 crc kubenswrapper[4766]: E1002 13:00:46.075238 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a291ba-8396-4ad0-8a38-7941858594ec" containerName="extract-utilities" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.075248 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a291ba-8396-4ad0-8a38-7941858594ec" containerName="extract-utilities" Oct 02 13:00:46 crc kubenswrapper[4766]: E1002 13:00:46.075266 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a291ba-8396-4ad0-8a38-7941858594ec" containerName="extract-content" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.075274 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a291ba-8396-4ad0-8a38-7941858594ec" containerName="extract-content" Oct 02 13:00:46 crc kubenswrapper[4766]: E1002 13:00:46.075304 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c6c516-5f1b-4ec2-99d3-9248472ce8d2" containerName="collect-profiles" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.075312 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c6c516-5f1b-4ec2-99d3-9248472ce8d2" containerName="collect-profiles" Oct 02 13:00:46 crc kubenswrapper[4766]: E1002 13:00:46.075337 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="286eccce-3f88-4796-b896-cd03ccfc3eba" containerName="configure-network-openstack-openstack-cell1" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.075348 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="286eccce-3f88-4796-b896-cd03ccfc3eba" containerName="configure-network-openstack-openstack-cell1" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.075635 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="286eccce-3f88-4796-b896-cd03ccfc3eba" containerName="configure-network-openstack-openstack-cell1" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.075651 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c6c516-5f1b-4ec2-99d3-9248472ce8d2" containerName="collect-profiles" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.075687 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a291ba-8396-4ad0-8a38-7941858594ec" containerName="registry-server" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.076780 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-46x8b" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.080140 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.080562 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.080954 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.081484 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.097919 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-46x8b"] Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.180171 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-ceph\") pod \"validate-network-openstack-openstack-cell1-46x8b\" (UID: \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\") " pod="openstack/validate-network-openstack-openstack-cell1-46x8b" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.180242 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-inventory\") pod \"validate-network-openstack-openstack-cell1-46x8b\" (UID: \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\") " pod="openstack/validate-network-openstack-openstack-cell1-46x8b" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.180475 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsp4v\" (UniqueName: \"kubernetes.io/projected/3344e415-6ae1-4d8e-b27e-73aeb7bba387-kube-api-access-qsp4v\") pod \"validate-network-openstack-openstack-cell1-46x8b\" (UID: \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\") " pod="openstack/validate-network-openstack-openstack-cell1-46x8b" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.180650 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-ssh-key\") pod \"validate-network-openstack-openstack-cell1-46x8b\" (UID: \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\") " pod="openstack/validate-network-openstack-openstack-cell1-46x8b" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.282461 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-ceph\") pod \"validate-network-openstack-openstack-cell1-46x8b\" (UID: \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\") " pod="openstack/validate-network-openstack-openstack-cell1-46x8b" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.282550 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-inventory\") pod \"validate-network-openstack-openstack-cell1-46x8b\" (UID: \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\") " pod="openstack/validate-network-openstack-openstack-cell1-46x8b" Oct 02 
13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.282651 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsp4v\" (UniqueName: \"kubernetes.io/projected/3344e415-6ae1-4d8e-b27e-73aeb7bba387-kube-api-access-qsp4v\") pod \"validate-network-openstack-openstack-cell1-46x8b\" (UID: \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\") " pod="openstack/validate-network-openstack-openstack-cell1-46x8b" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.282701 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-ssh-key\") pod \"validate-network-openstack-openstack-cell1-46x8b\" (UID: \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\") " pod="openstack/validate-network-openstack-openstack-cell1-46x8b" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.287681 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-inventory\") pod \"validate-network-openstack-openstack-cell1-46x8b\" (UID: \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\") " pod="openstack/validate-network-openstack-openstack-cell1-46x8b" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.289074 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-ceph\") pod \"validate-network-openstack-openstack-cell1-46x8b\" (UID: \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\") " pod="openstack/validate-network-openstack-openstack-cell1-46x8b" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.290742 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-ssh-key\") pod \"validate-network-openstack-openstack-cell1-46x8b\" (UID: \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\") " pod="openstack/validate-network-openstack-openstack-cell1-46x8b" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.311282 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsp4v\" (UniqueName: \"kubernetes.io/projected/3344e415-6ae1-4d8e-b27e-73aeb7bba387-kube-api-access-qsp4v\") pod \"validate-network-openstack-openstack-cell1-46x8b\" (UID: \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\") " pod="openstack/validate-network-openstack-openstack-cell1-46x8b" Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.396737 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-46x8b"
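
Each of these job pods follows the same volume pattern before its sandbox starts: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded for each of ceph, inventory, ssh-key, and the kube-api-access projection. A sketch, under the same hypothetical "kubelet.log" assumption, that pairs started/succeeded records so any volume that never reached SetUp stands out:

```python
import re

# Pair "MountVolume started" with "MountVolume.SetUp succeeded" per
# (pod, volume). Volume names are printed with escaped quotes in this
# journal (e.g. \"ceph\"), hence the \\ in the patterns. The input
# path "kubelet.log" is hypothetical.
STARTED = re.compile(r'operationExecutor\.MountVolume started for volume \\"([^\\]+)\\".*?pod="([^"]+)"')
DONE = re.compile(r'MountVolume\.SetUp succeeded for volume \\"([^\\]+)\\".*?pod="([^"]+)"')

started, done = set(), set()
with open("kubelet.log", encoding="utf-8") as fh:
    for line in fh:
        started.update((pod, vol) for vol, pod in STARTED.findall(line))
        done.update((pod, vol) for vol, pod in DONE.findall(line))

for pod, vol in sorted(started - done):
    print(f"never reached SetUp: pod={pod} volume={vol}")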
Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.931043 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-46x8b"] Oct 02 13:00:46 crc kubenswrapper[4766]: I1002 13:00:46.993647 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-46x8b" event={"ID":"3344e415-6ae1-4d8e-b27e-73aeb7bba387","Type":"ContainerStarted","Data":"f77b3225372a8e5b5a66d1105b8c70a5746dd4d49ca5744c236a522578be7852"} Oct 02 13:00:48 crc kubenswrapper[4766]: I1002 13:00:48.008624 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-46x8b" event={"ID":"3344e415-6ae1-4d8e-b27e-73aeb7bba387","Type":"ContainerStarted","Data":"62438533cfb6b20b1e34dfb23ae08dcd625e1edea5c546d6bf5d023906efc6ee"} Oct 02 13:00:48 crc kubenswrapper[4766]: I1002 13:00:48.030484 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-46x8b" podStartSLOduration=1.839833117 podStartE2EDuration="2.030462213s" podCreationTimestamp="2025-10-02 13:00:46 +0000 UTC" firstStartedPulling="2025-10-02 13:00:46.945328224 +0000 UTC m=+7761.888199158" lastFinishedPulling="2025-10-02 13:00:47.13595731 +0000 UTC m=+7762.078828254" observedRunningTime="2025-10-02 13:00:48.024771151 +0000 UTC m=+7762.967642105" watchObservedRunningTime="2025-10-02 13:00:48.030462213 +0000 UTC m=+7762.973333157" Oct 02 13:00:49 crc kubenswrapper[4766]: I1002 13:00:49.881475 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:00:49 crc kubenswrapper[4766]: E1002 13:00:49.882274 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:00:53 crc kubenswrapper[4766]: I1002 13:00:53.080768 4766 generic.go:334] "Generic (PLEG): container finished" podID="3344e415-6ae1-4d8e-b27e-73aeb7bba387" containerID="62438533cfb6b20b1e34dfb23ae08dcd625e1edea5c546d6bf5d023906efc6ee" exitCode=0 Oct 02 13:00:53 crc kubenswrapper[4766]: I1002 13:00:53.080845 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-46x8b" event={"ID":"3344e415-6ae1-4d8e-b27e-73aeb7bba387","Type":"ContainerDied","Data":"62438533cfb6b20b1e34dfb23ae08dcd625e1edea5c546d6bf5d023906efc6ee"} Oct 02 13:00:54 crc kubenswrapper[4766]: I1002 13:00:54.522084 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-46x8b"
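
The pod_startup_latency_tracker record above encodes its own arithmetic: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (13:00:48.030462213 − 13:00:46 = 2.030462213s), and podStartSLOduration subtracts the image-pull interval, which the monotonic m=+… offsets reproduce exactly (7762.078828254 − 7761.888199158 = 0.190629096s). A quick check with the values copied from that record; reading the fields this way is an inference from how the numbers line up, not a statement of the kubelet's contract:

```python
# Reproduce the startup-latency figures from the record above.
# Values are copied verbatim from the log; the field interpretation
# (SLO duration = E2E duration minus image-pull time) is an inference.
e2e = 2.030462213                       # watchObservedRunningTime - podCreationTimestamp
pull = 7762.078828254 - 7761.888199158  # lastFinishedPulling - firstStartedPulling (m= offsets)
slo = e2e - pull

print(f"pull={pull:.9f}s")  # 0.190629096s
print(f"slo={slo:.9f}s")    # 1.839833117s, matching podStartSLOduration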
Oct 02 13:00:54 crc kubenswrapper[4766]: I1002 13:00:54.577439 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsp4v\" (UniqueName: \"kubernetes.io/projected/3344e415-6ae1-4d8e-b27e-73aeb7bba387-kube-api-access-qsp4v\") pod \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\" (UID: \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\") " Oct 02 13:00:54 crc kubenswrapper[4766]: I1002 13:00:54.577545 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-inventory\") pod \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\" (UID: \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\") " Oct 02 13:00:54 crc kubenswrapper[4766]: I1002 13:00:54.577609 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-ssh-key\") pod \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\" (UID: \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\") " Oct 02 13:00:54 crc kubenswrapper[4766]: I1002 13:00:54.577834 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-ceph\") pod \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\" (UID: \"3344e415-6ae1-4d8e-b27e-73aeb7bba387\") " Oct 02 13:00:54 crc kubenswrapper[4766]: I1002 13:00:54.584466 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-ceph" (OuterVolumeSpecName: "ceph") pod "3344e415-6ae1-4d8e-b27e-73aeb7bba387" (UID: "3344e415-6ae1-4d8e-b27e-73aeb7bba387"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:00:54 crc kubenswrapper[4766]: I1002 13:00:54.585106 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3344e415-6ae1-4d8e-b27e-73aeb7bba387-kube-api-access-qsp4v" (OuterVolumeSpecName: "kube-api-access-qsp4v") pod "3344e415-6ae1-4d8e-b27e-73aeb7bba387" (UID: "3344e415-6ae1-4d8e-b27e-73aeb7bba387"). InnerVolumeSpecName "kube-api-access-qsp4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:00:54 crc kubenswrapper[4766]: I1002 13:00:54.620191 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-inventory" (OuterVolumeSpecName: "inventory") pod "3344e415-6ae1-4d8e-b27e-73aeb7bba387" (UID: "3344e415-6ae1-4d8e-b27e-73aeb7bba387"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:00:54 crc kubenswrapper[4766]: I1002 13:00:54.620315 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3344e415-6ae1-4d8e-b27e-73aeb7bba387" (UID: "3344e415-6ae1-4d8e-b27e-73aeb7bba387"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:00:54 crc kubenswrapper[4766]: I1002 13:00:54.680711 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:54 crc kubenswrapper[4766]: I1002 13:00:54.680986 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsp4v\" (UniqueName: \"kubernetes.io/projected/3344e415-6ae1-4d8e-b27e-73aeb7bba387-kube-api-access-qsp4v\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:54 crc kubenswrapper[4766]: I1002 13:00:54.681065 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:54 crc kubenswrapper[4766]: I1002 13:00:54.681139 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3344e415-6ae1-4d8e-b27e-73aeb7bba387-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.104658 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-46x8b" event={"ID":"3344e415-6ae1-4d8e-b27e-73aeb7bba387","Type":"ContainerDied","Data":"f77b3225372a8e5b5a66d1105b8c70a5746dd4d49ca5744c236a522578be7852"} Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.104706 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f77b3225372a8e5b5a66d1105b8c70a5746dd4d49ca5744c236a522578be7852" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.104778 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-46x8b" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.176841 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-56g6g"] Oct 02 13:00:55 crc kubenswrapper[4766]: E1002 13:00:55.177395 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3344e415-6ae1-4d8e-b27e-73aeb7bba387" containerName="validate-network-openstack-openstack-cell1" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.177415 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3344e415-6ae1-4d8e-b27e-73aeb7bba387" containerName="validate-network-openstack-openstack-cell1" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.177682 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3344e415-6ae1-4d8e-b27e-73aeb7bba387" containerName="validate-network-openstack-openstack-cell1" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.178628 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-56g6g" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.183828 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.183940 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.184127 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.184213 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.188987 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-56g6g"] Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.296150 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-inventory\") pod \"install-os-openstack-openstack-cell1-56g6g\" (UID: \"78818c83-db7c-4b55-bef0-a04d906450e7\") " pod="openstack/install-os-openstack-openstack-cell1-56g6g" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.296226 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2l2k\" (UniqueName: \"kubernetes.io/projected/78818c83-db7c-4b55-bef0-a04d906450e7-kube-api-access-j2l2k\") pod \"install-os-openstack-openstack-cell1-56g6g\" (UID: \"78818c83-db7c-4b55-bef0-a04d906450e7\") " pod="openstack/install-os-openstack-openstack-cell1-56g6g" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.296476 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-ssh-key\") pod \"install-os-openstack-openstack-cell1-56g6g\" (UID: \"78818c83-db7c-4b55-bef0-a04d906450e7\") " pod="openstack/install-os-openstack-openstack-cell1-56g6g" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.296566 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-ceph\") pod \"install-os-openstack-openstack-cell1-56g6g\" (UID: \"78818c83-db7c-4b55-bef0-a04d906450e7\") " pod="openstack/install-os-openstack-openstack-cell1-56g6g" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.399119 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-inventory\") pod \"install-os-openstack-openstack-cell1-56g6g\" (UID: \"78818c83-db7c-4b55-bef0-a04d906450e7\") " pod="openstack/install-os-openstack-openstack-cell1-56g6g" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.399191 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2l2k\" (UniqueName: \"kubernetes.io/projected/78818c83-db7c-4b55-bef0-a04d906450e7-kube-api-access-j2l2k\") pod \"install-os-openstack-openstack-cell1-56g6g\" (UID: \"78818c83-db7c-4b55-bef0-a04d906450e7\") " pod="openstack/install-os-openstack-openstack-cell1-56g6g" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 
13:00:55.399366 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-ssh-key\") pod \"install-os-openstack-openstack-cell1-56g6g\" (UID: \"78818c83-db7c-4b55-bef0-a04d906450e7\") " pod="openstack/install-os-openstack-openstack-cell1-56g6g" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.399410 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-ceph\") pod \"install-os-openstack-openstack-cell1-56g6g\" (UID: \"78818c83-db7c-4b55-bef0-a04d906450e7\") " pod="openstack/install-os-openstack-openstack-cell1-56g6g" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.408420 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-inventory\") pod \"install-os-openstack-openstack-cell1-56g6g\" (UID: \"78818c83-db7c-4b55-bef0-a04d906450e7\") " pod="openstack/install-os-openstack-openstack-cell1-56g6g" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.408712 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-ssh-key\") pod \"install-os-openstack-openstack-cell1-56g6g\" (UID: \"78818c83-db7c-4b55-bef0-a04d906450e7\") " pod="openstack/install-os-openstack-openstack-cell1-56g6g" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.408952 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-ceph\") pod \"install-os-openstack-openstack-cell1-56g6g\" (UID: \"78818c83-db7c-4b55-bef0-a04d906450e7\") " pod="openstack/install-os-openstack-openstack-cell1-56g6g" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.420518 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2l2k\" (UniqueName: \"kubernetes.io/projected/78818c83-db7c-4b55-bef0-a04d906450e7-kube-api-access-j2l2k\") pod \"install-os-openstack-openstack-cell1-56g6g\" (UID: \"78818c83-db7c-4b55-bef0-a04d906450e7\") " pod="openstack/install-os-openstack-openstack-cell1-56g6g" Oct 02 13:00:55 crc kubenswrapper[4766]: I1002 13:00:55.506133 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-56g6g" Oct 02 13:00:56 crc kubenswrapper[4766]: I1002 13:00:56.074411 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-56g6g"] Oct 02 13:00:56 crc kubenswrapper[4766]: I1002 13:00:56.123259 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-56g6g" event={"ID":"78818c83-db7c-4b55-bef0-a04d906450e7","Type":"ContainerStarted","Data":"6dbd601a199f3d8375b08b25f24704e29bb95438e5225a3926cb14c43c95197a"} Oct 02 13:00:57 crc kubenswrapper[4766]: I1002 13:00:57.137307 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-56g6g" event={"ID":"78818c83-db7c-4b55-bef0-a04d906450e7","Type":"ContainerStarted","Data":"2c70d4ab44c49872cfea92302c479f93db222f6f0d7d8cede5939e61452d46d4"} Oct 02 13:00:57 crc kubenswrapper[4766]: I1002 13:00:57.163413 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-56g6g" podStartSLOduration=1.995184241 podStartE2EDuration="2.163391389s" podCreationTimestamp="2025-10-02 13:00:55 +0000 UTC" firstStartedPulling="2025-10-02 13:00:56.084232331 +0000 UTC m=+7771.027103275" lastFinishedPulling="2025-10-02 13:00:56.252439469 +0000 UTC m=+7771.195310423" observedRunningTime="2025-10-02 13:00:57.158093778 +0000 UTC m=+7772.100964722" watchObservedRunningTime="2025-10-02 13:00:57.163391389 +0000 UTC m=+7772.106262333" Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.131867 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29323501-8bcqs"] Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.134483 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29323501-8bcqs" Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.151255 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29323501-8bcqs"] Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.209114 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-fernet-keys\") pod \"keystone-cron-29323501-8bcqs\" (UID: \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\") " pod="openstack/keystone-cron-29323501-8bcqs" Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.209256 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqcxw\" (UniqueName: \"kubernetes.io/projected/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-kube-api-access-dqcxw\") pod \"keystone-cron-29323501-8bcqs\" (UID: \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\") " pod="openstack/keystone-cron-29323501-8bcqs" Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.209298 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-config-data\") pod \"keystone-cron-29323501-8bcqs\" (UID: \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\") " pod="openstack/keystone-cron-29323501-8bcqs" Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.209346 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-combined-ca-bundle\") pod \"keystone-cron-29323501-8bcqs\" (UID: \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\") " pod="openstack/keystone-cron-29323501-8bcqs" Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.313280 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqcxw\" (UniqueName: \"kubernetes.io/projected/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-kube-api-access-dqcxw\") pod \"keystone-cron-29323501-8bcqs\" (UID: \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\") " pod="openstack/keystone-cron-29323501-8bcqs" Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.313601 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-config-data\") pod \"keystone-cron-29323501-8bcqs\" (UID: \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\") " pod="openstack/keystone-cron-29323501-8bcqs" Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.313738 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-combined-ca-bundle\") pod \"keystone-cron-29323501-8bcqs\" (UID: \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\") " pod="openstack/keystone-cron-29323501-8bcqs" Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.313842 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-fernet-keys\") pod \"keystone-cron-29323501-8bcqs\" (UID: \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\") " pod="openstack/keystone-cron-29323501-8bcqs" Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.319074 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-fernet-keys\") pod \"keystone-cron-29323501-8bcqs\" (UID: \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\") " pod="openstack/keystone-cron-29323501-8bcqs" Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.319495 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-combined-ca-bundle\") pod \"keystone-cron-29323501-8bcqs\" (UID: \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\") " pod="openstack/keystone-cron-29323501-8bcqs" Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.322002 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-config-data\") pod \"keystone-cron-29323501-8bcqs\" (UID: \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\") " pod="openstack/keystone-cron-29323501-8bcqs" Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.337777 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqcxw\" (UniqueName: \"kubernetes.io/projected/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-kube-api-access-dqcxw\") pod \"keystone-cron-29323501-8bcqs\" (UID: \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\") " pod="openstack/keystone-cron-29323501-8bcqs" Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.462925 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29323501-8bcqs" Oct 02 13:01:00 crc kubenswrapper[4766]: I1002 13:01:00.977999 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29323501-8bcqs"] Oct 02 13:01:00 crc kubenswrapper[4766]: W1002 13:01:00.988346 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f7942f7_8765_4c2f_8fbb_2f66d8170a56.slice/crio-6061d7c42982cdc96429b7dfb9e760de08a98c46170d892b694f3ccf510bbcef WatchSource:0}: Error finding container 6061d7c42982cdc96429b7dfb9e760de08a98c46170d892b694f3ccf510bbcef: Status 404 returned error can't find the container with id 6061d7c42982cdc96429b7dfb9e760de08a98c46170d892b694f3ccf510bbcef Oct 02 13:01:01 crc kubenswrapper[4766]: I1002 13:01:01.188842 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323501-8bcqs" event={"ID":"3f7942f7-8765-4c2f-8fbb-2f66d8170a56","Type":"ContainerStarted","Data":"6061d7c42982cdc96429b7dfb9e760de08a98c46170d892b694f3ccf510bbcef"} Oct 02 13:01:01 crc kubenswrapper[4766]: I1002 13:01:01.216577 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29323501-8bcqs" podStartSLOduration=1.216557479 podStartE2EDuration="1.216557479s" podCreationTimestamp="2025-10-02 13:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:01:01.205196495 +0000 UTC m=+7776.148067429" watchObservedRunningTime="2025-10-02 13:01:01.216557479 +0000 UTC m=+7776.159428423" Oct 02 13:01:01 crc kubenswrapper[4766]: I1002 13:01:01.882562 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:01:01 crc kubenswrapper[4766]: E1002 13:01:01.883553 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:01:02 crc kubenswrapper[4766]: I1002 13:01:02.198947 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323501-8bcqs" event={"ID":"3f7942f7-8765-4c2f-8fbb-2f66d8170a56","Type":"ContainerStarted","Data":"b0a32a7fafb98b82fadee2cd758326ebcf5636bd007a8081a03e25c9d47758e6"} Oct 02 13:01:05 crc kubenswrapper[4766]: I1002 13:01:05.238410 4766 generic.go:334] "Generic (PLEG): container finished" podID="3f7942f7-8765-4c2f-8fbb-2f66d8170a56" containerID="b0a32a7fafb98b82fadee2cd758326ebcf5636bd007a8081a03e25c9d47758e6" exitCode=0 Oct 02 13:01:05 crc kubenswrapper[4766]: I1002 13:01:05.238496 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323501-8bcqs" event={"ID":"3f7942f7-8765-4c2f-8fbb-2f66d8170a56","Type":"ContainerDied","Data":"b0a32a7fafb98b82fadee2cd758326ebcf5636bd007a8081a03e25c9d47758e6"} Oct 02 13:01:06 crc kubenswrapper[4766]: I1002 13:01:06.676938 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29323501-8bcqs" Oct 02 13:01:06 crc kubenswrapper[4766]: I1002 13:01:06.791618 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-fernet-keys\") pod \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\" (UID: \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\") " Oct 02 13:01:06 crc kubenswrapper[4766]: I1002 13:01:06.791680 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqcxw\" (UniqueName: \"kubernetes.io/projected/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-kube-api-access-dqcxw\") pod \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\" (UID: \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\") " Oct 02 13:01:06 crc kubenswrapper[4766]: I1002 13:01:06.791741 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-combined-ca-bundle\") pod \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\" (UID: \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\") " Oct 02 13:01:06 crc kubenswrapper[4766]: I1002 13:01:06.791844 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-config-data\") pod \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\" (UID: \"3f7942f7-8765-4c2f-8fbb-2f66d8170a56\") " Oct 02 13:01:06 crc kubenswrapper[4766]: I1002 13:01:06.797003 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-kube-api-access-dqcxw" (OuterVolumeSpecName: "kube-api-access-dqcxw") pod "3f7942f7-8765-4c2f-8fbb-2f66d8170a56" (UID: "3f7942f7-8765-4c2f-8fbb-2f66d8170a56"). InnerVolumeSpecName "kube-api-access-dqcxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:01:06 crc kubenswrapper[4766]: I1002 13:01:06.799231 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3f7942f7-8765-4c2f-8fbb-2f66d8170a56" (UID: "3f7942f7-8765-4c2f-8fbb-2f66d8170a56"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:01:06 crc kubenswrapper[4766]: I1002 13:01:06.844939 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3f7942f7-8765-4c2f-8fbb-2f66d8170a56" (UID: "3f7942f7-8765-4c2f-8fbb-2f66d8170a56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:01:06 crc kubenswrapper[4766]: I1002 13:01:06.870242 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-config-data" (OuterVolumeSpecName: "config-data") pod "3f7942f7-8765-4c2f-8fbb-2f66d8170a56" (UID: "3f7942f7-8765-4c2f-8fbb-2f66d8170a56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:01:06 crc kubenswrapper[4766]: I1002 13:01:06.895617 4766 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:06 crc kubenswrapper[4766]: I1002 13:01:06.895940 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqcxw\" (UniqueName: \"kubernetes.io/projected/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-kube-api-access-dqcxw\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:06 crc kubenswrapper[4766]: I1002 13:01:06.896110 4766 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:06 crc kubenswrapper[4766]: I1002 13:01:06.896244 4766 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f7942f7-8765-4c2f-8fbb-2f66d8170a56-config-data\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:07 crc kubenswrapper[4766]: I1002 13:01:07.268339 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29323501-8bcqs" event={"ID":"3f7942f7-8765-4c2f-8fbb-2f66d8170a56","Type":"ContainerDied","Data":"6061d7c42982cdc96429b7dfb9e760de08a98c46170d892b694f3ccf510bbcef"} Oct 02 13:01:07 crc kubenswrapper[4766]: I1002 13:01:07.268688 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6061d7c42982cdc96429b7dfb9e760de08a98c46170d892b694f3ccf510bbcef" Oct 02 13:01:07 crc kubenswrapper[4766]: I1002 13:01:07.268377 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29323501-8bcqs" Oct 02 13:01:14 crc kubenswrapper[4766]: I1002 13:01:14.881944 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:01:14 crc kubenswrapper[4766]: E1002 13:01:14.882948 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:01:28 crc kubenswrapper[4766]: I1002 13:01:28.881857 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:01:28 crc kubenswrapper[4766]: E1002 13:01:28.882739 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:01:40 crc kubenswrapper[4766]: I1002 13:01:40.633998 4766 generic.go:334] "Generic (PLEG): container finished" podID="78818c83-db7c-4b55-bef0-a04d906450e7" containerID="2c70d4ab44c49872cfea92302c479f93db222f6f0d7d8cede5939e61452d46d4" exitCode=0 Oct 02 13:01:40 crc kubenswrapper[4766]: I1002 13:01:40.634077 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-56g6g" event={"ID":"78818c83-db7c-4b55-bef0-a04d906450e7","Type":"ContainerDied","Data":"2c70d4ab44c49872cfea92302c479f93db222f6f0d7d8cede5939e61452d46d4"} Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.205085 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-56g6g" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.316113 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-ceph\") pod \"78818c83-db7c-4b55-bef0-a04d906450e7\" (UID: \"78818c83-db7c-4b55-bef0-a04d906450e7\") " Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.316186 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-ssh-key\") pod \"78818c83-db7c-4b55-bef0-a04d906450e7\" (UID: \"78818c83-db7c-4b55-bef0-a04d906450e7\") " Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.317544 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2l2k\" (UniqueName: \"kubernetes.io/projected/78818c83-db7c-4b55-bef0-a04d906450e7-kube-api-access-j2l2k\") pod \"78818c83-db7c-4b55-bef0-a04d906450e7\" (UID: \"78818c83-db7c-4b55-bef0-a04d906450e7\") " Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.317684 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-inventory\") pod \"78818c83-db7c-4b55-bef0-a04d906450e7\" (UID: \"78818c83-db7c-4b55-bef0-a04d906450e7\") " Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.322014 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78818c83-db7c-4b55-bef0-a04d906450e7-kube-api-access-j2l2k" (OuterVolumeSpecName: "kube-api-access-j2l2k") pod "78818c83-db7c-4b55-bef0-a04d906450e7" (UID: "78818c83-db7c-4b55-bef0-a04d906450e7"). InnerVolumeSpecName "kube-api-access-j2l2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.322211 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-ceph" (OuterVolumeSpecName: "ceph") pod "78818c83-db7c-4b55-bef0-a04d906450e7" (UID: "78818c83-db7c-4b55-bef0-a04d906450e7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.399128 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-inventory" (OuterVolumeSpecName: "inventory") pod "78818c83-db7c-4b55-bef0-a04d906450e7" (UID: "78818c83-db7c-4b55-bef0-a04d906450e7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.403652 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "78818c83-db7c-4b55-bef0-a04d906450e7" (UID: "78818c83-db7c-4b55-bef0-a04d906450e7"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.423745 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.423784 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.423972 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2l2k\" (UniqueName: \"kubernetes.io/projected/78818c83-db7c-4b55-bef0-a04d906450e7-kube-api-access-j2l2k\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.423987 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78818c83-db7c-4b55-bef0-a04d906450e7-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.655304 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-56g6g" event={"ID":"78818c83-db7c-4b55-bef0-a04d906450e7","Type":"ContainerDied","Data":"6dbd601a199f3d8375b08b25f24704e29bb95438e5225a3926cb14c43c95197a"} Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.655344 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dbd601a199f3d8375b08b25f24704e29bb95438e5225a3926cb14c43c95197a" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.655402 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-56g6g" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.760612 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-v5xkb"] Oct 02 13:01:42 crc kubenswrapper[4766]: E1002 13:01:42.761151 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f7942f7-8765-4c2f-8fbb-2f66d8170a56" containerName="keystone-cron" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.761174 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f7942f7-8765-4c2f-8fbb-2f66d8170a56" containerName="keystone-cron" Oct 02 13:01:42 crc kubenswrapper[4766]: E1002 13:01:42.761222 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78818c83-db7c-4b55-bef0-a04d906450e7" containerName="install-os-openstack-openstack-cell1" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.761232 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="78818c83-db7c-4b55-bef0-a04d906450e7" containerName="install-os-openstack-openstack-cell1" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.764461 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f7942f7-8765-4c2f-8fbb-2f66d8170a56" containerName="keystone-cron" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.764501 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="78818c83-db7c-4b55-bef0-a04d906450e7" containerName="install-os-openstack-openstack-cell1" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.765775 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.768412 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.768756 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.768884 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.769440 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.788853 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-v5xkb"] Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.834101 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-ceph\") pod \"configure-os-openstack-openstack-cell1-v5xkb\" (UID: \"72b4bb1d-0502-443a-814d-26667e9885f8\") " pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.834209 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-inventory\") pod \"configure-os-openstack-openstack-cell1-v5xkb\" (UID: \"72b4bb1d-0502-443a-814d-26667e9885f8\") " pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.834245 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-ssh-key\") pod \"configure-os-openstack-openstack-cell1-v5xkb\" (UID: \"72b4bb1d-0502-443a-814d-26667e9885f8\") " pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.834301 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkz6q\" (UniqueName: \"kubernetes.io/projected/72b4bb1d-0502-443a-814d-26667e9885f8-kube-api-access-bkz6q\") pod \"configure-os-openstack-openstack-cell1-v5xkb\" (UID: \"72b4bb1d-0502-443a-814d-26667e9885f8\") " pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.881421 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:01:42 crc kubenswrapper[4766]: E1002 13:01:42.881737 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.936971 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-ceph\") pod \"configure-os-openstack-openstack-cell1-v5xkb\" (UID: \"72b4bb1d-0502-443a-814d-26667e9885f8\") " pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.937072 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-inventory\") pod \"configure-os-openstack-openstack-cell1-v5xkb\" (UID: \"72b4bb1d-0502-443a-814d-26667e9885f8\") " pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.937114 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-ssh-key\") pod \"configure-os-openstack-openstack-cell1-v5xkb\" (UID: \"72b4bb1d-0502-443a-814d-26667e9885f8\") " pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.937181 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkz6q\" (UniqueName: \"kubernetes.io/projected/72b4bb1d-0502-443a-814d-26667e9885f8-kube-api-access-bkz6q\") pod \"configure-os-openstack-openstack-cell1-v5xkb\" (UID: \"72b4bb1d-0502-443a-814d-26667e9885f8\") " pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.942399 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-ssh-key\") pod \"configure-os-openstack-openstack-cell1-v5xkb\" (UID: \"72b4bb1d-0502-443a-814d-26667e9885f8\") " pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.942417 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-inventory\") pod \"configure-os-openstack-openstack-cell1-v5xkb\" (UID: \"72b4bb1d-0502-443a-814d-26667e9885f8\") " pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.943369 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-ceph\") pod \"configure-os-openstack-openstack-cell1-v5xkb\" (UID: \"72b4bb1d-0502-443a-814d-26667e9885f8\") " pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" Oct 02 13:01:42 crc kubenswrapper[4766]: I1002 13:01:42.957199 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkz6q\" (UniqueName: \"kubernetes.io/projected/72b4bb1d-0502-443a-814d-26667e9885f8-kube-api-access-bkz6q\") pod \"configure-os-openstack-openstack-cell1-v5xkb\" (UID: \"72b4bb1d-0502-443a-814d-26667e9885f8\") " pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" Oct 02 13:01:43 crc kubenswrapper[4766]: I1002 13:01:43.098637 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" Oct 02 13:01:43 crc kubenswrapper[4766]: I1002 13:01:43.635382 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-v5xkb"] Oct 02 13:01:43 crc kubenswrapper[4766]: I1002 13:01:43.667606 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" event={"ID":"72b4bb1d-0502-443a-814d-26667e9885f8","Type":"ContainerStarted","Data":"4dfc510427760a333c792d20edc4299dc1fc6ba22eb956142fe03d3f73b55f0b"} Oct 02 13:01:44 crc kubenswrapper[4766]: I1002 13:01:44.682767 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" event={"ID":"72b4bb1d-0502-443a-814d-26667e9885f8","Type":"ContainerStarted","Data":"2690a3903ccb41ca02f33e40ffd4af85b208145d975ffb892a7c714c374fd6fd"} Oct 02 13:01:44 crc kubenswrapper[4766]: I1002 13:01:44.717607 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" podStartSLOduration=2.5229160349999997 podStartE2EDuration="2.717585231s" podCreationTimestamp="2025-10-02 13:01:42 +0000 UTC" firstStartedPulling="2025-10-02 13:01:43.642402431 +0000 UTC m=+7818.585273375" lastFinishedPulling="2025-10-02 13:01:43.837071627 +0000 UTC m=+7818.779942571" observedRunningTime="2025-10-02 13:01:44.70381058 +0000 UTC m=+7819.646681554" watchObservedRunningTime="2025-10-02 13:01:44.717585231 +0000 UTC m=+7819.660456185" Oct 02 13:01:57 crc kubenswrapper[4766]: I1002 13:01:57.882669 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:01:57 crc kubenswrapper[4766]: E1002 13:01:57.883925 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:02:12 crc kubenswrapper[4766]: I1002 13:02:12.882304 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:02:12 crc kubenswrapper[4766]: E1002 13:02:12.883333 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:02:27 crc kubenswrapper[4766]: I1002 13:02:27.882597 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:02:27 crc kubenswrapper[4766]: E1002 13:02:27.883923 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" 
podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:02:29 crc kubenswrapper[4766]: I1002 13:02:29.186721 4766 generic.go:334] "Generic (PLEG): container finished" podID="72b4bb1d-0502-443a-814d-26667e9885f8" containerID="2690a3903ccb41ca02f33e40ffd4af85b208145d975ffb892a7c714c374fd6fd" exitCode=0 Oct 02 13:02:29 crc kubenswrapper[4766]: I1002 13:02:29.186808 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" event={"ID":"72b4bb1d-0502-443a-814d-26667e9885f8","Type":"ContainerDied","Data":"2690a3903ccb41ca02f33e40ffd4af85b208145d975ffb892a7c714c374fd6fd"} Oct 02 13:02:30 crc kubenswrapper[4766]: I1002 13:02:30.664023 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" Oct 02 13:02:30 crc kubenswrapper[4766]: I1002 13:02:30.852709 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkz6q\" (UniqueName: \"kubernetes.io/projected/72b4bb1d-0502-443a-814d-26667e9885f8-kube-api-access-bkz6q\") pod \"72b4bb1d-0502-443a-814d-26667e9885f8\" (UID: \"72b4bb1d-0502-443a-814d-26667e9885f8\") " Oct 02 13:02:30 crc kubenswrapper[4766]: I1002 13:02:30.852986 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-ssh-key\") pod \"72b4bb1d-0502-443a-814d-26667e9885f8\" (UID: \"72b4bb1d-0502-443a-814d-26667e9885f8\") " Oct 02 13:02:30 crc kubenswrapper[4766]: I1002 13:02:30.853030 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-ceph\") pod \"72b4bb1d-0502-443a-814d-26667e9885f8\" (UID: \"72b4bb1d-0502-443a-814d-26667e9885f8\") " Oct 02 13:02:30 crc kubenswrapper[4766]: I1002 13:02:30.853198 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-inventory\") pod \"72b4bb1d-0502-443a-814d-26667e9885f8\" (UID: \"72b4bb1d-0502-443a-814d-26667e9885f8\") " Oct 02 13:02:30 crc kubenswrapper[4766]: I1002 13:02:30.858124 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-ceph" (OuterVolumeSpecName: "ceph") pod "72b4bb1d-0502-443a-814d-26667e9885f8" (UID: "72b4bb1d-0502-443a-814d-26667e9885f8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:02:30 crc kubenswrapper[4766]: I1002 13:02:30.859516 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b4bb1d-0502-443a-814d-26667e9885f8-kube-api-access-bkz6q" (OuterVolumeSpecName: "kube-api-access-bkz6q") pod "72b4bb1d-0502-443a-814d-26667e9885f8" (UID: "72b4bb1d-0502-443a-814d-26667e9885f8"). InnerVolumeSpecName "kube-api-access-bkz6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:02:30 crc kubenswrapper[4766]: I1002 13:02:30.885494 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "72b4bb1d-0502-443a-814d-26667e9885f8" (UID: "72b4bb1d-0502-443a-814d-26667e9885f8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:02:30 crc kubenswrapper[4766]: I1002 13:02:30.886789 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-inventory" (OuterVolumeSpecName: "inventory") pod "72b4bb1d-0502-443a-814d-26667e9885f8" (UID: "72b4bb1d-0502-443a-814d-26667e9885f8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:02:30 crc kubenswrapper[4766]: I1002 13:02:30.955907 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:30 crc kubenswrapper[4766]: I1002 13:02:30.955945 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:30 crc kubenswrapper[4766]: I1002 13:02:30.955959 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72b4bb1d-0502-443a-814d-26667e9885f8-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:30 crc kubenswrapper[4766]: I1002 13:02:30.955971 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkz6q\" (UniqueName: \"kubernetes.io/projected/72b4bb1d-0502-443a-814d-26667e9885f8-kube-api-access-bkz6q\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.206876 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" event={"ID":"72b4bb1d-0502-443a-814d-26667e9885f8","Type":"ContainerDied","Data":"4dfc510427760a333c792d20edc4299dc1fc6ba22eb956142fe03d3f73b55f0b"} Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.206940 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dfc510427760a333c792d20edc4299dc1fc6ba22eb956142fe03d3f73b55f0b" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.207018 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-v5xkb" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.295488 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-5hwdj"] Oct 02 13:02:31 crc kubenswrapper[4766]: E1002 13:02:31.296254 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b4bb1d-0502-443a-814d-26667e9885f8" containerName="configure-os-openstack-openstack-cell1" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.296287 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b4bb1d-0502-443a-814d-26667e9885f8" containerName="configure-os-openstack-openstack-cell1" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.296697 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b4bb1d-0502-443a-814d-26667e9885f8" containerName="configure-os-openstack-openstack-cell1" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.298159 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-5hwdj" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.302480 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.302834 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.303166 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.303419 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.312944 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-5hwdj"] Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.466967 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-5hwdj\" (UID: \"8fbae716-c2f8-43f5-9129-632f37db1f4e\") " pod="openstack/ssh-known-hosts-openstack-5hwdj" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.467543 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-inventory-0\") pod \"ssh-known-hosts-openstack-5hwdj\" (UID: \"8fbae716-c2f8-43f5-9129-632f37db1f4e\") " pod="openstack/ssh-known-hosts-openstack-5hwdj" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.467595 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b56mx\" (UniqueName: \"kubernetes.io/projected/8fbae716-c2f8-43f5-9129-632f37db1f4e-kube-api-access-b56mx\") pod \"ssh-known-hosts-openstack-5hwdj\" (UID: \"8fbae716-c2f8-43f5-9129-632f37db1f4e\") " pod="openstack/ssh-known-hosts-openstack-5hwdj" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.467618 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-ceph\") pod \"ssh-known-hosts-openstack-5hwdj\" (UID: \"8fbae716-c2f8-43f5-9129-632f37db1f4e\") " pod="openstack/ssh-known-hosts-openstack-5hwdj" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.569292 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-inventory-0\") pod \"ssh-known-hosts-openstack-5hwdj\" (UID: \"8fbae716-c2f8-43f5-9129-632f37db1f4e\") " pod="openstack/ssh-known-hosts-openstack-5hwdj" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.569357 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b56mx\" (UniqueName: \"kubernetes.io/projected/8fbae716-c2f8-43f5-9129-632f37db1f4e-kube-api-access-b56mx\") pod \"ssh-known-hosts-openstack-5hwdj\" (UID: \"8fbae716-c2f8-43f5-9129-632f37db1f4e\") " pod="openstack/ssh-known-hosts-openstack-5hwdj" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.569380 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-ceph\") pod \"ssh-known-hosts-openstack-5hwdj\" (UID: \"8fbae716-c2f8-43f5-9129-632f37db1f4e\") " pod="openstack/ssh-known-hosts-openstack-5hwdj" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.569452 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-5hwdj\" (UID: \"8fbae716-c2f8-43f5-9129-632f37db1f4e\") " pod="openstack/ssh-known-hosts-openstack-5hwdj" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.575600 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-5hwdj\" (UID: \"8fbae716-c2f8-43f5-9129-632f37db1f4e\") " pod="openstack/ssh-known-hosts-openstack-5hwdj" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.576698 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-inventory-0\") pod \"ssh-known-hosts-openstack-5hwdj\" (UID: \"8fbae716-c2f8-43f5-9129-632f37db1f4e\") " pod="openstack/ssh-known-hosts-openstack-5hwdj" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.579807 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-ceph\") pod \"ssh-known-hosts-openstack-5hwdj\" (UID: \"8fbae716-c2f8-43f5-9129-632f37db1f4e\") " pod="openstack/ssh-known-hosts-openstack-5hwdj" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.596803 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b56mx\" (UniqueName: \"kubernetes.io/projected/8fbae716-c2f8-43f5-9129-632f37db1f4e-kube-api-access-b56mx\") pod \"ssh-known-hosts-openstack-5hwdj\" (UID: \"8fbae716-c2f8-43f5-9129-632f37db1f4e\") " pod="openstack/ssh-known-hosts-openstack-5hwdj" Oct 02 13:02:31 crc kubenswrapper[4766]: I1002 13:02:31.620063 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-5hwdj" Oct 02 13:02:32 crc kubenswrapper[4766]: I1002 13:02:32.918320 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-5hwdj"] Oct 02 13:02:33 crc kubenswrapper[4766]: I1002 13:02:33.230170 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-5hwdj" event={"ID":"8fbae716-c2f8-43f5-9129-632f37db1f4e","Type":"ContainerStarted","Data":"09653ca31565e63e8af5e50686d9a0530048b73295d709f190df5d4a9625be8a"} Oct 02 13:02:34 crc kubenswrapper[4766]: I1002 13:02:34.247303 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-5hwdj" event={"ID":"8fbae716-c2f8-43f5-9129-632f37db1f4e","Type":"ContainerStarted","Data":"991821eed737794eb8277dd05e606517d4c38b75331d44b104f01465554e0003"} Oct 02 13:02:34 crc kubenswrapper[4766]: I1002 13:02:34.280812 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-5hwdj" podStartSLOduration=3.063345281 podStartE2EDuration="3.280785775s" podCreationTimestamp="2025-10-02 13:02:31 +0000 UTC" firstStartedPulling="2025-10-02 13:02:32.930247876 +0000 UTC m=+7867.873118820" lastFinishedPulling="2025-10-02 13:02:33.14768837 +0000 UTC m=+7868.090559314" observedRunningTime="2025-10-02 13:02:34.265102153 +0000 UTC m=+7869.207973167" watchObservedRunningTime="2025-10-02 13:02:34.280785775 +0000 UTC m=+7869.223656749" Oct 02 13:02:38 crc kubenswrapper[4766]: I1002 13:02:38.881549 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:02:38 crc kubenswrapper[4766]: E1002 13:02:38.882580 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:02:42 crc kubenswrapper[4766]: I1002 13:02:42.360254 4766 generic.go:334] "Generic (PLEG): container finished" podID="8fbae716-c2f8-43f5-9129-632f37db1f4e" containerID="991821eed737794eb8277dd05e606517d4c38b75331d44b104f01465554e0003" exitCode=0 Oct 02 13:02:42 crc kubenswrapper[4766]: I1002 13:02:42.361539 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-5hwdj" event={"ID":"8fbae716-c2f8-43f5-9129-632f37db1f4e","Type":"ContainerDied","Data":"991821eed737794eb8277dd05e606517d4c38b75331d44b104f01465554e0003"} Oct 02 13:02:43 crc kubenswrapper[4766]: I1002 13:02:43.817777 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-5hwdj" Oct 02 13:02:43 crc kubenswrapper[4766]: I1002 13:02:43.978418 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b56mx\" (UniqueName: \"kubernetes.io/projected/8fbae716-c2f8-43f5-9129-632f37db1f4e-kube-api-access-b56mx\") pod \"8fbae716-c2f8-43f5-9129-632f37db1f4e\" (UID: \"8fbae716-c2f8-43f5-9129-632f37db1f4e\") " Oct 02 13:02:43 crc kubenswrapper[4766]: I1002 13:02:43.978761 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-inventory-0\") pod \"8fbae716-c2f8-43f5-9129-632f37db1f4e\" (UID: \"8fbae716-c2f8-43f5-9129-632f37db1f4e\") " Oct 02 13:02:43 crc kubenswrapper[4766]: I1002 13:02:43.978849 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-ssh-key-openstack-cell1\") pod \"8fbae716-c2f8-43f5-9129-632f37db1f4e\" (UID: \"8fbae716-c2f8-43f5-9129-632f37db1f4e\") " Oct 02 13:02:43 crc kubenswrapper[4766]: I1002 13:02:43.978886 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-ceph\") pod \"8fbae716-c2f8-43f5-9129-632f37db1f4e\" (UID: \"8fbae716-c2f8-43f5-9129-632f37db1f4e\") " Oct 02 13:02:43 crc kubenswrapper[4766]: I1002 13:02:43.987954 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-ceph" (OuterVolumeSpecName: "ceph") pod "8fbae716-c2f8-43f5-9129-632f37db1f4e" (UID: "8fbae716-c2f8-43f5-9129-632f37db1f4e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:02:43 crc kubenswrapper[4766]: I1002 13:02:43.988021 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fbae716-c2f8-43f5-9129-632f37db1f4e-kube-api-access-b56mx" (OuterVolumeSpecName: "kube-api-access-b56mx") pod "8fbae716-c2f8-43f5-9129-632f37db1f4e" (UID: "8fbae716-c2f8-43f5-9129-632f37db1f4e"). InnerVolumeSpecName "kube-api-access-b56mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.005533 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "8fbae716-c2f8-43f5-9129-632f37db1f4e" (UID: "8fbae716-c2f8-43f5-9129-632f37db1f4e"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.012189 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "8fbae716-c2f8-43f5-9129-632f37db1f4e" (UID: "8fbae716-c2f8-43f5-9129-632f37db1f4e"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.081616 4766 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.081660 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.081675 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8fbae716-c2f8-43f5-9129-632f37db1f4e-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.081690 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b56mx\" (UniqueName: \"kubernetes.io/projected/8fbae716-c2f8-43f5-9129-632f37db1f4e-kube-api-access-b56mx\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.393874 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-5hwdj" event={"ID":"8fbae716-c2f8-43f5-9129-632f37db1f4e","Type":"ContainerDied","Data":"09653ca31565e63e8af5e50686d9a0530048b73295d709f190df5d4a9625be8a"} Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.394231 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09653ca31565e63e8af5e50686d9a0530048b73295d709f190df5d4a9625be8a" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.394075 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-5hwdj" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.483896 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-wxt5h"] Oct 02 13:02:44 crc kubenswrapper[4766]: E1002 13:02:44.484439 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fbae716-c2f8-43f5-9129-632f37db1f4e" containerName="ssh-known-hosts-openstack" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.484469 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fbae716-c2f8-43f5-9129-632f37db1f4e" containerName="ssh-known-hosts-openstack" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.484773 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fbae716-c2f8-43f5-9129-632f37db1f4e" containerName="ssh-known-hosts-openstack" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.485573 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-wxt5h" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.488354 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.488811 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.488943 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.489258 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.499973 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-wxt5h"] Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.592988 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpqw9\" (UniqueName: \"kubernetes.io/projected/598a570b-1120-474e-a6a1-e46a82ff8272-kube-api-access-lpqw9\") pod \"run-os-openstack-openstack-cell1-wxt5h\" (UID: \"598a570b-1120-474e-a6a1-e46a82ff8272\") " pod="openstack/run-os-openstack-openstack-cell1-wxt5h" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.593127 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-inventory\") pod \"run-os-openstack-openstack-cell1-wxt5h\" (UID: \"598a570b-1120-474e-a6a1-e46a82ff8272\") " pod="openstack/run-os-openstack-openstack-cell1-wxt5h" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.593242 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-ceph\") pod \"run-os-openstack-openstack-cell1-wxt5h\" (UID: \"598a570b-1120-474e-a6a1-e46a82ff8272\") " pod="openstack/run-os-openstack-openstack-cell1-wxt5h" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.593411 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-ssh-key\") pod \"run-os-openstack-openstack-cell1-wxt5h\" (UID: \"598a570b-1120-474e-a6a1-e46a82ff8272\") " pod="openstack/run-os-openstack-openstack-cell1-wxt5h" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.694802 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-ssh-key\") pod \"run-os-openstack-openstack-cell1-wxt5h\" (UID: \"598a570b-1120-474e-a6a1-e46a82ff8272\") " pod="openstack/run-os-openstack-openstack-cell1-wxt5h" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.694907 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpqw9\" (UniqueName: \"kubernetes.io/projected/598a570b-1120-474e-a6a1-e46a82ff8272-kube-api-access-lpqw9\") pod \"run-os-openstack-openstack-cell1-wxt5h\" (UID: \"598a570b-1120-474e-a6a1-e46a82ff8272\") " pod="openstack/run-os-openstack-openstack-cell1-wxt5h" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.694963 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-inventory\") pod \"run-os-openstack-openstack-cell1-wxt5h\" (UID: \"598a570b-1120-474e-a6a1-e46a82ff8272\") " pod="openstack/run-os-openstack-openstack-cell1-wxt5h" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.695012 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-ceph\") pod \"run-os-openstack-openstack-cell1-wxt5h\" (UID: \"598a570b-1120-474e-a6a1-e46a82ff8272\") " pod="openstack/run-os-openstack-openstack-cell1-wxt5h" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.699184 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-ceph\") pod \"run-os-openstack-openstack-cell1-wxt5h\" (UID: \"598a570b-1120-474e-a6a1-e46a82ff8272\") " pod="openstack/run-os-openstack-openstack-cell1-wxt5h" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.705990 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-ssh-key\") pod \"run-os-openstack-openstack-cell1-wxt5h\" (UID: \"598a570b-1120-474e-a6a1-e46a82ff8272\") " pod="openstack/run-os-openstack-openstack-cell1-wxt5h" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.712731 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-inventory\") pod \"run-os-openstack-openstack-cell1-wxt5h\" (UID: \"598a570b-1120-474e-a6a1-e46a82ff8272\") " pod="openstack/run-os-openstack-openstack-cell1-wxt5h" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.716765 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpqw9\" (UniqueName: \"kubernetes.io/projected/598a570b-1120-474e-a6a1-e46a82ff8272-kube-api-access-lpqw9\") pod \"run-os-openstack-openstack-cell1-wxt5h\" (UID: \"598a570b-1120-474e-a6a1-e46a82ff8272\") " pod="openstack/run-os-openstack-openstack-cell1-wxt5h" Oct 02 13:02:44 crc kubenswrapper[4766]: I1002 13:02:44.825299 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-wxt5h" Oct 02 13:02:45 crc kubenswrapper[4766]: I1002 13:02:45.409830 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-wxt5h"] Oct 02 13:02:46 crc kubenswrapper[4766]: I1002 13:02:46.420992 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-wxt5h" event={"ID":"598a570b-1120-474e-a6a1-e46a82ff8272","Type":"ContainerStarted","Data":"edc758b4ab324d9813fe45d8853a158c557e6315c3811c5c5d68bfe2566b634b"} Oct 02 13:02:46 crc kubenswrapper[4766]: I1002 13:02:46.421564 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-wxt5h" event={"ID":"598a570b-1120-474e-a6a1-e46a82ff8272","Type":"ContainerStarted","Data":"ccd72cd490d989980bf58aed81f7a6eb41070b647cd1cf3a5f1db76e11787ff8"} Oct 02 13:02:46 crc kubenswrapper[4766]: I1002 13:02:46.448874 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-wxt5h" podStartSLOduration=2.26301369 podStartE2EDuration="2.448853103s" podCreationTimestamp="2025-10-02 13:02:44 +0000 UTC" firstStartedPulling="2025-10-02 13:02:45.424301304 +0000 UTC m=+7880.367172238" lastFinishedPulling="2025-10-02 13:02:45.610140707 +0000 UTC m=+7880.553011651" observedRunningTime="2025-10-02 13:02:46.44689984 +0000 UTC m=+7881.389770834" watchObservedRunningTime="2025-10-02 13:02:46.448853103 +0000 UTC m=+7881.391724047" Oct 02 13:02:49 crc kubenswrapper[4766]: I1002 13:02:49.882635 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:02:49 crc kubenswrapper[4766]: E1002 13:02:49.883805 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:02:54 crc kubenswrapper[4766]: I1002 13:02:54.515959 4766 generic.go:334] "Generic (PLEG): container finished" podID="598a570b-1120-474e-a6a1-e46a82ff8272" containerID="edc758b4ab324d9813fe45d8853a158c557e6315c3811c5c5d68bfe2566b634b" exitCode=0 Oct 02 13:02:54 crc kubenswrapper[4766]: I1002 13:02:54.516030 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-wxt5h" event={"ID":"598a570b-1120-474e-a6a1-e46a82ff8272","Type":"ContainerDied","Data":"edc758b4ab324d9813fe45d8853a158c557e6315c3811c5c5d68bfe2566b634b"} Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.052805 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-wxt5h" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.159765 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpqw9\" (UniqueName: \"kubernetes.io/projected/598a570b-1120-474e-a6a1-e46a82ff8272-kube-api-access-lpqw9\") pod \"598a570b-1120-474e-a6a1-e46a82ff8272\" (UID: \"598a570b-1120-474e-a6a1-e46a82ff8272\") " Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.159882 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-ssh-key\") pod \"598a570b-1120-474e-a6a1-e46a82ff8272\" (UID: \"598a570b-1120-474e-a6a1-e46a82ff8272\") " Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.160070 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-ceph\") pod \"598a570b-1120-474e-a6a1-e46a82ff8272\" (UID: \"598a570b-1120-474e-a6a1-e46a82ff8272\") " Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.160133 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-inventory\") pod \"598a570b-1120-474e-a6a1-e46a82ff8272\" (UID: \"598a570b-1120-474e-a6a1-e46a82ff8272\") " Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.166128 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-ceph" (OuterVolumeSpecName: "ceph") pod "598a570b-1120-474e-a6a1-e46a82ff8272" (UID: "598a570b-1120-474e-a6a1-e46a82ff8272"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.166469 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/598a570b-1120-474e-a6a1-e46a82ff8272-kube-api-access-lpqw9" (OuterVolumeSpecName: "kube-api-access-lpqw9") pod "598a570b-1120-474e-a6a1-e46a82ff8272" (UID: "598a570b-1120-474e-a6a1-e46a82ff8272"). InnerVolumeSpecName "kube-api-access-lpqw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.193423 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "598a570b-1120-474e-a6a1-e46a82ff8272" (UID: "598a570b-1120-474e-a6a1-e46a82ff8272"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.194010 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-inventory" (OuterVolumeSpecName: "inventory") pod "598a570b-1120-474e-a6a1-e46a82ff8272" (UID: "598a570b-1120-474e-a6a1-e46a82ff8272"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.263054 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.263089 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.263101 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/598a570b-1120-474e-a6a1-e46a82ff8272-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.263113 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpqw9\" (UniqueName: \"kubernetes.io/projected/598a570b-1120-474e-a6a1-e46a82ff8272-kube-api-access-lpqw9\") on node \"crc\" DevicePath \"\"" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.542054 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-wxt5h" event={"ID":"598a570b-1120-474e-a6a1-e46a82ff8272","Type":"ContainerDied","Data":"ccd72cd490d989980bf58aed81f7a6eb41070b647cd1cf3a5f1db76e11787ff8"} Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.542103 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccd72cd490d989980bf58aed81f7a6eb41070b647cd1cf3a5f1db76e11787ff8" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.542112 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-wxt5h" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.623741 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-4zfvg"] Oct 02 13:02:56 crc kubenswrapper[4766]: E1002 13:02:56.624550 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="598a570b-1120-474e-a6a1-e46a82ff8272" containerName="run-os-openstack-openstack-cell1" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.624579 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="598a570b-1120-474e-a6a1-e46a82ff8272" containerName="run-os-openstack-openstack-cell1" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.624937 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="598a570b-1120-474e-a6a1-e46a82ff8272" containerName="run-os-openstack-openstack-cell1" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.626327 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.629851 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.630033 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.630238 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.630496 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.653288 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-4zfvg"] Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.674966 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-ceph\") pod \"reboot-os-openstack-openstack-cell1-4zfvg\" (UID: \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\") " pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.675026 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-inventory\") pod \"reboot-os-openstack-openstack-cell1-4zfvg\" (UID: \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\") " pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.675179 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6279\" (UniqueName: \"kubernetes.io/projected/96d36e75-8d95-4fb5-9601-bbc75eb150d4-kube-api-access-h6279\") pod \"reboot-os-openstack-openstack-cell1-4zfvg\" (UID: \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\") " pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.675528 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-4zfvg\" (UID: \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\") " pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.778331 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-4zfvg\" (UID: \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\") " pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.778479 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-ceph\") pod \"reboot-os-openstack-openstack-cell1-4zfvg\" (UID: \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\") " pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.778525 4766 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-inventory\") pod \"reboot-os-openstack-openstack-cell1-4zfvg\" (UID: \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\") " pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.778635 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6279\" (UniqueName: \"kubernetes.io/projected/96d36e75-8d95-4fb5-9601-bbc75eb150d4-kube-api-access-h6279\") pod \"reboot-os-openstack-openstack-cell1-4zfvg\" (UID: \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\") " pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.783722 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-4zfvg\" (UID: \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\") " pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.785045 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-inventory\") pod \"reboot-os-openstack-openstack-cell1-4zfvg\" (UID: \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\") " pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.786141 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-ceph\") pod \"reboot-os-openstack-openstack-cell1-4zfvg\" (UID: \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\") " pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.814970 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6279\" (UniqueName: \"kubernetes.io/projected/96d36e75-8d95-4fb5-9601-bbc75eb150d4-kube-api-access-h6279\") pod \"reboot-os-openstack-openstack-cell1-4zfvg\" (UID: \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\") " pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" Oct 02 13:02:56 crc kubenswrapper[4766]: I1002 13:02:56.952107 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" Oct 02 13:02:57 crc kubenswrapper[4766]: I1002 13:02:57.587478 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-4zfvg"] Oct 02 13:02:58 crc kubenswrapper[4766]: I1002 13:02:58.584875 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" event={"ID":"96d36e75-8d95-4fb5-9601-bbc75eb150d4","Type":"ContainerStarted","Data":"6828ae605e9eaa77744089a9281ce11f888339ae1755fa1b4ff9911779afdd7f"} Oct 02 13:02:58 crc kubenswrapper[4766]: I1002 13:02:58.585317 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" event={"ID":"96d36e75-8d95-4fb5-9601-bbc75eb150d4","Type":"ContainerStarted","Data":"e3a553b3ad44851e24d480f583c87f469ac7cfa31e8959e75a2b92a069fc6c5f"} Oct 02 13:02:58 crc kubenswrapper[4766]: I1002 13:02:58.607465 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" podStartSLOduration=2.407382745 podStartE2EDuration="2.607444634s" podCreationTimestamp="2025-10-02 13:02:56 +0000 UTC" firstStartedPulling="2025-10-02 13:02:57.595809199 +0000 UTC m=+7892.538680143" lastFinishedPulling="2025-10-02 13:02:57.795871078 +0000 UTC m=+7892.738742032" observedRunningTime="2025-10-02 13:02:58.606542925 +0000 UTC m=+7893.549413899" watchObservedRunningTime="2025-10-02 13:02:58.607444634 +0000 UTC m=+7893.550315578" Oct 02 13:03:00 crc kubenswrapper[4766]: I1002 13:03:00.881498 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:03:00 crc kubenswrapper[4766]: E1002 13:03:00.882600 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:03:11 crc kubenswrapper[4766]: I1002 13:03:11.882137 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:03:11 crc kubenswrapper[4766]: E1002 13:03:11.883058 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:03:13 crc kubenswrapper[4766]: I1002 13:03:13.773896 4766 generic.go:334] "Generic (PLEG): container finished" podID="96d36e75-8d95-4fb5-9601-bbc75eb150d4" containerID="6828ae605e9eaa77744089a9281ce11f888339ae1755fa1b4ff9911779afdd7f" exitCode=0 Oct 02 13:03:13 crc kubenswrapper[4766]: I1002 13:03:13.774439 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" event={"ID":"96d36e75-8d95-4fb5-9601-bbc75eb150d4","Type":"ContainerDied","Data":"6828ae605e9eaa77744089a9281ce11f888339ae1755fa1b4ff9911779afdd7f"} Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.348163 4766 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.456017 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6279\" (UniqueName: \"kubernetes.io/projected/96d36e75-8d95-4fb5-9601-bbc75eb150d4-kube-api-access-h6279\") pod \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\" (UID: \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\") " Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.456527 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-inventory\") pod \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\" (UID: \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\") " Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.456778 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-ssh-key\") pod \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\" (UID: \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\") " Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.456905 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-ceph\") pod \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\" (UID: \"96d36e75-8d95-4fb5-9601-bbc75eb150d4\") " Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.462334 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-ceph" (OuterVolumeSpecName: "ceph") pod "96d36e75-8d95-4fb5-9601-bbc75eb150d4" (UID: "96d36e75-8d95-4fb5-9601-bbc75eb150d4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.462532 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96d36e75-8d95-4fb5-9601-bbc75eb150d4-kube-api-access-h6279" (OuterVolumeSpecName: "kube-api-access-h6279") pod "96d36e75-8d95-4fb5-9601-bbc75eb150d4" (UID: "96d36e75-8d95-4fb5-9601-bbc75eb150d4"). InnerVolumeSpecName "kube-api-access-h6279". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.486895 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-inventory" (OuterVolumeSpecName: "inventory") pod "96d36e75-8d95-4fb5-9601-bbc75eb150d4" (UID: "96d36e75-8d95-4fb5-9601-bbc75eb150d4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.487956 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "96d36e75-8d95-4fb5-9601-bbc75eb150d4" (UID: "96d36e75-8d95-4fb5-9601-bbc75eb150d4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.559763 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6279\" (UniqueName: \"kubernetes.io/projected/96d36e75-8d95-4fb5-9601-bbc75eb150d4-kube-api-access-h6279\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.559809 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.559822 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.559834 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/96d36e75-8d95-4fb5-9601-bbc75eb150d4-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.797861 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" event={"ID":"96d36e75-8d95-4fb5-9601-bbc75eb150d4","Type":"ContainerDied","Data":"e3a553b3ad44851e24d480f583c87f469ac7cfa31e8959e75a2b92a069fc6c5f"} Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.797913 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3a553b3ad44851e24d480f583c87f469ac7cfa31e8959e75a2b92a069fc6c5f" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.797936 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-4zfvg" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.904865 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-cq4mv"] Oct 02 13:03:15 crc kubenswrapper[4766]: E1002 13:03:15.905548 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96d36e75-8d95-4fb5-9601-bbc75eb150d4" containerName="reboot-os-openstack-openstack-cell1" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.905578 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="96d36e75-8d95-4fb5-9601-bbc75eb150d4" containerName="reboot-os-openstack-openstack-cell1" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.905978 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="96d36e75-8d95-4fb5-9601-bbc75eb150d4" containerName="reboot-os-openstack-openstack-cell1" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.907246 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.909861 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.910264 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.910741 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.912265 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:03:15 crc kubenswrapper[4766]: I1002 13:03:15.916645 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-cq4mv"] Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.070982 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.071038 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.071171 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.071280 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ceph\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.071383 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.071449 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.071818 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.071949 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.072118 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkhvc\" (UniqueName: \"kubernetes.io/projected/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-kube-api-access-tkhvc\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.072272 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ssh-key\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.072608 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.072740 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-inventory\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.175107 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.175187 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-inventory\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.175260 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.175303 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.175329 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.175361 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ceph\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.175395 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.175430 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.175544 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.175597 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.175632 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkhvc\" (UniqueName: \"kubernetes.io/projected/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-kube-api-access-tkhvc\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.175676 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ssh-key\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.180231 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-inventory\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.180898 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.180967 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ceph\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.181752 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.181766 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.182081 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-libvirt-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.182992 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.183126 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.183828 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.183911 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ssh-key\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.185745 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.197979 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkhvc\" (UniqueName: \"kubernetes.io/projected/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-kube-api-access-tkhvc\") pod \"install-certs-openstack-openstack-cell1-cq4mv\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.235938 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:16 crc kubenswrapper[4766]: I1002 13:03:16.829777 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-cq4mv"] Oct 02 13:03:17 crc kubenswrapper[4766]: I1002 13:03:17.826526 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" event={"ID":"c4528d6f-0f1c-4624-abbd-03e6ee59ebba","Type":"ContainerStarted","Data":"cfae4a59084640ee30a6912d3723a66ed384b5b7d18b543dfa6f05bfc463b552"} Oct 02 13:03:17 crc kubenswrapper[4766]: I1002 13:03:17.826998 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" event={"ID":"c4528d6f-0f1c-4624-abbd-03e6ee59ebba","Type":"ContainerStarted","Data":"2ca43b3a81a703d5b6f7196fb57fd6f7d29e9a55f401b88c17781a3f5ca33b2c"} Oct 02 13:03:17 crc kubenswrapper[4766]: I1002 13:03:17.868817 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" podStartSLOduration=2.710951747 podStartE2EDuration="2.868796363s" podCreationTimestamp="2025-10-02 13:03:15 +0000 UTC" firstStartedPulling="2025-10-02 13:03:16.832898091 +0000 UTC m=+7911.775769055" lastFinishedPulling="2025-10-02 13:03:16.990742727 +0000 UTC m=+7911.933613671" observedRunningTime="2025-10-02 13:03:17.85528456 +0000 UTC m=+7912.798155514" watchObservedRunningTime="2025-10-02 13:03:17.868796363 +0000 UTC m=+7912.811667297" Oct 02 13:03:23 crc kubenswrapper[4766]: I1002 13:03:23.882315 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:03:23 crc kubenswrapper[4766]: E1002 13:03:23.883210 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:03:36 crc kubenswrapper[4766]: I1002 13:03:36.011196 4766 generic.go:334] "Generic (PLEG): container finished" podID="c4528d6f-0f1c-4624-abbd-03e6ee59ebba" containerID="cfae4a59084640ee30a6912d3723a66ed384b5b7d18b543dfa6f05bfc463b552" exitCode=0 Oct 02 13:03:36 crc kubenswrapper[4766]: I1002 13:03:36.011281 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" event={"ID":"c4528d6f-0f1c-4624-abbd-03e6ee59ebba","Type":"ContainerDied","Data":"cfae4a59084640ee30a6912d3723a66ed384b5b7d18b543dfa6f05bfc463b552"} Oct 02 13:03:36 crc kubenswrapper[4766]: I1002 13:03:36.883369 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:03:36 crc kubenswrapper[4766]: E1002 13:03:36.883913 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 
13:03:37.560425 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.614165 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-metadata-combined-ca-bundle\") pod \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.614236 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-bootstrap-combined-ca-bundle\") pod \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.614275 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ovn-combined-ca-bundle\") pod \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.615288 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkhvc\" (UniqueName: \"kubernetes.io/projected/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-kube-api-access-tkhvc\") pod \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.615656 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-dhcp-combined-ca-bundle\") pod \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.615732 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-sriov-combined-ca-bundle\") pod \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.615805 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-nova-combined-ca-bundle\") pod \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.615847 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-telemetry-combined-ca-bundle\") pod \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.615870 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ssh-key\") pod \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " Oct 02 13:03:37 crc 
kubenswrapper[4766]: I1002 13:03:37.615900 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-inventory\") pod \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.615926 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-libvirt-combined-ca-bundle\") pod \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.615962 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ceph\") pod \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\" (UID: \"c4528d6f-0f1c-4624-abbd-03e6ee59ebba\") " Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.620639 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-kube-api-access-tkhvc" (OuterVolumeSpecName: "kube-api-access-tkhvc") pod "c4528d6f-0f1c-4624-abbd-03e6ee59ebba" (UID: "c4528d6f-0f1c-4624-abbd-03e6ee59ebba"). InnerVolumeSpecName "kube-api-access-tkhvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.622271 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c4528d6f-0f1c-4624-abbd-03e6ee59ebba" (UID: "c4528d6f-0f1c-4624-abbd-03e6ee59ebba"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.622455 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c4528d6f-0f1c-4624-abbd-03e6ee59ebba" (UID: "c4528d6f-0f1c-4624-abbd-03e6ee59ebba"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.622677 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "c4528d6f-0f1c-4624-abbd-03e6ee59ebba" (UID: "c4528d6f-0f1c-4624-abbd-03e6ee59ebba"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.623456 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "c4528d6f-0f1c-4624-abbd-03e6ee59ebba" (UID: "c4528d6f-0f1c-4624-abbd-03e6ee59ebba"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.624935 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "c4528d6f-0f1c-4624-abbd-03e6ee59ebba" (UID: "c4528d6f-0f1c-4624-abbd-03e6ee59ebba"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.625306 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c4528d6f-0f1c-4624-abbd-03e6ee59ebba" (UID: "c4528d6f-0f1c-4624-abbd-03e6ee59ebba"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.625560 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ceph" (OuterVolumeSpecName: "ceph") pod "c4528d6f-0f1c-4624-abbd-03e6ee59ebba" (UID: "c4528d6f-0f1c-4624-abbd-03e6ee59ebba"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.625599 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "c4528d6f-0f1c-4624-abbd-03e6ee59ebba" (UID: "c4528d6f-0f1c-4624-abbd-03e6ee59ebba"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.640775 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "c4528d6f-0f1c-4624-abbd-03e6ee59ebba" (UID: "c4528d6f-0f1c-4624-abbd-03e6ee59ebba"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.649324 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-inventory" (OuterVolumeSpecName: "inventory") pod "c4528d6f-0f1c-4624-abbd-03e6ee59ebba" (UID: "c4528d6f-0f1c-4624-abbd-03e6ee59ebba"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.651938 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c4528d6f-0f1c-4624-abbd-03e6ee59ebba" (UID: "c4528d6f-0f1c-4624-abbd-03e6ee59ebba"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.718945 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkhvc\" (UniqueName: \"kubernetes.io/projected/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-kube-api-access-tkhvc\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.718994 4766 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.719028 4766 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.719042 4766 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.719080 4766 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.719093 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.719105 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.719117 4766 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.719126 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.719136 4766 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.719146 4766 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:37 crc kubenswrapper[4766]: I1002 13:03:37.719156 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c4528d6f-0f1c-4624-abbd-03e6ee59ebba-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.036873 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" event={"ID":"c4528d6f-0f1c-4624-abbd-03e6ee59ebba","Type":"ContainerDied","Data":"2ca43b3a81a703d5b6f7196fb57fd6f7d29e9a55f401b88c17781a3f5ca33b2c"} Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.037126 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ca43b3a81a703d5b6f7196fb57fd6f7d29e9a55f401b88c17781a3f5ca33b2c" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.036975 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-cq4mv" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.132688 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-d6hht"] Oct 02 13:03:38 crc kubenswrapper[4766]: E1002 13:03:38.133231 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4528d6f-0f1c-4624-abbd-03e6ee59ebba" containerName="install-certs-openstack-openstack-cell1" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.133250 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4528d6f-0f1c-4624-abbd-03e6ee59ebba" containerName="install-certs-openstack-openstack-cell1" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.133517 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4528d6f-0f1c-4624-abbd-03e6ee59ebba" containerName="install-certs-openstack-openstack-cell1" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.134330 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.138074 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.144598 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.144600 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.145234 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.149945 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-d6hht"] Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.235659 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pkcq\" (UniqueName: \"kubernetes.io/projected/79198143-fe64-4121-bb10-e86f9425fc1d-kube-api-access-9pkcq\") pod \"ceph-client-openstack-openstack-cell1-d6hht\" (UID: \"79198143-fe64-4121-bb10-e86f9425fc1d\") " pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.235963 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-d6hht\" (UID: \"79198143-fe64-4121-bb10-e86f9425fc1d\") " pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.236050 4766 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-inventory\") pod \"ceph-client-openstack-openstack-cell1-d6hht\" (UID: \"79198143-fe64-4121-bb10-e86f9425fc1d\") " pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.236311 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-ceph\") pod \"ceph-client-openstack-openstack-cell1-d6hht\" (UID: \"79198143-fe64-4121-bb10-e86f9425fc1d\") " pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.338759 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-ceph\") pod \"ceph-client-openstack-openstack-cell1-d6hht\" (UID: \"79198143-fe64-4121-bb10-e86f9425fc1d\") " pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.338892 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pkcq\" (UniqueName: \"kubernetes.io/projected/79198143-fe64-4121-bb10-e86f9425fc1d-kube-api-access-9pkcq\") pod \"ceph-client-openstack-openstack-cell1-d6hht\" (UID: \"79198143-fe64-4121-bb10-e86f9425fc1d\") " pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.339373 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-d6hht\" (UID: \"79198143-fe64-4121-bb10-e86f9425fc1d\") " pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.339781 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-inventory\") pod \"ceph-client-openstack-openstack-cell1-d6hht\" (UID: \"79198143-fe64-4121-bb10-e86f9425fc1d\") " pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.350893 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-ceph\") pod \"ceph-client-openstack-openstack-cell1-d6hht\" (UID: \"79198143-fe64-4121-bb10-e86f9425fc1d\") " pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.350984 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-d6hht\" (UID: \"79198143-fe64-4121-bb10-e86f9425fc1d\") " pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.351065 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-inventory\") pod \"ceph-client-openstack-openstack-cell1-d6hht\" (UID: \"79198143-fe64-4121-bb10-e86f9425fc1d\") " 
pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.407458 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pkcq\" (UniqueName: \"kubernetes.io/projected/79198143-fe64-4121-bb10-e86f9425fc1d-kube-api-access-9pkcq\") pod \"ceph-client-openstack-openstack-cell1-d6hht\" (UID: \"79198143-fe64-4121-bb10-e86f9425fc1d\") " pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" Oct 02 13:03:38 crc kubenswrapper[4766]: I1002 13:03:38.475398 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" Oct 02 13:03:39 crc kubenswrapper[4766]: I1002 13:03:39.005980 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-d6hht"] Oct 02 13:03:39 crc kubenswrapper[4766]: W1002 13:03:39.014655 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79198143_fe64_4121_bb10_e86f9425fc1d.slice/crio-b3465ba272c22b9e9f9cab01d3c9c0ae6f838a3fff7678453c7be6cdb5b68c38 WatchSource:0}: Error finding container b3465ba272c22b9e9f9cab01d3c9c0ae6f838a3fff7678453c7be6cdb5b68c38: Status 404 returned error can't find the container with id b3465ba272c22b9e9f9cab01d3c9c0ae6f838a3fff7678453c7be6cdb5b68c38 Oct 02 13:03:39 crc kubenswrapper[4766]: I1002 13:03:39.048090 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" event={"ID":"79198143-fe64-4121-bb10-e86f9425fc1d","Type":"ContainerStarted","Data":"b3465ba272c22b9e9f9cab01d3c9c0ae6f838a3fff7678453c7be6cdb5b68c38"} Oct 02 13:03:40 crc kubenswrapper[4766]: I1002 13:03:40.073990 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" event={"ID":"79198143-fe64-4121-bb10-e86f9425fc1d","Type":"ContainerStarted","Data":"f78d4e813a59357ccb4675b51b759457c6cac521c3a8ab7e7c51b75a58aedbbe"} Oct 02 13:03:41 crc kubenswrapper[4766]: I1002 13:03:41.133853 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" podStartSLOduration=2.409536784 podStartE2EDuration="3.133831365s" podCreationTimestamp="2025-10-02 13:03:38 +0000 UTC" firstStartedPulling="2025-10-02 13:03:39.020056916 +0000 UTC m=+7933.962927860" lastFinishedPulling="2025-10-02 13:03:39.744351497 +0000 UTC m=+7934.687222441" observedRunningTime="2025-10-02 13:03:41.131869771 +0000 UTC m=+7936.074740725" watchObservedRunningTime="2025-10-02 13:03:41.133831365 +0000 UTC m=+7936.076702299" Oct 02 13:03:46 crc kubenswrapper[4766]: I1002 13:03:46.142557 4766 generic.go:334] "Generic (PLEG): container finished" podID="79198143-fe64-4121-bb10-e86f9425fc1d" containerID="f78d4e813a59357ccb4675b51b759457c6cac521c3a8ab7e7c51b75a58aedbbe" exitCode=0 Oct 02 13:03:46 crc kubenswrapper[4766]: I1002 13:03:46.142728 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" event={"ID":"79198143-fe64-4121-bb10-e86f9425fc1d","Type":"ContainerDied","Data":"f78d4e813a59357ccb4675b51b759457c6cac521c3a8ab7e7c51b75a58aedbbe"} Oct 02 13:03:47 crc kubenswrapper[4766]: I1002 13:03:47.597640 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" Oct 02 13:03:47 crc kubenswrapper[4766]: I1002 13:03:47.659099 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-ceph\") pod \"79198143-fe64-4121-bb10-e86f9425fc1d\" (UID: \"79198143-fe64-4121-bb10-e86f9425fc1d\") " Oct 02 13:03:47 crc kubenswrapper[4766]: I1002 13:03:47.659165 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pkcq\" (UniqueName: \"kubernetes.io/projected/79198143-fe64-4121-bb10-e86f9425fc1d-kube-api-access-9pkcq\") pod \"79198143-fe64-4121-bb10-e86f9425fc1d\" (UID: \"79198143-fe64-4121-bb10-e86f9425fc1d\") " Oct 02 13:03:47 crc kubenswrapper[4766]: I1002 13:03:47.659469 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-ssh-key\") pod \"79198143-fe64-4121-bb10-e86f9425fc1d\" (UID: \"79198143-fe64-4121-bb10-e86f9425fc1d\") " Oct 02 13:03:47 crc kubenswrapper[4766]: I1002 13:03:47.659580 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-inventory\") pod \"79198143-fe64-4121-bb10-e86f9425fc1d\" (UID: \"79198143-fe64-4121-bb10-e86f9425fc1d\") " Oct 02 13:03:47 crc kubenswrapper[4766]: I1002 13:03:47.665393 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-ceph" (OuterVolumeSpecName: "ceph") pod "79198143-fe64-4121-bb10-e86f9425fc1d" (UID: "79198143-fe64-4121-bb10-e86f9425fc1d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:47 crc kubenswrapper[4766]: I1002 13:03:47.666023 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79198143-fe64-4121-bb10-e86f9425fc1d-kube-api-access-9pkcq" (OuterVolumeSpecName: "kube-api-access-9pkcq") pod "79198143-fe64-4121-bb10-e86f9425fc1d" (UID: "79198143-fe64-4121-bb10-e86f9425fc1d"). InnerVolumeSpecName "kube-api-access-9pkcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:03:47 crc kubenswrapper[4766]: I1002 13:03:47.694027 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "79198143-fe64-4121-bb10-e86f9425fc1d" (UID: "79198143-fe64-4121-bb10-e86f9425fc1d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:47 crc kubenswrapper[4766]: I1002 13:03:47.702262 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-inventory" (OuterVolumeSpecName: "inventory") pod "79198143-fe64-4121-bb10-e86f9425fc1d" (UID: "79198143-fe64-4121-bb10-e86f9425fc1d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:03:47 crc kubenswrapper[4766]: I1002 13:03:47.763003 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:47 crc kubenswrapper[4766]: I1002 13:03:47.763109 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:47 crc kubenswrapper[4766]: I1002 13:03:47.763118 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/79198143-fe64-4121-bb10-e86f9425fc1d-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:47 crc kubenswrapper[4766]: I1002 13:03:47.763128 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pkcq\" (UniqueName: \"kubernetes.io/projected/79198143-fe64-4121-bb10-e86f9425fc1d-kube-api-access-9pkcq\") on node \"crc\" DevicePath \"\"" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.171371 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" event={"ID":"79198143-fe64-4121-bb10-e86f9425fc1d","Type":"ContainerDied","Data":"b3465ba272c22b9e9f9cab01d3c9c0ae6f838a3fff7678453c7be6cdb5b68c38"} Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.171411 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-d6hht" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.171420 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3465ba272c22b9e9f9cab01d3c9c0ae6f838a3fff7678453c7be6cdb5b68c38" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.254314 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-2wfs4"] Oct 02 13:03:48 crc kubenswrapper[4766]: E1002 13:03:48.254772 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79198143-fe64-4121-bb10-e86f9425fc1d" containerName="ceph-client-openstack-openstack-cell1" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.254790 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="79198143-fe64-4121-bb10-e86f9425fc1d" containerName="ceph-client-openstack-openstack-cell1" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.255027 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="79198143-fe64-4121-bb10-e86f9425fc1d" containerName="ceph-client-openstack-openstack-cell1" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.255785 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.260297 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.260574 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.260714 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.260834 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.260943 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.268167 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-2wfs4"] Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.373453 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn9pc\" (UniqueName: \"kubernetes.io/projected/a5b18380-ebf6-4969-b53a-463b4734baa9-kube-api-access-kn9pc\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.373605 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ceph\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.373667 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ssh-key\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.373710 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.373735 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a5b18380-ebf6-4969-b53a-463b4734baa9-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.373876 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-inventory\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: 
\"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.476137 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-inventory\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.476270 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn9pc\" (UniqueName: \"kubernetes.io/projected/a5b18380-ebf6-4969-b53a-463b4734baa9-kube-api-access-kn9pc\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.476470 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ceph\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.476592 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ssh-key\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.476661 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.476702 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a5b18380-ebf6-4969-b53a-463b4734baa9-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.478367 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a5b18380-ebf6-4969-b53a-463b4734baa9-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.482740 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ssh-key\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.482917 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ceph\") pod 
\"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.483227 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.483707 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-inventory\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.500420 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn9pc\" (UniqueName: \"kubernetes.io/projected/a5b18380-ebf6-4969-b53a-463b4734baa9-kube-api-access-kn9pc\") pod \"ovn-openstack-openstack-cell1-2wfs4\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:48 crc kubenswrapper[4766]: I1002 13:03:48.583978 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:03:49 crc kubenswrapper[4766]: W1002 13:03:49.156784 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5b18380_ebf6_4969_b53a_463b4734baa9.slice/crio-d84ab1ae267f12f217fc61d7529be2eaa7a876c5d0c5b34a5c06e5c50eb15cb1 WatchSource:0}: Error finding container d84ab1ae267f12f217fc61d7529be2eaa7a876c5d0c5b34a5c06e5c50eb15cb1: Status 404 returned error can't find the container with id d84ab1ae267f12f217fc61d7529be2eaa7a876c5d0c5b34a5c06e5c50eb15cb1 Oct 02 13:03:49 crc kubenswrapper[4766]: I1002 13:03:49.160312 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-2wfs4"] Oct 02 13:03:49 crc kubenswrapper[4766]: I1002 13:03:49.187959 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-2wfs4" event={"ID":"a5b18380-ebf6-4969-b53a-463b4734baa9","Type":"ContainerStarted","Data":"d84ab1ae267f12f217fc61d7529be2eaa7a876c5d0c5b34a5c06e5c50eb15cb1"} Oct 02 13:03:50 crc kubenswrapper[4766]: I1002 13:03:50.198543 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-2wfs4" event={"ID":"a5b18380-ebf6-4969-b53a-463b4734baa9","Type":"ContainerStarted","Data":"64d32aa9935483df2fd23af5518cf149ac4e4fac20351b4769980551b0988a26"} Oct 02 13:03:50 crc kubenswrapper[4766]: I1002 13:03:50.250880 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-2wfs4" podStartSLOduration=2.101608049 podStartE2EDuration="2.25083734s" podCreationTimestamp="2025-10-02 13:03:48 +0000 UTC" firstStartedPulling="2025-10-02 13:03:49.159559664 +0000 UTC m=+7944.102430608" lastFinishedPulling="2025-10-02 13:03:49.308788935 +0000 UTC m=+7944.251659899" observedRunningTime="2025-10-02 13:03:50.233155483 +0000 UTC m=+7945.176026467" watchObservedRunningTime="2025-10-02 13:03:50.25083734 +0000 UTC m=+7945.193708304" Oct 02 13:03:51 crc 
kubenswrapper[4766]: I1002 13:03:51.881399 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:03:51 crc kubenswrapper[4766]: E1002 13:03:51.881946 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:04:05 crc kubenswrapper[4766]: I1002 13:04:05.891074 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:04:05 crc kubenswrapper[4766]: E1002 13:04:05.892273 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:04:17 crc kubenswrapper[4766]: I1002 13:04:17.882080 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:04:17 crc kubenswrapper[4766]: E1002 13:04:17.882844 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:04:31 crc kubenswrapper[4766]: I1002 13:04:31.882529 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:04:31 crc kubenswrapper[4766]: E1002 13:04:31.883531 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:04:42 crc kubenswrapper[4766]: I1002 13:04:42.881423 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:04:42 crc kubenswrapper[4766]: E1002 13:04:42.882212 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:04:43 crc kubenswrapper[4766]: I1002 13:04:43.767311 4766 generic.go:334] "Generic (PLEG): container finished" podID="a5b18380-ebf6-4969-b53a-463b4734baa9" 
containerID="64d32aa9935483df2fd23af5518cf149ac4e4fac20351b4769980551b0988a26" exitCode=0 Oct 02 13:04:43 crc kubenswrapper[4766]: I1002 13:04:43.767381 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-2wfs4" event={"ID":"a5b18380-ebf6-4969-b53a-463b4734baa9","Type":"ContainerDied","Data":"64d32aa9935483df2fd23af5518cf149ac4e4fac20351b4769980551b0988a26"} Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.310106 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.478437 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a5b18380-ebf6-4969-b53a-463b4734baa9-ovncontroller-config-0\") pod \"a5b18380-ebf6-4969-b53a-463b4734baa9\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.478616 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ovn-combined-ca-bundle\") pod \"a5b18380-ebf6-4969-b53a-463b4734baa9\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.478722 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-inventory\") pod \"a5b18380-ebf6-4969-b53a-463b4734baa9\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.478875 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ceph\") pod \"a5b18380-ebf6-4969-b53a-463b4734baa9\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.479023 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn9pc\" (UniqueName: \"kubernetes.io/projected/a5b18380-ebf6-4969-b53a-463b4734baa9-kube-api-access-kn9pc\") pod \"a5b18380-ebf6-4969-b53a-463b4734baa9\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.479055 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ssh-key\") pod \"a5b18380-ebf6-4969-b53a-463b4734baa9\" (UID: \"a5b18380-ebf6-4969-b53a-463b4734baa9\") " Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.484261 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5b18380-ebf6-4969-b53a-463b4734baa9-kube-api-access-kn9pc" (OuterVolumeSpecName: "kube-api-access-kn9pc") pod "a5b18380-ebf6-4969-b53a-463b4734baa9" (UID: "a5b18380-ebf6-4969-b53a-463b4734baa9"). InnerVolumeSpecName "kube-api-access-kn9pc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.484675 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ceph" (OuterVolumeSpecName: "ceph") pod "a5b18380-ebf6-4969-b53a-463b4734baa9" (UID: "a5b18380-ebf6-4969-b53a-463b4734baa9"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.486858 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a5b18380-ebf6-4969-b53a-463b4734baa9" (UID: "a5b18380-ebf6-4969-b53a-463b4734baa9"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.507470 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5b18380-ebf6-4969-b53a-463b4734baa9-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a5b18380-ebf6-4969-b53a-463b4734baa9" (UID: "a5b18380-ebf6-4969-b53a-463b4734baa9"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.511807 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-inventory" (OuterVolumeSpecName: "inventory") pod "a5b18380-ebf6-4969-b53a-463b4734baa9" (UID: "a5b18380-ebf6-4969-b53a-463b4734baa9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.514173 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a5b18380-ebf6-4969-b53a-463b4734baa9" (UID: "a5b18380-ebf6-4969-b53a-463b4734baa9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.580658 4766 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a5b18380-ebf6-4969-b53a-463b4734baa9-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.580867 4766 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.580955 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.581009 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.581060 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn9pc\" (UniqueName: \"kubernetes.io/projected/a5b18380-ebf6-4969-b53a-463b4734baa9-kube-api-access-kn9pc\") on node \"crc\" DevicePath \"\"" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.581117 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5b18380-ebf6-4969-b53a-463b4734baa9-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.791476 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-openstack-openstack-cell1-2wfs4" event={"ID":"a5b18380-ebf6-4969-b53a-463b4734baa9","Type":"ContainerDied","Data":"d84ab1ae267f12f217fc61d7529be2eaa7a876c5d0c5b34a5c06e5c50eb15cb1"} Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.791553 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d84ab1ae267f12f217fc61d7529be2eaa7a876c5d0c5b34a5c06e5c50eb15cb1" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.791644 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-2wfs4" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.992876 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-xvfww"] Oct 02 13:04:45 crc kubenswrapper[4766]: E1002 13:04:45.997363 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b18380-ebf6-4969-b53a-463b4734baa9" containerName="ovn-openstack-openstack-cell1" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.997465 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b18380-ebf6-4969-b53a-463b4734baa9" containerName="ovn-openstack-openstack-cell1" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.997834 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5b18380-ebf6-4969-b53a-463b4734baa9" containerName="ovn-openstack-openstack-cell1" Oct 02 13:04:45 crc kubenswrapper[4766]: I1002 13:04:45.998821 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.007287 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.007605 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.007769 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.007892 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.015973 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-xvfww"] Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.016146 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.016148 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.193959 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.194676 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.194726 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.194756 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.194873 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.194942 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcw9j\" (UniqueName: \"kubernetes.io/projected/4e487036-bfdb-4c21-9e4a-7abec8f180d7-kube-api-access-lcw9j\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.195023 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.296981 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcw9j\" (UniqueName: \"kubernetes.io/projected/4e487036-bfdb-4c21-9e4a-7abec8f180d7-kube-api-access-lcw9j\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.297100 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.297291 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.297547 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.297662 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.297728 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.297831 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.301072 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.301574 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.302098 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.302116 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.303286 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.303806 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.319277 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcw9j\" (UniqueName: \"kubernetes.io/projected/4e487036-bfdb-4c21-9e4a-7abec8f180d7-kube-api-access-lcw9j\") pod \"neutron-metadata-openstack-openstack-cell1-xvfww\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.345933 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.904294 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-xvfww"] Oct 02 13:04:46 crc kubenswrapper[4766]: I1002 13:04:46.904413 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 13:04:47 crc kubenswrapper[4766]: I1002 13:04:47.811055 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" event={"ID":"4e487036-bfdb-4c21-9e4a-7abec8f180d7","Type":"ContainerStarted","Data":"70dc1a75bfd08e3e2a8f0845eb6bc7cc1b5d18293ba43d254c45289f25a2413b"} Oct 02 13:04:47 crc kubenswrapper[4766]: I1002 13:04:47.811438 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" event={"ID":"4e487036-bfdb-4c21-9e4a-7abec8f180d7","Type":"ContainerStarted","Data":"9d424d198e40ed492d57d4d9ed77b793337e0fb7a1813e68b3da79febd3daa05"} Oct 02 13:04:47 crc kubenswrapper[4766]: I1002 13:04:47.846395 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" podStartSLOduration=2.661623189 podStartE2EDuration="2.846366637s" podCreationTimestamp="2025-10-02 13:04:45 +0000 UTC" firstStartedPulling="2025-10-02 13:04:46.904069984 +0000 UTC m=+8001.846940938" lastFinishedPulling="2025-10-02 13:04:47.088813432 +0000 UTC m=+8002.031684386" observedRunningTime="2025-10-02 13:04:47.838275638 +0000 UTC m=+8002.781146582" watchObservedRunningTime="2025-10-02 13:04:47.846366637 +0000 UTC m=+8002.789237591" Oct 02 13:04:55 crc kubenswrapper[4766]: I1002 13:04:55.894175 4766 scope.go:117] "RemoveContainer" 
containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:04:55 crc kubenswrapper[4766]: E1002 13:04:55.895127 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:05:07 crc kubenswrapper[4766]: I1002 13:05:07.883366 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:05:07 crc kubenswrapper[4766]: E1002 13:05:07.884371 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:05:21 crc kubenswrapper[4766]: I1002 13:05:21.881752 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:05:21 crc kubenswrapper[4766]: E1002 13:05:21.882729 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:05:27 crc kubenswrapper[4766]: I1002 13:05:27.297830 4766 generic.go:334] "Generic (PLEG): container finished" podID="4e487036-bfdb-4c21-9e4a-7abec8f180d7" containerID="70dc1a75bfd08e3e2a8f0845eb6bc7cc1b5d18293ba43d254c45289f25a2413b" exitCode=0 Oct 02 13:05:27 crc kubenswrapper[4766]: I1002 13:05:27.299028 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" event={"ID":"4e487036-bfdb-4c21-9e4a-7abec8f180d7","Type":"ContainerDied","Data":"70dc1a75bfd08e3e2a8f0845eb6bc7cc1b5d18293ba43d254c45289f25a2413b"} Oct 02 13:05:28 crc kubenswrapper[4766]: I1002 13:05:28.819196 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:05:28 crc kubenswrapper[4766]: I1002 13:05:28.920023 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-inventory\") pod \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " Oct 02 13:05:28 crc kubenswrapper[4766]: I1002 13:05:28.920080 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-ssh-key\") pod \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " Oct 02 13:05:28 crc kubenswrapper[4766]: I1002 13:05:28.920162 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-neutron-ovn-metadata-agent-neutron-config-0\") pod \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " Oct 02 13:05:28 crc kubenswrapper[4766]: I1002 13:05:28.920218 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-ceph\") pod \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " Oct 02 13:05:28 crc kubenswrapper[4766]: I1002 13:05:28.920285 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcw9j\" (UniqueName: \"kubernetes.io/projected/4e487036-bfdb-4c21-9e4a-7abec8f180d7-kube-api-access-lcw9j\") pod \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " Oct 02 13:05:28 crc kubenswrapper[4766]: I1002 13:05:28.920318 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-nova-metadata-neutron-config-0\") pod \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " Oct 02 13:05:28 crc kubenswrapper[4766]: I1002 13:05:28.920334 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-neutron-metadata-combined-ca-bundle\") pod \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\" (UID: \"4e487036-bfdb-4c21-9e4a-7abec8f180d7\") " Oct 02 13:05:28 crc kubenswrapper[4766]: I1002 13:05:28.926703 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4e487036-bfdb-4c21-9e4a-7abec8f180d7" (UID: "4e487036-bfdb-4c21-9e4a-7abec8f180d7"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:05:28 crc kubenswrapper[4766]: I1002 13:05:28.926980 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-ceph" (OuterVolumeSpecName: "ceph") pod "4e487036-bfdb-4c21-9e4a-7abec8f180d7" (UID: "4e487036-bfdb-4c21-9e4a-7abec8f180d7"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:05:28 crc kubenswrapper[4766]: I1002 13:05:28.927679 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e487036-bfdb-4c21-9e4a-7abec8f180d7-kube-api-access-lcw9j" (OuterVolumeSpecName: "kube-api-access-lcw9j") pod "4e487036-bfdb-4c21-9e4a-7abec8f180d7" (UID: "4e487036-bfdb-4c21-9e4a-7abec8f180d7"). InnerVolumeSpecName "kube-api-access-lcw9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:05:28 crc kubenswrapper[4766]: I1002 13:05:28.953794 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4e487036-bfdb-4c21-9e4a-7abec8f180d7" (UID: "4e487036-bfdb-4c21-9e4a-7abec8f180d7"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:05:28 crc kubenswrapper[4766]: I1002 13:05:28.954257 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "4e487036-bfdb-4c21-9e4a-7abec8f180d7" (UID: "4e487036-bfdb-4c21-9e4a-7abec8f180d7"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:05:28 crc kubenswrapper[4766]: I1002 13:05:28.961333 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-inventory" (OuterVolumeSpecName: "inventory") pod "4e487036-bfdb-4c21-9e4a-7abec8f180d7" (UID: "4e487036-bfdb-4c21-9e4a-7abec8f180d7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:05:28 crc kubenswrapper[4766]: I1002 13:05:28.962732 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "4e487036-bfdb-4c21-9e4a-7abec8f180d7" (UID: "4e487036-bfdb-4c21-9e4a-7abec8f180d7"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.022935 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.023061 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.023072 4766 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.023081 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.023092 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcw9j\" (UniqueName: \"kubernetes.io/projected/4e487036-bfdb-4c21-9e4a-7abec8f180d7-kube-api-access-lcw9j\") on node \"crc\" DevicePath \"\"" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.023106 4766 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.023116 4766 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e487036-bfdb-4c21-9e4a-7abec8f180d7-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.326097 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" event={"ID":"4e487036-bfdb-4c21-9e4a-7abec8f180d7","Type":"ContainerDied","Data":"9d424d198e40ed492d57d4d9ed77b793337e0fb7a1813e68b3da79febd3daa05"} Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.326149 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d424d198e40ed492d57d4d9ed77b793337e0fb7a1813e68b3da79febd3daa05" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.326177 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-xvfww" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.478707 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-qhxsq"] Oct 02 13:05:29 crc kubenswrapper[4766]: E1002 13:05:29.479763 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e487036-bfdb-4c21-9e4a-7abec8f180d7" containerName="neutron-metadata-openstack-openstack-cell1" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.479801 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e487036-bfdb-4c21-9e4a-7abec8f180d7" containerName="neutron-metadata-openstack-openstack-cell1" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.480336 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e487036-bfdb-4c21-9e4a-7abec8f180d7" containerName="neutron-metadata-openstack-openstack-cell1" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.482033 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.485284 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.485570 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.485860 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.486205 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.486554 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.491199 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-qhxsq"] Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.639577 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.639773 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-inventory\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.640220 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdb27\" (UniqueName: \"kubernetes.io/projected/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-kube-api-access-kdb27\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.640355 4766 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.640461 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-ssh-key\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.640655 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-ceph\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.743111 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdb27\" (UniqueName: \"kubernetes.io/projected/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-kube-api-access-kdb27\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.743328 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.743380 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-ssh-key\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.743635 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-ceph\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.743669 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.743986 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-inventory\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: 
\"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.748889 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-inventory\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.748950 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.749764 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-ceph\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.749958 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.755452 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-ssh-key\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.763146 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdb27\" (UniqueName: \"kubernetes.io/projected/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-kube-api-access-kdb27\") pod \"libvirt-openstack-openstack-cell1-qhxsq\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:29 crc kubenswrapper[4766]: I1002 13:05:29.821873 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:05:30 crc kubenswrapper[4766]: I1002 13:05:30.373477 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-qhxsq"] Oct 02 13:05:31 crc kubenswrapper[4766]: I1002 13:05:31.353629 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" event={"ID":"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22","Type":"ContainerStarted","Data":"89cc69b4c7b152b6262012b9a0f19a2c3bce7d4c26bce97c653bc2ead78c1eb2"} Oct 02 13:05:31 crc kubenswrapper[4766]: I1002 13:05:31.354085 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" event={"ID":"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22","Type":"ContainerStarted","Data":"a7fa9af246ef4f508ddcac6db02cb6dc53e1c55dd67a2d990eb70e748e934f6a"} Oct 02 13:05:31 crc kubenswrapper[4766]: I1002 13:05:31.386805 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" podStartSLOduration=2.168480326 podStartE2EDuration="2.386776469s" podCreationTimestamp="2025-10-02 13:05:29 +0000 UTC" firstStartedPulling="2025-10-02 13:05:30.376652562 +0000 UTC m=+8045.319523526" lastFinishedPulling="2025-10-02 13:05:30.594948715 +0000 UTC m=+8045.537819669" observedRunningTime="2025-10-02 13:05:31.376973405 +0000 UTC m=+8046.319844379" watchObservedRunningTime="2025-10-02 13:05:31.386776469 +0000 UTC m=+8046.329647453" Oct 02 13:05:36 crc kubenswrapper[4766]: I1002 13:05:36.882677 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:05:37 crc kubenswrapper[4766]: I1002 13:05:37.442397 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"74c2163ca1fa7ad09b7ba7a3b4eb6190f280a4c4e20f090ee0f37ef7b01fce4d"} Oct 02 13:07:24 crc kubenswrapper[4766]: I1002 13:07:24.676083 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8vg9j"] Oct 02 13:07:24 crc kubenswrapper[4766]: I1002 13:07:24.682105 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:24 crc kubenswrapper[4766]: I1002 13:07:24.690256 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8vg9j"] Oct 02 13:07:24 crc kubenswrapper[4766]: I1002 13:07:24.800984 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a31e8aa-eb2f-467c-a985-a719a3040921-catalog-content\") pod \"certified-operators-8vg9j\" (UID: \"3a31e8aa-eb2f-467c-a985-a719a3040921\") " pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:24 crc kubenswrapper[4766]: I1002 13:07:24.801038 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4w4q\" (UniqueName: \"kubernetes.io/projected/3a31e8aa-eb2f-467c-a985-a719a3040921-kube-api-access-f4w4q\") pod \"certified-operators-8vg9j\" (UID: \"3a31e8aa-eb2f-467c-a985-a719a3040921\") " pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:24 crc kubenswrapper[4766]: I1002 13:07:24.801075 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a31e8aa-eb2f-467c-a985-a719a3040921-utilities\") pod \"certified-operators-8vg9j\" (UID: \"3a31e8aa-eb2f-467c-a985-a719a3040921\") " pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:24 crc kubenswrapper[4766]: I1002 13:07:24.902768 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a31e8aa-eb2f-467c-a985-a719a3040921-catalog-content\") pod \"certified-operators-8vg9j\" (UID: \"3a31e8aa-eb2f-467c-a985-a719a3040921\") " pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:24 crc kubenswrapper[4766]: I1002 13:07:24.902830 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4w4q\" (UniqueName: \"kubernetes.io/projected/3a31e8aa-eb2f-467c-a985-a719a3040921-kube-api-access-f4w4q\") pod \"certified-operators-8vg9j\" (UID: \"3a31e8aa-eb2f-467c-a985-a719a3040921\") " pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:24 crc kubenswrapper[4766]: I1002 13:07:24.902879 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a31e8aa-eb2f-467c-a985-a719a3040921-utilities\") pod \"certified-operators-8vg9j\" (UID: \"3a31e8aa-eb2f-467c-a985-a719a3040921\") " pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:24 crc kubenswrapper[4766]: I1002 13:07:24.903305 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a31e8aa-eb2f-467c-a985-a719a3040921-catalog-content\") pod \"certified-operators-8vg9j\" (UID: \"3a31e8aa-eb2f-467c-a985-a719a3040921\") " pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:24 crc kubenswrapper[4766]: I1002 13:07:24.903376 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a31e8aa-eb2f-467c-a985-a719a3040921-utilities\") pod \"certified-operators-8vg9j\" (UID: \"3a31e8aa-eb2f-467c-a985-a719a3040921\") " pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:24 crc kubenswrapper[4766]: I1002 13:07:24.923199 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-f4w4q\" (UniqueName: \"kubernetes.io/projected/3a31e8aa-eb2f-467c-a985-a719a3040921-kube-api-access-f4w4q\") pod \"certified-operators-8vg9j\" (UID: \"3a31e8aa-eb2f-467c-a985-a719a3040921\") " pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:25 crc kubenswrapper[4766]: I1002 13:07:25.006019 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:25 crc kubenswrapper[4766]: I1002 13:07:25.499931 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8vg9j"] Oct 02 13:07:25 crc kubenswrapper[4766]: I1002 13:07:25.693799 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vg9j" event={"ID":"3a31e8aa-eb2f-467c-a985-a719a3040921","Type":"ContainerStarted","Data":"6a55b0177479eb21bd8d603d6b37c0288eb49c10b1461e024635d7ebe9b9f3a0"} Oct 02 13:07:26 crc kubenswrapper[4766]: I1002 13:07:26.720101 4766 generic.go:334] "Generic (PLEG): container finished" podID="3a31e8aa-eb2f-467c-a985-a719a3040921" containerID="f76cf80a0c646b5f17b5ecbf58d90f32bbf4039fef95cd23072bbf4401255beb" exitCode=0 Oct 02 13:07:26 crc kubenswrapper[4766]: I1002 13:07:26.720186 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vg9j" event={"ID":"3a31e8aa-eb2f-467c-a985-a719a3040921","Type":"ContainerDied","Data":"f76cf80a0c646b5f17b5ecbf58d90f32bbf4039fef95cd23072bbf4401255beb"} Oct 02 13:07:27 crc kubenswrapper[4766]: I1002 13:07:27.067982 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r949w"] Oct 02 13:07:27 crc kubenswrapper[4766]: I1002 13:07:27.073527 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:27 crc kubenswrapper[4766]: I1002 13:07:27.079174 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r949w"] Oct 02 13:07:27 crc kubenswrapper[4766]: I1002 13:07:27.148052 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/572cfce3-4886-4695-b545-f21a3fe74c07-catalog-content\") pod \"community-operators-r949w\" (UID: \"572cfce3-4886-4695-b545-f21a3fe74c07\") " pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:27 crc kubenswrapper[4766]: I1002 13:07:27.148146 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5k55\" (UniqueName: \"kubernetes.io/projected/572cfce3-4886-4695-b545-f21a3fe74c07-kube-api-access-n5k55\") pod \"community-operators-r949w\" (UID: \"572cfce3-4886-4695-b545-f21a3fe74c07\") " pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:27 crc kubenswrapper[4766]: I1002 13:07:27.148189 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/572cfce3-4886-4695-b545-f21a3fe74c07-utilities\") pod \"community-operators-r949w\" (UID: \"572cfce3-4886-4695-b545-f21a3fe74c07\") " pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:27 crc kubenswrapper[4766]: I1002 13:07:27.250197 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/572cfce3-4886-4695-b545-f21a3fe74c07-catalog-content\") pod \"community-operators-r949w\" (UID: \"572cfce3-4886-4695-b545-f21a3fe74c07\") " pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:27 crc kubenswrapper[4766]: I1002 13:07:27.250637 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5k55\" (UniqueName: \"kubernetes.io/projected/572cfce3-4886-4695-b545-f21a3fe74c07-kube-api-access-n5k55\") pod \"community-operators-r949w\" (UID: \"572cfce3-4886-4695-b545-f21a3fe74c07\") " pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:27 crc kubenswrapper[4766]: I1002 13:07:27.250679 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/572cfce3-4886-4695-b545-f21a3fe74c07-utilities\") pod \"community-operators-r949w\" (UID: \"572cfce3-4886-4695-b545-f21a3fe74c07\") " pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:27 crc kubenswrapper[4766]: I1002 13:07:27.250829 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/572cfce3-4886-4695-b545-f21a3fe74c07-catalog-content\") pod \"community-operators-r949w\" (UID: \"572cfce3-4886-4695-b545-f21a3fe74c07\") " pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:27 crc kubenswrapper[4766]: I1002 13:07:27.251221 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/572cfce3-4886-4695-b545-f21a3fe74c07-utilities\") pod \"community-operators-r949w\" (UID: \"572cfce3-4886-4695-b545-f21a3fe74c07\") " pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:27 crc kubenswrapper[4766]: I1002 13:07:27.275374 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-n5k55\" (UniqueName: \"kubernetes.io/projected/572cfce3-4886-4695-b545-f21a3fe74c07-kube-api-access-n5k55\") pod \"community-operators-r949w\" (UID: \"572cfce3-4886-4695-b545-f21a3fe74c07\") " pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:27 crc kubenswrapper[4766]: I1002 13:07:27.407475 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:27 crc kubenswrapper[4766]: I1002 13:07:27.980355 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r949w"] Oct 02 13:07:28 crc kubenswrapper[4766]: I1002 13:07:28.739273 4766 generic.go:334] "Generic (PLEG): container finished" podID="572cfce3-4886-4695-b545-f21a3fe74c07" containerID="a40a9d2e481fe780729f819b501a0792112d110cf6769a696197a6a5cad09b50" exitCode=0 Oct 02 13:07:28 crc kubenswrapper[4766]: I1002 13:07:28.739382 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r949w" event={"ID":"572cfce3-4886-4695-b545-f21a3fe74c07","Type":"ContainerDied","Data":"a40a9d2e481fe780729f819b501a0792112d110cf6769a696197a6a5cad09b50"} Oct 02 13:07:28 crc kubenswrapper[4766]: I1002 13:07:28.739687 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r949w" event={"ID":"572cfce3-4886-4695-b545-f21a3fe74c07","Type":"ContainerStarted","Data":"bf794f4bf161971d50056399ee0079b9cc4251972977b2938ad714d859de5a81"} Oct 02 13:07:28 crc kubenswrapper[4766]: I1002 13:07:28.742540 4766 generic.go:334] "Generic (PLEG): container finished" podID="3a31e8aa-eb2f-467c-a985-a719a3040921" containerID="2f6b7f7c8437922f9ec40df0f6a2042259edd45a6f1c23aa18e29b9f567108af" exitCode=0 Oct 02 13:07:28 crc kubenswrapper[4766]: I1002 13:07:28.742615 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vg9j" event={"ID":"3a31e8aa-eb2f-467c-a985-a719a3040921","Type":"ContainerDied","Data":"2f6b7f7c8437922f9ec40df0f6a2042259edd45a6f1c23aa18e29b9f567108af"} Oct 02 13:07:29 crc kubenswrapper[4766]: I1002 13:07:29.756029 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r949w" event={"ID":"572cfce3-4886-4695-b545-f21a3fe74c07","Type":"ContainerStarted","Data":"115de909e80f8fdebf619143847067bc62734ff362385fff38fd2836a1fba5e3"} Oct 02 13:07:29 crc kubenswrapper[4766]: I1002 13:07:29.758884 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vg9j" event={"ID":"3a31e8aa-eb2f-467c-a985-a719a3040921","Type":"ContainerStarted","Data":"0ca4de0e3d4fc7994ffb24658c7deff201c86736e5115844d8ce4f3d8a3c767e"} Oct 02 13:07:29 crc kubenswrapper[4766]: I1002 13:07:29.800811 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8vg9j" podStartSLOduration=3.316240168 podStartE2EDuration="5.800786792s" podCreationTimestamp="2025-10-02 13:07:24 +0000 UTC" firstStartedPulling="2025-10-02 13:07:26.723624402 +0000 UTC m=+8161.666495356" lastFinishedPulling="2025-10-02 13:07:29.208171036 +0000 UTC m=+8164.151041980" observedRunningTime="2025-10-02 13:07:29.796565068 +0000 UTC m=+8164.739436012" watchObservedRunningTime="2025-10-02 13:07:29.800786792 +0000 UTC m=+8164.743657736" Oct 02 13:07:30 crc kubenswrapper[4766]: I1002 13:07:30.783053 4766 generic.go:334] "Generic (PLEG): container finished" 
podID="572cfce3-4886-4695-b545-f21a3fe74c07" containerID="115de909e80f8fdebf619143847067bc62734ff362385fff38fd2836a1fba5e3" exitCode=0 Oct 02 13:07:30 crc kubenswrapper[4766]: I1002 13:07:30.783109 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r949w" event={"ID":"572cfce3-4886-4695-b545-f21a3fe74c07","Type":"ContainerDied","Data":"115de909e80f8fdebf619143847067bc62734ff362385fff38fd2836a1fba5e3"} Oct 02 13:07:31 crc kubenswrapper[4766]: E1002 13:07:31.033950 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod572cfce3_4886_4695_b545_f21a3fe74c07.slice/crio-conmon-115de909e80f8fdebf619143847067bc62734ff362385fff38fd2836a1fba5e3.scope\": RecentStats: unable to find data in memory cache]" Oct 02 13:07:31 crc kubenswrapper[4766]: I1002 13:07:31.799866 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r949w" event={"ID":"572cfce3-4886-4695-b545-f21a3fe74c07","Type":"ContainerStarted","Data":"25f6c06c63597a110a0380b3e3b1910b192b104ef64cdc0c77a55ed5045db4c7"} Oct 02 13:07:31 crc kubenswrapper[4766]: I1002 13:07:31.829383 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r949w" podStartSLOduration=2.334936717 podStartE2EDuration="4.829355007s" podCreationTimestamp="2025-10-02 13:07:27 +0000 UTC" firstStartedPulling="2025-10-02 13:07:28.742234358 +0000 UTC m=+8163.685105302" lastFinishedPulling="2025-10-02 13:07:31.236652638 +0000 UTC m=+8166.179523592" observedRunningTime="2025-10-02 13:07:31.821682402 +0000 UTC m=+8166.764553356" watchObservedRunningTime="2025-10-02 13:07:31.829355007 +0000 UTC m=+8166.772225971" Oct 02 13:07:35 crc kubenswrapper[4766]: I1002 13:07:35.006912 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:35 crc kubenswrapper[4766]: I1002 13:07:35.007487 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:35 crc kubenswrapper[4766]: I1002 13:07:35.093406 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:35 crc kubenswrapper[4766]: I1002 13:07:35.910490 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:36 crc kubenswrapper[4766]: I1002 13:07:36.273725 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8vg9j"] Oct 02 13:07:37 crc kubenswrapper[4766]: I1002 13:07:37.409144 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:37 crc kubenswrapper[4766]: I1002 13:07:37.409211 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:37 crc kubenswrapper[4766]: I1002 13:07:37.483810 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:37 crc kubenswrapper[4766]: I1002 13:07:37.871357 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8vg9j" 
podUID="3a31e8aa-eb2f-467c-a985-a719a3040921" containerName="registry-server" containerID="cri-o://0ca4de0e3d4fc7994ffb24658c7deff201c86736e5115844d8ce4f3d8a3c767e" gracePeriod=2 Oct 02 13:07:37 crc kubenswrapper[4766]: I1002 13:07:37.939844 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.452172 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.642359 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a31e8aa-eb2f-467c-a985-a719a3040921-utilities\") pod \"3a31e8aa-eb2f-467c-a985-a719a3040921\" (UID: \"3a31e8aa-eb2f-467c-a985-a719a3040921\") " Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.642442 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a31e8aa-eb2f-467c-a985-a719a3040921-catalog-content\") pod \"3a31e8aa-eb2f-467c-a985-a719a3040921\" (UID: \"3a31e8aa-eb2f-467c-a985-a719a3040921\") " Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.642628 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4w4q\" (UniqueName: \"kubernetes.io/projected/3a31e8aa-eb2f-467c-a985-a719a3040921-kube-api-access-f4w4q\") pod \"3a31e8aa-eb2f-467c-a985-a719a3040921\" (UID: \"3a31e8aa-eb2f-467c-a985-a719a3040921\") " Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.645643 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a31e8aa-eb2f-467c-a985-a719a3040921-utilities" (OuterVolumeSpecName: "utilities") pod "3a31e8aa-eb2f-467c-a985-a719a3040921" (UID: "3a31e8aa-eb2f-467c-a985-a719a3040921"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.651761 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a31e8aa-eb2f-467c-a985-a719a3040921-kube-api-access-f4w4q" (OuterVolumeSpecName: "kube-api-access-f4w4q") pod "3a31e8aa-eb2f-467c-a985-a719a3040921" (UID: "3a31e8aa-eb2f-467c-a985-a719a3040921"). InnerVolumeSpecName "kube-api-access-f4w4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.661917 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r949w"] Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.739290 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a31e8aa-eb2f-467c-a985-a719a3040921-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3a31e8aa-eb2f-467c-a985-a719a3040921" (UID: "3a31e8aa-eb2f-467c-a985-a719a3040921"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.750143 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3a31e8aa-eb2f-467c-a985-a719a3040921-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.750300 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3a31e8aa-eb2f-467c-a985-a719a3040921-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.750615 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4w4q\" (UniqueName: \"kubernetes.io/projected/3a31e8aa-eb2f-467c-a985-a719a3040921-kube-api-access-f4w4q\") on node \"crc\" DevicePath \"\"" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.884490 4766 generic.go:334] "Generic (PLEG): container finished" podID="3a31e8aa-eb2f-467c-a985-a719a3040921" containerID="0ca4de0e3d4fc7994ffb24658c7deff201c86736e5115844d8ce4f3d8a3c767e" exitCode=0 Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.885280 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8vg9j" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.892517 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vg9j" event={"ID":"3a31e8aa-eb2f-467c-a985-a719a3040921","Type":"ContainerDied","Data":"0ca4de0e3d4fc7994ffb24658c7deff201c86736e5115844d8ce4f3d8a3c767e"} Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.892574 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vg9j" event={"ID":"3a31e8aa-eb2f-467c-a985-a719a3040921","Type":"ContainerDied","Data":"6a55b0177479eb21bd8d603d6b37c0288eb49c10b1461e024635d7ebe9b9f3a0"} Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.892628 4766 scope.go:117] "RemoveContainer" containerID="0ca4de0e3d4fc7994ffb24658c7deff201c86736e5115844d8ce4f3d8a3c767e" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.922868 4766 scope.go:117] "RemoveContainer" containerID="2f6b7f7c8437922f9ec40df0f6a2042259edd45a6f1c23aa18e29b9f567108af" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.937264 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8vg9j"] Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.951556 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8vg9j"] Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.958213 4766 scope.go:117] "RemoveContainer" containerID="f76cf80a0c646b5f17b5ecbf58d90f32bbf4039fef95cd23072bbf4401255beb" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.995726 4766 scope.go:117] "RemoveContainer" containerID="0ca4de0e3d4fc7994ffb24658c7deff201c86736e5115844d8ce4f3d8a3c767e" Oct 02 13:07:38 crc kubenswrapper[4766]: E1002 13:07:38.996266 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ca4de0e3d4fc7994ffb24658c7deff201c86736e5115844d8ce4f3d8a3c767e\": container with ID starting with 0ca4de0e3d4fc7994ffb24658c7deff201c86736e5115844d8ce4f3d8a3c767e not found: ID does not exist" containerID="0ca4de0e3d4fc7994ffb24658c7deff201c86736e5115844d8ce4f3d8a3c767e" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.996307 
4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca4de0e3d4fc7994ffb24658c7deff201c86736e5115844d8ce4f3d8a3c767e"} err="failed to get container status \"0ca4de0e3d4fc7994ffb24658c7deff201c86736e5115844d8ce4f3d8a3c767e\": rpc error: code = NotFound desc = could not find container \"0ca4de0e3d4fc7994ffb24658c7deff201c86736e5115844d8ce4f3d8a3c767e\": container with ID starting with 0ca4de0e3d4fc7994ffb24658c7deff201c86736e5115844d8ce4f3d8a3c767e not found: ID does not exist" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.996335 4766 scope.go:117] "RemoveContainer" containerID="2f6b7f7c8437922f9ec40df0f6a2042259edd45a6f1c23aa18e29b9f567108af" Oct 02 13:07:38 crc kubenswrapper[4766]: E1002 13:07:38.996746 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f6b7f7c8437922f9ec40df0f6a2042259edd45a6f1c23aa18e29b9f567108af\": container with ID starting with 2f6b7f7c8437922f9ec40df0f6a2042259edd45a6f1c23aa18e29b9f567108af not found: ID does not exist" containerID="2f6b7f7c8437922f9ec40df0f6a2042259edd45a6f1c23aa18e29b9f567108af" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.996780 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f6b7f7c8437922f9ec40df0f6a2042259edd45a6f1c23aa18e29b9f567108af"} err="failed to get container status \"2f6b7f7c8437922f9ec40df0f6a2042259edd45a6f1c23aa18e29b9f567108af\": rpc error: code = NotFound desc = could not find container \"2f6b7f7c8437922f9ec40df0f6a2042259edd45a6f1c23aa18e29b9f567108af\": container with ID starting with 2f6b7f7c8437922f9ec40df0f6a2042259edd45a6f1c23aa18e29b9f567108af not found: ID does not exist" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.996798 4766 scope.go:117] "RemoveContainer" containerID="f76cf80a0c646b5f17b5ecbf58d90f32bbf4039fef95cd23072bbf4401255beb" Oct 02 13:07:38 crc kubenswrapper[4766]: E1002 13:07:38.997028 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f76cf80a0c646b5f17b5ecbf58d90f32bbf4039fef95cd23072bbf4401255beb\": container with ID starting with f76cf80a0c646b5f17b5ecbf58d90f32bbf4039fef95cd23072bbf4401255beb not found: ID does not exist" containerID="f76cf80a0c646b5f17b5ecbf58d90f32bbf4039fef95cd23072bbf4401255beb" Oct 02 13:07:38 crc kubenswrapper[4766]: I1002 13:07:38.997054 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f76cf80a0c646b5f17b5ecbf58d90f32bbf4039fef95cd23072bbf4401255beb"} err="failed to get container status \"f76cf80a0c646b5f17b5ecbf58d90f32bbf4039fef95cd23072bbf4401255beb\": rpc error: code = NotFound desc = could not find container \"f76cf80a0c646b5f17b5ecbf58d90f32bbf4039fef95cd23072bbf4401255beb\": container with ID starting with f76cf80a0c646b5f17b5ecbf58d90f32bbf4039fef95cd23072bbf4401255beb not found: ID does not exist" Oct 02 13:07:39 crc kubenswrapper[4766]: I1002 13:07:39.893949 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a31e8aa-eb2f-467c-a985-a719a3040921" path="/var/lib/kubelet/pods/3a31e8aa-eb2f-467c-a985-a719a3040921/volumes" Oct 02 13:07:39 crc kubenswrapper[4766]: I1002 13:07:39.894974 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r949w" podUID="572cfce3-4886-4695-b545-f21a3fe74c07" containerName="registry-server" 
containerID="cri-o://25f6c06c63597a110a0380b3e3b1910b192b104ef64cdc0c77a55ed5045db4c7" gracePeriod=2 Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.413174 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.489694 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5k55\" (UniqueName: \"kubernetes.io/projected/572cfce3-4886-4695-b545-f21a3fe74c07-kube-api-access-n5k55\") pod \"572cfce3-4886-4695-b545-f21a3fe74c07\" (UID: \"572cfce3-4886-4695-b545-f21a3fe74c07\") " Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.489840 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/572cfce3-4886-4695-b545-f21a3fe74c07-utilities\") pod \"572cfce3-4886-4695-b545-f21a3fe74c07\" (UID: \"572cfce3-4886-4695-b545-f21a3fe74c07\") " Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.490237 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/572cfce3-4886-4695-b545-f21a3fe74c07-catalog-content\") pod \"572cfce3-4886-4695-b545-f21a3fe74c07\" (UID: \"572cfce3-4886-4695-b545-f21a3fe74c07\") " Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.490985 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/572cfce3-4886-4695-b545-f21a3fe74c07-utilities" (OuterVolumeSpecName: "utilities") pod "572cfce3-4886-4695-b545-f21a3fe74c07" (UID: "572cfce3-4886-4695-b545-f21a3fe74c07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.491347 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/572cfce3-4886-4695-b545-f21a3fe74c07-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.498436 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/572cfce3-4886-4695-b545-f21a3fe74c07-kube-api-access-n5k55" (OuterVolumeSpecName: "kube-api-access-n5k55") pod "572cfce3-4886-4695-b545-f21a3fe74c07" (UID: "572cfce3-4886-4695-b545-f21a3fe74c07"). InnerVolumeSpecName "kube-api-access-n5k55". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.537688 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/572cfce3-4886-4695-b545-f21a3fe74c07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "572cfce3-4886-4695-b545-f21a3fe74c07" (UID: "572cfce3-4886-4695-b545-f21a3fe74c07"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.593408 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/572cfce3-4886-4695-b545-f21a3fe74c07-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.593448 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5k55\" (UniqueName: \"kubernetes.io/projected/572cfce3-4886-4695-b545-f21a3fe74c07-kube-api-access-n5k55\") on node \"crc\" DevicePath \"\"" Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.915138 4766 generic.go:334] "Generic (PLEG): container finished" podID="572cfce3-4886-4695-b545-f21a3fe74c07" containerID="25f6c06c63597a110a0380b3e3b1910b192b104ef64cdc0c77a55ed5045db4c7" exitCode=0 Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.915197 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r949w" Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.915225 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r949w" event={"ID":"572cfce3-4886-4695-b545-f21a3fe74c07","Type":"ContainerDied","Data":"25f6c06c63597a110a0380b3e3b1910b192b104ef64cdc0c77a55ed5045db4c7"} Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.915675 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r949w" event={"ID":"572cfce3-4886-4695-b545-f21a3fe74c07","Type":"ContainerDied","Data":"bf794f4bf161971d50056399ee0079b9cc4251972977b2938ad714d859de5a81"} Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.915700 4766 scope.go:117] "RemoveContainer" containerID="25f6c06c63597a110a0380b3e3b1910b192b104ef64cdc0c77a55ed5045db4c7" Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.948757 4766 scope.go:117] "RemoveContainer" containerID="115de909e80f8fdebf619143847067bc62734ff362385fff38fd2836a1fba5e3" Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.957154 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r949w"] Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.967046 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r949w"] Oct 02 13:07:40 crc kubenswrapper[4766]: I1002 13:07:40.974403 4766 scope.go:117] "RemoveContainer" containerID="a40a9d2e481fe780729f819b501a0792112d110cf6769a696197a6a5cad09b50" Oct 02 13:07:41 crc kubenswrapper[4766]: I1002 13:07:41.023630 4766 scope.go:117] "RemoveContainer" containerID="25f6c06c63597a110a0380b3e3b1910b192b104ef64cdc0c77a55ed5045db4c7" Oct 02 13:07:41 crc kubenswrapper[4766]: E1002 13:07:41.024183 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f6c06c63597a110a0380b3e3b1910b192b104ef64cdc0c77a55ed5045db4c7\": container with ID starting with 25f6c06c63597a110a0380b3e3b1910b192b104ef64cdc0c77a55ed5045db4c7 not found: ID does not exist" containerID="25f6c06c63597a110a0380b3e3b1910b192b104ef64cdc0c77a55ed5045db4c7" Oct 02 13:07:41 crc kubenswrapper[4766]: I1002 13:07:41.024217 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f6c06c63597a110a0380b3e3b1910b192b104ef64cdc0c77a55ed5045db4c7"} err="failed to get container status 
\"25f6c06c63597a110a0380b3e3b1910b192b104ef64cdc0c77a55ed5045db4c7\": rpc error: code = NotFound desc = could not find container \"25f6c06c63597a110a0380b3e3b1910b192b104ef64cdc0c77a55ed5045db4c7\": container with ID starting with 25f6c06c63597a110a0380b3e3b1910b192b104ef64cdc0c77a55ed5045db4c7 not found: ID does not exist" Oct 02 13:07:41 crc kubenswrapper[4766]: I1002 13:07:41.024240 4766 scope.go:117] "RemoveContainer" containerID="115de909e80f8fdebf619143847067bc62734ff362385fff38fd2836a1fba5e3" Oct 02 13:07:41 crc kubenswrapper[4766]: E1002 13:07:41.025126 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"115de909e80f8fdebf619143847067bc62734ff362385fff38fd2836a1fba5e3\": container with ID starting with 115de909e80f8fdebf619143847067bc62734ff362385fff38fd2836a1fba5e3 not found: ID does not exist" containerID="115de909e80f8fdebf619143847067bc62734ff362385fff38fd2836a1fba5e3" Oct 02 13:07:41 crc kubenswrapper[4766]: I1002 13:07:41.025175 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"115de909e80f8fdebf619143847067bc62734ff362385fff38fd2836a1fba5e3"} err="failed to get container status \"115de909e80f8fdebf619143847067bc62734ff362385fff38fd2836a1fba5e3\": rpc error: code = NotFound desc = could not find container \"115de909e80f8fdebf619143847067bc62734ff362385fff38fd2836a1fba5e3\": container with ID starting with 115de909e80f8fdebf619143847067bc62734ff362385fff38fd2836a1fba5e3 not found: ID does not exist" Oct 02 13:07:41 crc kubenswrapper[4766]: I1002 13:07:41.025209 4766 scope.go:117] "RemoveContainer" containerID="a40a9d2e481fe780729f819b501a0792112d110cf6769a696197a6a5cad09b50" Oct 02 13:07:41 crc kubenswrapper[4766]: E1002 13:07:41.025607 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a40a9d2e481fe780729f819b501a0792112d110cf6769a696197a6a5cad09b50\": container with ID starting with a40a9d2e481fe780729f819b501a0792112d110cf6769a696197a6a5cad09b50 not found: ID does not exist" containerID="a40a9d2e481fe780729f819b501a0792112d110cf6769a696197a6a5cad09b50" Oct 02 13:07:41 crc kubenswrapper[4766]: I1002 13:07:41.025652 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a40a9d2e481fe780729f819b501a0792112d110cf6769a696197a6a5cad09b50"} err="failed to get container status \"a40a9d2e481fe780729f819b501a0792112d110cf6769a696197a6a5cad09b50\": rpc error: code = NotFound desc = could not find container \"a40a9d2e481fe780729f819b501a0792112d110cf6769a696197a6a5cad09b50\": container with ID starting with a40a9d2e481fe780729f819b501a0792112d110cf6769a696197a6a5cad09b50 not found: ID does not exist" Oct 02 13:07:41 crc kubenswrapper[4766]: I1002 13:07:41.901347 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="572cfce3-4886-4695-b545-f21a3fe74c07" path="/var/lib/kubelet/pods/572cfce3-4886-4695-b545-f21a3fe74c07/volumes" Oct 02 13:07:54 crc kubenswrapper[4766]: I1002 13:07:54.431960 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:07:54 crc kubenswrapper[4766]: I1002 13:07:54.432598 4766 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:08:24 crc kubenswrapper[4766]: I1002 13:08:24.431848 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:08:24 crc kubenswrapper[4766]: I1002 13:08:24.433318 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:08:54 crc kubenswrapper[4766]: I1002 13:08:54.432564 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:08:54 crc kubenswrapper[4766]: I1002 13:08:54.433177 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:08:54 crc kubenswrapper[4766]: I1002 13:08:54.433240 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 13:08:54 crc kubenswrapper[4766]: I1002 13:08:54.434255 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"74c2163ca1fa7ad09b7ba7a3b4eb6190f280a4c4e20f090ee0f37ef7b01fce4d"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:08:54 crc kubenswrapper[4766]: I1002 13:08:54.434359 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://74c2163ca1fa7ad09b7ba7a3b4eb6190f280a4c4e20f090ee0f37ef7b01fce4d" gracePeriod=600 Oct 02 13:08:54 crc kubenswrapper[4766]: I1002 13:08:54.822056 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="74c2163ca1fa7ad09b7ba7a3b4eb6190f280a4c4e20f090ee0f37ef7b01fce4d" exitCode=0 Oct 02 13:08:54 crc kubenswrapper[4766]: I1002 13:08:54.822104 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"74c2163ca1fa7ad09b7ba7a3b4eb6190f280a4c4e20f090ee0f37ef7b01fce4d"} Oct 02 13:08:54 crc kubenswrapper[4766]: I1002 13:08:54.822152 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8"} Oct 02 13:08:54 crc kubenswrapper[4766]: I1002 13:08:54.822186 4766 scope.go:117] "RemoveContainer" containerID="e6d354695c4439e1c6510d3f550913d05f82a6b3287374b9d182c3813c1be6bf" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.497984 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ml5qm"] Oct 02 13:09:33 crc kubenswrapper[4766]: E1002 13:09:33.498988 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="572cfce3-4886-4695-b545-f21a3fe74c07" containerName="registry-server" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.499001 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="572cfce3-4886-4695-b545-f21a3fe74c07" containerName="registry-server" Oct 02 13:09:33 crc kubenswrapper[4766]: E1002 13:09:33.499020 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a31e8aa-eb2f-467c-a985-a719a3040921" containerName="extract-utilities" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.499025 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a31e8aa-eb2f-467c-a985-a719a3040921" containerName="extract-utilities" Oct 02 13:09:33 crc kubenswrapper[4766]: E1002 13:09:33.499040 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a31e8aa-eb2f-467c-a985-a719a3040921" containerName="registry-server" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.499047 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a31e8aa-eb2f-467c-a985-a719a3040921" containerName="registry-server" Oct 02 13:09:33 crc kubenswrapper[4766]: E1002 13:09:33.499061 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="572cfce3-4886-4695-b545-f21a3fe74c07" containerName="extract-utilities" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.499066 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="572cfce3-4886-4695-b545-f21a3fe74c07" containerName="extract-utilities" Oct 02 13:09:33 crc kubenswrapper[4766]: E1002 13:09:33.499084 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a31e8aa-eb2f-467c-a985-a719a3040921" containerName="extract-content" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.499090 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a31e8aa-eb2f-467c-a985-a719a3040921" containerName="extract-content" Oct 02 13:09:33 crc kubenswrapper[4766]: E1002 13:09:33.499104 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="572cfce3-4886-4695-b545-f21a3fe74c07" containerName="extract-content" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.499110 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="572cfce3-4886-4695-b545-f21a3fe74c07" containerName="extract-content" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.499385 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a31e8aa-eb2f-467c-a985-a719a3040921" containerName="registry-server" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.499412 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="572cfce3-4886-4695-b545-f21a3fe74c07" containerName="registry-server" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.501330 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.507559 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml5qm"] Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.630035 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d5xp\" (UniqueName: \"kubernetes.io/projected/2b404470-9319-4df0-9b42-8b547eb823cd-kube-api-access-4d5xp\") pod \"redhat-marketplace-ml5qm\" (UID: \"2b404470-9319-4df0-9b42-8b547eb823cd\") " pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.631542 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b404470-9319-4df0-9b42-8b547eb823cd-catalog-content\") pod \"redhat-marketplace-ml5qm\" (UID: \"2b404470-9319-4df0-9b42-8b547eb823cd\") " pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.631665 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b404470-9319-4df0-9b42-8b547eb823cd-utilities\") pod \"redhat-marketplace-ml5qm\" (UID: \"2b404470-9319-4df0-9b42-8b547eb823cd\") " pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.733613 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b404470-9319-4df0-9b42-8b547eb823cd-catalog-content\") pod \"redhat-marketplace-ml5qm\" (UID: \"2b404470-9319-4df0-9b42-8b547eb823cd\") " pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.733747 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b404470-9319-4df0-9b42-8b547eb823cd-utilities\") pod \"redhat-marketplace-ml5qm\" (UID: \"2b404470-9319-4df0-9b42-8b547eb823cd\") " pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.733779 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d5xp\" (UniqueName: \"kubernetes.io/projected/2b404470-9319-4df0-9b42-8b547eb823cd-kube-api-access-4d5xp\") pod \"redhat-marketplace-ml5qm\" (UID: \"2b404470-9319-4df0-9b42-8b547eb823cd\") " pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.734924 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b404470-9319-4df0-9b42-8b547eb823cd-catalog-content\") pod \"redhat-marketplace-ml5qm\" (UID: \"2b404470-9319-4df0-9b42-8b547eb823cd\") " pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.734958 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b404470-9319-4df0-9b42-8b547eb823cd-utilities\") pod \"redhat-marketplace-ml5qm\" (UID: \"2b404470-9319-4df0-9b42-8b547eb823cd\") " pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.756572 4766 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4d5xp\" (UniqueName: \"kubernetes.io/projected/2b404470-9319-4df0-9b42-8b547eb823cd-kube-api-access-4d5xp\") pod \"redhat-marketplace-ml5qm\" (UID: \"2b404470-9319-4df0-9b42-8b547eb823cd\") " pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:33 crc kubenswrapper[4766]: I1002 13:09:33.837702 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:34 crc kubenswrapper[4766]: I1002 13:09:34.305581 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml5qm"] Oct 02 13:09:34 crc kubenswrapper[4766]: I1002 13:09:34.403486 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml5qm" event={"ID":"2b404470-9319-4df0-9b42-8b547eb823cd","Type":"ContainerStarted","Data":"e4f013bb49c890e331a6409b28816e37ec3cb07f7b64e2c9d694e83e4afc7df6"} Oct 02 13:09:34 crc kubenswrapper[4766]: E1002 13:09:34.843111 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b404470_9319_4df0_9b42_8b547eb823cd.slice/crio-3d9adef2989468af3b550ac8654d6adae7992d8b112e34cec36a30c9d123667f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b404470_9319_4df0_9b42_8b547eb823cd.slice/crio-conmon-3d9adef2989468af3b550ac8654d6adae7992d8b112e34cec36a30c9d123667f.scope\": RecentStats: unable to find data in memory cache]" Oct 02 13:09:35 crc kubenswrapper[4766]: I1002 13:09:35.422763 4766 generic.go:334] "Generic (PLEG): container finished" podID="2b404470-9319-4df0-9b42-8b547eb823cd" containerID="3d9adef2989468af3b550ac8654d6adae7992d8b112e34cec36a30c9d123667f" exitCode=0 Oct 02 13:09:35 crc kubenswrapper[4766]: I1002 13:09:35.423146 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml5qm" event={"ID":"2b404470-9319-4df0-9b42-8b547eb823cd","Type":"ContainerDied","Data":"3d9adef2989468af3b550ac8654d6adae7992d8b112e34cec36a30c9d123667f"} Oct 02 13:09:36 crc kubenswrapper[4766]: I1002 13:09:36.439162 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml5qm" event={"ID":"2b404470-9319-4df0-9b42-8b547eb823cd","Type":"ContainerStarted","Data":"3ef8956fc645a08c5cc86e755522c1854ba586c9bed62dbd884acc0a91e4f170"} Oct 02 13:09:37 crc kubenswrapper[4766]: I1002 13:09:37.450821 4766 generic.go:334] "Generic (PLEG): container finished" podID="2b404470-9319-4df0-9b42-8b547eb823cd" containerID="3ef8956fc645a08c5cc86e755522c1854ba586c9bed62dbd884acc0a91e4f170" exitCode=0 Oct 02 13:09:37 crc kubenswrapper[4766]: I1002 13:09:37.450941 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml5qm" event={"ID":"2b404470-9319-4df0-9b42-8b547eb823cd","Type":"ContainerDied","Data":"3ef8956fc645a08c5cc86e755522c1854ba586c9bed62dbd884acc0a91e4f170"} Oct 02 13:09:38 crc kubenswrapper[4766]: I1002 13:09:38.462031 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml5qm" event={"ID":"2b404470-9319-4df0-9b42-8b547eb823cd","Type":"ContainerStarted","Data":"da2a36d1818af7ecbcdfbd569d957f9f0ae2246a54882802d485c9c55bf06eac"} Oct 02 13:09:38 crc kubenswrapper[4766]: I1002 13:09:38.495090 4766 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-marketplace-ml5qm" podStartSLOduration=3.0065137 podStartE2EDuration="5.495071042s" podCreationTimestamp="2025-10-02 13:09:33 +0000 UTC" firstStartedPulling="2025-10-02 13:09:35.427400496 +0000 UTC m=+8290.370271450" lastFinishedPulling="2025-10-02 13:09:37.915957828 +0000 UTC m=+8292.858828792" observedRunningTime="2025-10-02 13:09:38.489902697 +0000 UTC m=+8293.432773641" watchObservedRunningTime="2025-10-02 13:09:38.495071042 +0000 UTC m=+8293.437941986" Oct 02 13:09:43 crc kubenswrapper[4766]: I1002 13:09:43.837976 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:43 crc kubenswrapper[4766]: I1002 13:09:43.838669 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:43 crc kubenswrapper[4766]: I1002 13:09:43.899660 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:44 crc kubenswrapper[4766]: I1002 13:09:44.589380 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:44 crc kubenswrapper[4766]: I1002 13:09:44.636325 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml5qm"] Oct 02 13:09:46 crc kubenswrapper[4766]: I1002 13:09:46.547717 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ml5qm" podUID="2b404470-9319-4df0-9b42-8b547eb823cd" containerName="registry-server" containerID="cri-o://da2a36d1818af7ecbcdfbd569d957f9f0ae2246a54882802d485c9c55bf06eac" gracePeriod=2 Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.043191 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.137655 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d5xp\" (UniqueName: \"kubernetes.io/projected/2b404470-9319-4df0-9b42-8b547eb823cd-kube-api-access-4d5xp\") pod \"2b404470-9319-4df0-9b42-8b547eb823cd\" (UID: \"2b404470-9319-4df0-9b42-8b547eb823cd\") " Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.137933 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b404470-9319-4df0-9b42-8b547eb823cd-utilities\") pod \"2b404470-9319-4df0-9b42-8b547eb823cd\" (UID: \"2b404470-9319-4df0-9b42-8b547eb823cd\") " Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.137956 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b404470-9319-4df0-9b42-8b547eb823cd-catalog-content\") pod \"2b404470-9319-4df0-9b42-8b547eb823cd\" (UID: \"2b404470-9319-4df0-9b42-8b547eb823cd\") " Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.139028 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b404470-9319-4df0-9b42-8b547eb823cd-utilities" (OuterVolumeSpecName: "utilities") pod "2b404470-9319-4df0-9b42-8b547eb823cd" (UID: "2b404470-9319-4df0-9b42-8b547eb823cd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.145735 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b404470-9319-4df0-9b42-8b547eb823cd-kube-api-access-4d5xp" (OuterVolumeSpecName: "kube-api-access-4d5xp") pod "2b404470-9319-4df0-9b42-8b547eb823cd" (UID: "2b404470-9319-4df0-9b42-8b547eb823cd"). InnerVolumeSpecName "kube-api-access-4d5xp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.153968 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b404470-9319-4df0-9b42-8b547eb823cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b404470-9319-4df0-9b42-8b547eb823cd" (UID: "2b404470-9319-4df0-9b42-8b547eb823cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.240798 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d5xp\" (UniqueName: \"kubernetes.io/projected/2b404470-9319-4df0-9b42-8b547eb823cd-kube-api-access-4d5xp\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.240847 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b404470-9319-4df0-9b42-8b547eb823cd-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.240857 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b404470-9319-4df0-9b42-8b547eb823cd-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.560402 4766 generic.go:334] "Generic (PLEG): container finished" podID="2b404470-9319-4df0-9b42-8b547eb823cd" containerID="da2a36d1818af7ecbcdfbd569d957f9f0ae2246a54882802d485c9c55bf06eac" exitCode=0 Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.560459 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml5qm" event={"ID":"2b404470-9319-4df0-9b42-8b547eb823cd","Type":"ContainerDied","Data":"da2a36d1818af7ecbcdfbd569d957f9f0ae2246a54882802d485c9c55bf06eac"} Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.560814 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml5qm" event={"ID":"2b404470-9319-4df0-9b42-8b547eb823cd","Type":"ContainerDied","Data":"e4f013bb49c890e331a6409b28816e37ec3cb07f7b64e2c9d694e83e4afc7df6"} Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.560841 4766 scope.go:117] "RemoveContainer" containerID="da2a36d1818af7ecbcdfbd569d957f9f0ae2246a54882802d485c9c55bf06eac" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.560480 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ml5qm" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.588577 4766 scope.go:117] "RemoveContainer" containerID="3ef8956fc645a08c5cc86e755522c1854ba586c9bed62dbd884acc0a91e4f170" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.625854 4766 scope.go:117] "RemoveContainer" containerID="3d9adef2989468af3b550ac8654d6adae7992d8b112e34cec36a30c9d123667f" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.637764 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml5qm"] Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.664533 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml5qm"] Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.685728 4766 scope.go:117] "RemoveContainer" containerID="da2a36d1818af7ecbcdfbd569d957f9f0ae2246a54882802d485c9c55bf06eac" Oct 02 13:09:47 crc kubenswrapper[4766]: E1002 13:09:47.690879 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da2a36d1818af7ecbcdfbd569d957f9f0ae2246a54882802d485c9c55bf06eac\": container with ID starting with da2a36d1818af7ecbcdfbd569d957f9f0ae2246a54882802d485c9c55bf06eac not found: ID does not exist" containerID="da2a36d1818af7ecbcdfbd569d957f9f0ae2246a54882802d485c9c55bf06eac" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.690945 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da2a36d1818af7ecbcdfbd569d957f9f0ae2246a54882802d485c9c55bf06eac"} err="failed to get container status \"da2a36d1818af7ecbcdfbd569d957f9f0ae2246a54882802d485c9c55bf06eac\": rpc error: code = NotFound desc = could not find container \"da2a36d1818af7ecbcdfbd569d957f9f0ae2246a54882802d485c9c55bf06eac\": container with ID starting with da2a36d1818af7ecbcdfbd569d957f9f0ae2246a54882802d485c9c55bf06eac not found: ID does not exist" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.690980 4766 scope.go:117] "RemoveContainer" containerID="3ef8956fc645a08c5cc86e755522c1854ba586c9bed62dbd884acc0a91e4f170" Oct 02 13:09:47 crc kubenswrapper[4766]: E1002 13:09:47.691418 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ef8956fc645a08c5cc86e755522c1854ba586c9bed62dbd884acc0a91e4f170\": container with ID starting with 3ef8956fc645a08c5cc86e755522c1854ba586c9bed62dbd884acc0a91e4f170 not found: ID does not exist" containerID="3ef8956fc645a08c5cc86e755522c1854ba586c9bed62dbd884acc0a91e4f170" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.691441 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ef8956fc645a08c5cc86e755522c1854ba586c9bed62dbd884acc0a91e4f170"} err="failed to get container status \"3ef8956fc645a08c5cc86e755522c1854ba586c9bed62dbd884acc0a91e4f170\": rpc error: code = NotFound desc = could not find container \"3ef8956fc645a08c5cc86e755522c1854ba586c9bed62dbd884acc0a91e4f170\": container with ID starting with 3ef8956fc645a08c5cc86e755522c1854ba586c9bed62dbd884acc0a91e4f170 not found: ID does not exist" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.691454 4766 scope.go:117] "RemoveContainer" containerID="3d9adef2989468af3b550ac8654d6adae7992d8b112e34cec36a30c9d123667f" Oct 02 13:09:47 crc kubenswrapper[4766]: E1002 13:09:47.691653 4766 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3d9adef2989468af3b550ac8654d6adae7992d8b112e34cec36a30c9d123667f\": container with ID starting with 3d9adef2989468af3b550ac8654d6adae7992d8b112e34cec36a30c9d123667f not found: ID does not exist" containerID="3d9adef2989468af3b550ac8654d6adae7992d8b112e34cec36a30c9d123667f" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.691676 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9adef2989468af3b550ac8654d6adae7992d8b112e34cec36a30c9d123667f"} err="failed to get container status \"3d9adef2989468af3b550ac8654d6adae7992d8b112e34cec36a30c9d123667f\": rpc error: code = NotFound desc = could not find container \"3d9adef2989468af3b550ac8654d6adae7992d8b112e34cec36a30c9d123667f\": container with ID starting with 3d9adef2989468af3b550ac8654d6adae7992d8b112e34cec36a30c9d123667f not found: ID does not exist" Oct 02 13:09:47 crc kubenswrapper[4766]: I1002 13:09:47.895153 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b404470-9319-4df0-9b42-8b547eb823cd" path="/var/lib/kubelet/pods/2b404470-9319-4df0-9b42-8b547eb823cd/volumes" Oct 02 13:09:53 crc kubenswrapper[4766]: I1002 13:09:53.796750 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hm457"] Oct 02 13:09:53 crc kubenswrapper[4766]: E1002 13:09:53.798002 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b404470-9319-4df0-9b42-8b547eb823cd" containerName="registry-server" Oct 02 13:09:53 crc kubenswrapper[4766]: I1002 13:09:53.798026 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b404470-9319-4df0-9b42-8b547eb823cd" containerName="registry-server" Oct 02 13:09:53 crc kubenswrapper[4766]: E1002 13:09:53.798056 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b404470-9319-4df0-9b42-8b547eb823cd" containerName="extract-utilities" Oct 02 13:09:53 crc kubenswrapper[4766]: I1002 13:09:53.798065 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b404470-9319-4df0-9b42-8b547eb823cd" containerName="extract-utilities" Oct 02 13:09:53 crc kubenswrapper[4766]: E1002 13:09:53.798101 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b404470-9319-4df0-9b42-8b547eb823cd" containerName="extract-content" Oct 02 13:09:53 crc kubenswrapper[4766]: I1002 13:09:53.798110 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b404470-9319-4df0-9b42-8b547eb823cd" containerName="extract-content" Oct 02 13:09:53 crc kubenswrapper[4766]: I1002 13:09:53.798458 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b404470-9319-4df0-9b42-8b547eb823cd" containerName="registry-server" Oct 02 13:09:53 crc kubenswrapper[4766]: I1002 13:09:53.800586 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:09:53 crc kubenswrapper[4766]: I1002 13:09:53.823238 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hm457"] Oct 02 13:09:53 crc kubenswrapper[4766]: I1002 13:09:53.894653 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h675m\" (UniqueName: \"kubernetes.io/projected/edcd959c-f974-4f9d-acb4-4f649df7f068-kube-api-access-h675m\") pod \"redhat-operators-hm457\" (UID: \"edcd959c-f974-4f9d-acb4-4f649df7f068\") " pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:09:53 crc kubenswrapper[4766]: I1002 13:09:53.894710 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edcd959c-f974-4f9d-acb4-4f649df7f068-catalog-content\") pod \"redhat-operators-hm457\" (UID: \"edcd959c-f974-4f9d-acb4-4f649df7f068\") " pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:09:53 crc kubenswrapper[4766]: I1002 13:09:53.894737 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edcd959c-f974-4f9d-acb4-4f649df7f068-utilities\") pod \"redhat-operators-hm457\" (UID: \"edcd959c-f974-4f9d-acb4-4f649df7f068\") " pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:09:53 crc kubenswrapper[4766]: I1002 13:09:53.996890 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h675m\" (UniqueName: \"kubernetes.io/projected/edcd959c-f974-4f9d-acb4-4f649df7f068-kube-api-access-h675m\") pod \"redhat-operators-hm457\" (UID: \"edcd959c-f974-4f9d-acb4-4f649df7f068\") " pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:09:53 crc kubenswrapper[4766]: I1002 13:09:53.996955 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edcd959c-f974-4f9d-acb4-4f649df7f068-catalog-content\") pod \"redhat-operators-hm457\" (UID: \"edcd959c-f974-4f9d-acb4-4f649df7f068\") " pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:09:53 crc kubenswrapper[4766]: I1002 13:09:53.996991 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edcd959c-f974-4f9d-acb4-4f649df7f068-utilities\") pod \"redhat-operators-hm457\" (UID: \"edcd959c-f974-4f9d-acb4-4f649df7f068\") " pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:09:53 crc kubenswrapper[4766]: I1002 13:09:53.997386 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edcd959c-f974-4f9d-acb4-4f649df7f068-utilities\") pod \"redhat-operators-hm457\" (UID: \"edcd959c-f974-4f9d-acb4-4f649df7f068\") " pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:09:53 crc kubenswrapper[4766]: I1002 13:09:53.997625 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edcd959c-f974-4f9d-acb4-4f649df7f068-catalog-content\") pod \"redhat-operators-hm457\" (UID: \"edcd959c-f974-4f9d-acb4-4f649df7f068\") " pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:09:54 crc kubenswrapper[4766]: I1002 13:09:54.018267 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-h675m\" (UniqueName: \"kubernetes.io/projected/edcd959c-f974-4f9d-acb4-4f649df7f068-kube-api-access-h675m\") pod \"redhat-operators-hm457\" (UID: \"edcd959c-f974-4f9d-acb4-4f649df7f068\") " pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:09:54 crc kubenswrapper[4766]: I1002 13:09:54.120118 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:09:54 crc kubenswrapper[4766]: I1002 13:09:54.649103 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hm457"] Oct 02 13:09:55 crc kubenswrapper[4766]: I1002 13:09:55.668194 4766 generic.go:334] "Generic (PLEG): container finished" podID="edcd959c-f974-4f9d-acb4-4f649df7f068" containerID="7ff41564c18bde105fb169084b74f476804aa2d077cbdc00b77aa31597f2b4a5" exitCode=0 Oct 02 13:09:55 crc kubenswrapper[4766]: I1002 13:09:55.668283 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hm457" event={"ID":"edcd959c-f974-4f9d-acb4-4f649df7f068","Type":"ContainerDied","Data":"7ff41564c18bde105fb169084b74f476804aa2d077cbdc00b77aa31597f2b4a5"} Oct 02 13:09:55 crc kubenswrapper[4766]: I1002 13:09:55.668598 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hm457" event={"ID":"edcd959c-f974-4f9d-acb4-4f649df7f068","Type":"ContainerStarted","Data":"3c36dcb19a112d1d1d2ce1da18f1ee800005e930a62c076008de462011c00ded"} Oct 02 13:09:55 crc kubenswrapper[4766]: I1002 13:09:55.671107 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 13:09:57 crc kubenswrapper[4766]: I1002 13:09:57.698392 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hm457" event={"ID":"edcd959c-f974-4f9d-acb4-4f649df7f068","Type":"ContainerStarted","Data":"31cd5a26a6594cd3cf27c1e44021b9dcbbf78ce3dc533baf014e9451627da2b2"} Oct 02 13:09:58 crc kubenswrapper[4766]: I1002 13:09:58.730495 4766 generic.go:334] "Generic (PLEG): container finished" podID="edcd959c-f974-4f9d-acb4-4f649df7f068" containerID="31cd5a26a6594cd3cf27c1e44021b9dcbbf78ce3dc533baf014e9451627da2b2" exitCode=0 Oct 02 13:09:58 crc kubenswrapper[4766]: I1002 13:09:58.730708 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hm457" event={"ID":"edcd959c-f974-4f9d-acb4-4f649df7f068","Type":"ContainerDied","Data":"31cd5a26a6594cd3cf27c1e44021b9dcbbf78ce3dc533baf014e9451627da2b2"} Oct 02 13:09:59 crc kubenswrapper[4766]: I1002 13:09:59.742326 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hm457" event={"ID":"edcd959c-f974-4f9d-acb4-4f649df7f068","Type":"ContainerStarted","Data":"efd227372ac83c3f8f3a82e0c1fb7157112087422c8d827a8d9b35a290fa92e6"} Oct 02 13:09:59 crc kubenswrapper[4766]: I1002 13:09:59.765367 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hm457" podStartSLOduration=3.124216713 podStartE2EDuration="6.765345363s" podCreationTimestamp="2025-10-02 13:09:53 +0000 UTC" firstStartedPulling="2025-10-02 13:09:55.670900478 +0000 UTC m=+8310.613771422" lastFinishedPulling="2025-10-02 13:09:59.312029128 +0000 UTC m=+8314.254900072" observedRunningTime="2025-10-02 13:09:59.763219744 +0000 UTC m=+8314.706090738" watchObservedRunningTime="2025-10-02 13:09:59.765345363 +0000 UTC m=+8314.708216307" Oct 02 13:10:04 crc 
kubenswrapper[4766]: I1002 13:10:04.120837 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:10:04 crc kubenswrapper[4766]: I1002 13:10:04.121366 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:10:04 crc kubenswrapper[4766]: I1002 13:10:04.182391 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:10:04 crc kubenswrapper[4766]: I1002 13:10:04.838858 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:10:04 crc kubenswrapper[4766]: I1002 13:10:04.885467 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hm457"] Oct 02 13:10:06 crc kubenswrapper[4766]: I1002 13:10:06.817989 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hm457" podUID="edcd959c-f974-4f9d-acb4-4f649df7f068" containerName="registry-server" containerID="cri-o://efd227372ac83c3f8f3a82e0c1fb7157112087422c8d827a8d9b35a290fa92e6" gracePeriod=2 Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.506798 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.617631 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edcd959c-f974-4f9d-acb4-4f649df7f068-utilities\") pod \"edcd959c-f974-4f9d-acb4-4f649df7f068\" (UID: \"edcd959c-f974-4f9d-acb4-4f649df7f068\") " Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.617815 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h675m\" (UniqueName: \"kubernetes.io/projected/edcd959c-f974-4f9d-acb4-4f649df7f068-kube-api-access-h675m\") pod \"edcd959c-f974-4f9d-acb4-4f649df7f068\" (UID: \"edcd959c-f974-4f9d-acb4-4f649df7f068\") " Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.618000 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edcd959c-f974-4f9d-acb4-4f649df7f068-catalog-content\") pod \"edcd959c-f974-4f9d-acb4-4f649df7f068\" (UID: \"edcd959c-f974-4f9d-acb4-4f649df7f068\") " Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.618589 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edcd959c-f974-4f9d-acb4-4f649df7f068-utilities" (OuterVolumeSpecName: "utilities") pod "edcd959c-f974-4f9d-acb4-4f649df7f068" (UID: "edcd959c-f974-4f9d-acb4-4f649df7f068"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.627764 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edcd959c-f974-4f9d-acb4-4f649df7f068-kube-api-access-h675m" (OuterVolumeSpecName: "kube-api-access-h675m") pod "edcd959c-f974-4f9d-acb4-4f649df7f068" (UID: "edcd959c-f974-4f9d-acb4-4f649df7f068"). InnerVolumeSpecName "kube-api-access-h675m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.703101 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edcd959c-f974-4f9d-acb4-4f649df7f068-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edcd959c-f974-4f9d-acb4-4f649df7f068" (UID: "edcd959c-f974-4f9d-acb4-4f649df7f068"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.720542 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h675m\" (UniqueName: \"kubernetes.io/projected/edcd959c-f974-4f9d-acb4-4f649df7f068-kube-api-access-h675m\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.720582 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edcd959c-f974-4f9d-acb4-4f649df7f068-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.720592 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edcd959c-f974-4f9d-acb4-4f649df7f068-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.833562 4766 generic.go:334] "Generic (PLEG): container finished" podID="edcd959c-f974-4f9d-acb4-4f649df7f068" containerID="efd227372ac83c3f8f3a82e0c1fb7157112087422c8d827a8d9b35a290fa92e6" exitCode=0 Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.833620 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hm457" event={"ID":"edcd959c-f974-4f9d-acb4-4f649df7f068","Type":"ContainerDied","Data":"efd227372ac83c3f8f3a82e0c1fb7157112087422c8d827a8d9b35a290fa92e6"} Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.833647 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hm457" event={"ID":"edcd959c-f974-4f9d-acb4-4f649df7f068","Type":"ContainerDied","Data":"3c36dcb19a112d1d1d2ce1da18f1ee800005e930a62c076008de462011c00ded"} Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.833663 4766 scope.go:117] "RemoveContainer" containerID="efd227372ac83c3f8f3a82e0c1fb7157112087422c8d827a8d9b35a290fa92e6" Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.833689 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hm457" Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.859194 4766 scope.go:117] "RemoveContainer" containerID="31cd5a26a6594cd3cf27c1e44021b9dcbbf78ce3dc533baf014e9451627da2b2" Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.906728 4766 scope.go:117] "RemoveContainer" containerID="7ff41564c18bde105fb169084b74f476804aa2d077cbdc00b77aa31597f2b4a5" Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.913201 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hm457"] Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.913237 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hm457"] Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.930794 4766 scope.go:117] "RemoveContainer" containerID="efd227372ac83c3f8f3a82e0c1fb7157112087422c8d827a8d9b35a290fa92e6" Oct 02 13:10:07 crc kubenswrapper[4766]: E1002 13:10:07.931227 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efd227372ac83c3f8f3a82e0c1fb7157112087422c8d827a8d9b35a290fa92e6\": container with ID starting with efd227372ac83c3f8f3a82e0c1fb7157112087422c8d827a8d9b35a290fa92e6 not found: ID does not exist" containerID="efd227372ac83c3f8f3a82e0c1fb7157112087422c8d827a8d9b35a290fa92e6" Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.931266 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efd227372ac83c3f8f3a82e0c1fb7157112087422c8d827a8d9b35a290fa92e6"} err="failed to get container status \"efd227372ac83c3f8f3a82e0c1fb7157112087422c8d827a8d9b35a290fa92e6\": rpc error: code = NotFound desc = could not find container \"efd227372ac83c3f8f3a82e0c1fb7157112087422c8d827a8d9b35a290fa92e6\": container with ID starting with efd227372ac83c3f8f3a82e0c1fb7157112087422c8d827a8d9b35a290fa92e6 not found: ID does not exist" Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.931293 4766 scope.go:117] "RemoveContainer" containerID="31cd5a26a6594cd3cf27c1e44021b9dcbbf78ce3dc533baf014e9451627da2b2" Oct 02 13:10:07 crc kubenswrapper[4766]: E1002 13:10:07.931489 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31cd5a26a6594cd3cf27c1e44021b9dcbbf78ce3dc533baf014e9451627da2b2\": container with ID starting with 31cd5a26a6594cd3cf27c1e44021b9dcbbf78ce3dc533baf014e9451627da2b2 not found: ID does not exist" containerID="31cd5a26a6594cd3cf27c1e44021b9dcbbf78ce3dc533baf014e9451627da2b2" Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.931529 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31cd5a26a6594cd3cf27c1e44021b9dcbbf78ce3dc533baf014e9451627da2b2"} err="failed to get container status \"31cd5a26a6594cd3cf27c1e44021b9dcbbf78ce3dc533baf014e9451627da2b2\": rpc error: code = NotFound desc = could not find container \"31cd5a26a6594cd3cf27c1e44021b9dcbbf78ce3dc533baf014e9451627da2b2\": container with ID starting with 31cd5a26a6594cd3cf27c1e44021b9dcbbf78ce3dc533baf014e9451627da2b2 not found: ID does not exist" Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.931547 4766 scope.go:117] "RemoveContainer" containerID="7ff41564c18bde105fb169084b74f476804aa2d077cbdc00b77aa31597f2b4a5" Oct 02 13:10:07 crc kubenswrapper[4766]: E1002 13:10:07.931988 4766 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"7ff41564c18bde105fb169084b74f476804aa2d077cbdc00b77aa31597f2b4a5\": container with ID starting with 7ff41564c18bde105fb169084b74f476804aa2d077cbdc00b77aa31597f2b4a5 not found: ID does not exist" containerID="7ff41564c18bde105fb169084b74f476804aa2d077cbdc00b77aa31597f2b4a5" Oct 02 13:10:07 crc kubenswrapper[4766]: I1002 13:10:07.932011 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ff41564c18bde105fb169084b74f476804aa2d077cbdc00b77aa31597f2b4a5"} err="failed to get container status \"7ff41564c18bde105fb169084b74f476804aa2d077cbdc00b77aa31597f2b4a5\": rpc error: code = NotFound desc = could not find container \"7ff41564c18bde105fb169084b74f476804aa2d077cbdc00b77aa31597f2b4a5\": container with ID starting with 7ff41564c18bde105fb169084b74f476804aa2d077cbdc00b77aa31597f2b4a5 not found: ID does not exist" Oct 02 13:10:08 crc kubenswrapper[4766]: I1002 13:10:08.853240 4766 generic.go:334] "Generic (PLEG): container finished" podID="5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22" containerID="89cc69b4c7b152b6262012b9a0f19a2c3bce7d4c26bce97c653bc2ead78c1eb2" exitCode=0 Oct 02 13:10:08 crc kubenswrapper[4766]: I1002 13:10:08.853342 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" event={"ID":"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22","Type":"ContainerDied","Data":"89cc69b4c7b152b6262012b9a0f19a2c3bce7d4c26bce97c653bc2ead78c1eb2"} Oct 02 13:10:09 crc kubenswrapper[4766]: I1002 13:10:09.896651 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edcd959c-f974-4f9d-acb4-4f649df7f068" path="/var/lib/kubelet/pods/edcd959c-f974-4f9d-acb4-4f649df7f068/volumes" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.289125 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.383238 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-libvirt-combined-ca-bundle\") pod \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.383292 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdb27\" (UniqueName: \"kubernetes.io/projected/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-kube-api-access-kdb27\") pod \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.383323 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-inventory\") pod \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.383396 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-libvirt-secret-0\") pod \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.383587 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-ssh-key\") pod \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.383668 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-ceph\") pod \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\" (UID: \"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22\") " Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.388451 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-ceph" (OuterVolumeSpecName: "ceph") pod "5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22" (UID: "5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.389080 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22" (UID: "5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.390304 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-kube-api-access-kdb27" (OuterVolumeSpecName: "kube-api-access-kdb27") pod "5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22" (UID: "5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22"). InnerVolumeSpecName "kube-api-access-kdb27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.418598 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22" (UID: "5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.422118 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-inventory" (OuterVolumeSpecName: "inventory") pod "5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22" (UID: "5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.442378 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22" (UID: "5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.486767 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.486802 4766 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.486814 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdb27\" (UniqueName: \"kubernetes.io/projected/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-kube-api-access-kdb27\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.486822 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.486835 4766 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.486843 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.877626 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" event={"ID":"5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22","Type":"ContainerDied","Data":"a7fa9af246ef4f508ddcac6db02cb6dc53e1c55dd67a2d990eb70e748e934f6a"} Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.877672 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7fa9af246ef4f508ddcac6db02cb6dc53e1c55dd67a2d990eb70e748e934f6a" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.877717 4766 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-qhxsq" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.977874 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-7k7hx"] Oct 02 13:10:10 crc kubenswrapper[4766]: E1002 13:10:10.978304 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edcd959c-f974-4f9d-acb4-4f649df7f068" containerName="extract-utilities" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.978316 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="edcd959c-f974-4f9d-acb4-4f649df7f068" containerName="extract-utilities" Oct 02 13:10:10 crc kubenswrapper[4766]: E1002 13:10:10.978346 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edcd959c-f974-4f9d-acb4-4f649df7f068" containerName="registry-server" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.978352 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="edcd959c-f974-4f9d-acb4-4f649df7f068" containerName="registry-server" Oct 02 13:10:10 crc kubenswrapper[4766]: E1002 13:10:10.978365 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22" containerName="libvirt-openstack-openstack-cell1" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.978371 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22" containerName="libvirt-openstack-openstack-cell1" Oct 02 13:10:10 crc kubenswrapper[4766]: E1002 13:10:10.978403 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edcd959c-f974-4f9d-acb4-4f649df7f068" containerName="extract-content" Oct 02 13:10:10 crc kubenswrapper[4766]: I1002 13:10:10.978409 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="edcd959c-f974-4f9d-acb4-4f649df7f068" containerName="extract-content" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:10.995292 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="edcd959c-f974-4f9d-acb4-4f649df7f068" containerName="registry-server" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:10.995384 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22" containerName="libvirt-openstack-openstack-cell1" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.023246 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.026112 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.029181 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.029592 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.029808 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.029944 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.030037 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.030171 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.045041 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-7k7hx"] Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.110158 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.110219 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.110250 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqjlw\" (UniqueName: \"kubernetes.io/projected/3584b308-cda0-4e37-a0ef-63fef09a9be8-kube-api-access-qqjlw\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.110290 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.110313 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.110332 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.110352 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-inventory\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.110416 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.110468 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-ceph\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.110539 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.110569 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.211949 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.212026 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-ceph\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.212635 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.212675 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.212704 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.212733 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.212760 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqjlw\" (UniqueName: \"kubernetes.io/projected/3584b308-cda0-4e37-a0ef-63fef09a9be8-kube-api-access-qqjlw\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.212796 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.212815 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.212833 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-compute-config-0\") pod 
\"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.212850 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-inventory\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.212870 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.213553 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.217276 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.217646 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.218293 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.218321 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.218657 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 
13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.223011 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-ceph\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.223431 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-inventory\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.223996 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.229432 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqjlw\" (UniqueName: \"kubernetes.io/projected/3584b308-cda0-4e37-a0ef-63fef09a9be8-kube-api-access-qqjlw\") pod \"nova-cell1-openstack-openstack-cell1-7k7hx\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.346566 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:10:11 crc kubenswrapper[4766]: I1002 13:10:11.892977 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-7k7hx"] Oct 02 13:10:12 crc kubenswrapper[4766]: I1002 13:10:12.922087 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" event={"ID":"3584b308-cda0-4e37-a0ef-63fef09a9be8","Type":"ContainerStarted","Data":"79f99c99d8d70149a837d63d03fcb9d078992677b009752e44c1333678aa58f4"} Oct 02 13:10:12 crc kubenswrapper[4766]: I1002 13:10:12.922475 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" event={"ID":"3584b308-cda0-4e37-a0ef-63fef09a9be8","Type":"ContainerStarted","Data":"ab4eada98cff30fc1488d67a8ce35dad04063354f866caeef366bc40de6f92b7"} Oct 02 13:10:12 crc kubenswrapper[4766]: I1002 13:10:12.948634 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" podStartSLOduration=2.659326178 podStartE2EDuration="2.948601947s" podCreationTimestamp="2025-10-02 13:10:10 +0000 UTC" firstStartedPulling="2025-10-02 13:10:11.904921767 +0000 UTC m=+8326.847792711" lastFinishedPulling="2025-10-02 13:10:12.194197536 +0000 UTC m=+8327.137068480" observedRunningTime="2025-10-02 13:10:12.946328003 +0000 UTC m=+8327.889198947" watchObservedRunningTime="2025-10-02 13:10:12.948601947 +0000 UTC m=+8327.891472931" Oct 02 13:10:54 crc kubenswrapper[4766]: I1002 13:10:54.432444 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:10:54 crc kubenswrapper[4766]: I1002 13:10:54.433279 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:11:24 crc kubenswrapper[4766]: I1002 13:11:24.431885 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:11:24 crc kubenswrapper[4766]: I1002 13:11:24.432614 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:11:54 crc kubenswrapper[4766]: I1002 13:11:54.433127 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:11:54 crc kubenswrapper[4766]: I1002 13:11:54.434323 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:11:54 crc kubenswrapper[4766]: I1002 13:11:54.434404 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 13:11:54 crc kubenswrapper[4766]: I1002 13:11:54.437064 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:11:54 crc kubenswrapper[4766]: I1002 13:11:54.437263 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" gracePeriod=600 Oct 02 13:11:54 crc kubenswrapper[4766]: E1002 13:11:54.649984 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:11:55 crc 
kubenswrapper[4766]: I1002 13:11:55.129309 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" exitCode=0 Oct 02 13:11:55 crc kubenswrapper[4766]: I1002 13:11:55.129387 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8"} Oct 02 13:11:55 crc kubenswrapper[4766]: I1002 13:11:55.129438 4766 scope.go:117] "RemoveContainer" containerID="74c2163ca1fa7ad09b7ba7a3b4eb6190f280a4c4e20f090ee0f37ef7b01fce4d" Oct 02 13:11:55 crc kubenswrapper[4766]: I1002 13:11:55.130953 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:11:55 crc kubenswrapper[4766]: E1002 13:11:55.131412 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:12:05 crc kubenswrapper[4766]: I1002 13:12:05.898086 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:12:05 crc kubenswrapper[4766]: E1002 13:12:05.899627 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:12:19 crc kubenswrapper[4766]: I1002 13:12:19.882317 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:12:19 crc kubenswrapper[4766]: E1002 13:12:19.883528 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:12:33 crc kubenswrapper[4766]: I1002 13:12:33.882863 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:12:33 crc kubenswrapper[4766]: E1002 13:12:33.884219 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:12:44 crc kubenswrapper[4766]: I1002 13:12:44.881812 4766 scope.go:117] "RemoveContainer" 
containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:12:44 crc kubenswrapper[4766]: E1002 13:12:44.882938 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:12:59 crc kubenswrapper[4766]: I1002 13:12:59.882167 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:12:59 crc kubenswrapper[4766]: E1002 13:12:59.882771 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:13:11 crc kubenswrapper[4766]: I1002 13:13:11.058345 4766 generic.go:334] "Generic (PLEG): container finished" podID="3584b308-cda0-4e37-a0ef-63fef09a9be8" containerID="79f99c99d8d70149a837d63d03fcb9d078992677b009752e44c1333678aa58f4" exitCode=2 Oct 02 13:13:11 crc kubenswrapper[4766]: I1002 13:13:11.058569 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" event={"ID":"3584b308-cda0-4e37-a0ef-63fef09a9be8","Type":"ContainerDied","Data":"79f99c99d8d70149a837d63d03fcb9d078992677b009752e44c1333678aa58f4"} Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.636054 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.729678 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-ssh-key\") pod \"3584b308-cda0-4e37-a0ef-63fef09a9be8\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.729731 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-inventory\") pod \"3584b308-cda0-4e37-a0ef-63fef09a9be8\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.729848 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cells-global-config-0\") pod \"3584b308-cda0-4e37-a0ef-63fef09a9be8\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.729916 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-migration-ssh-key-0\") pod \"3584b308-cda0-4e37-a0ef-63fef09a9be8\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.729938 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-compute-config-1\") pod \"3584b308-cda0-4e37-a0ef-63fef09a9be8\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.729993 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-migration-ssh-key-1\") pod \"3584b308-cda0-4e37-a0ef-63fef09a9be8\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.730013 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-ceph\") pod \"3584b308-cda0-4e37-a0ef-63fef09a9be8\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.730070 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cells-global-config-1\") pod \"3584b308-cda0-4e37-a0ef-63fef09a9be8\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.730112 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-combined-ca-bundle\") pod \"3584b308-cda0-4e37-a0ef-63fef09a9be8\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.730176 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqjlw\" (UniqueName: 
\"kubernetes.io/projected/3584b308-cda0-4e37-a0ef-63fef09a9be8-kube-api-access-qqjlw\") pod \"3584b308-cda0-4e37-a0ef-63fef09a9be8\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.730195 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-compute-config-0\") pod \"3584b308-cda0-4e37-a0ef-63fef09a9be8\" (UID: \"3584b308-cda0-4e37-a0ef-63fef09a9be8\") " Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.735757 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3584b308-cda0-4e37-a0ef-63fef09a9be8-kube-api-access-qqjlw" (OuterVolumeSpecName: "kube-api-access-qqjlw") pod "3584b308-cda0-4e37-a0ef-63fef09a9be8" (UID: "3584b308-cda0-4e37-a0ef-63fef09a9be8"). InnerVolumeSpecName "kube-api-access-qqjlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.735937 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-ceph" (OuterVolumeSpecName: "ceph") pod "3584b308-cda0-4e37-a0ef-63fef09a9be8" (UID: "3584b308-cda0-4e37-a0ef-63fef09a9be8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.736366 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "3584b308-cda0-4e37-a0ef-63fef09a9be8" (UID: "3584b308-cda0-4e37-a0ef-63fef09a9be8"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.765731 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "3584b308-cda0-4e37-a0ef-63fef09a9be8" (UID: "3584b308-cda0-4e37-a0ef-63fef09a9be8"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.765728 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "3584b308-cda0-4e37-a0ef-63fef09a9be8" (UID: "3584b308-cda0-4e37-a0ef-63fef09a9be8"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.774767 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "3584b308-cda0-4e37-a0ef-63fef09a9be8" (UID: "3584b308-cda0-4e37-a0ef-63fef09a9be8"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.774796 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-inventory" (OuterVolumeSpecName: "inventory") pod "3584b308-cda0-4e37-a0ef-63fef09a9be8" (UID: "3584b308-cda0-4e37-a0ef-63fef09a9be8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.774849 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "3584b308-cda0-4e37-a0ef-63fef09a9be8" (UID: "3584b308-cda0-4e37-a0ef-63fef09a9be8"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.774889 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3584b308-cda0-4e37-a0ef-63fef09a9be8" (UID: "3584b308-cda0-4e37-a0ef-63fef09a9be8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.777758 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "3584b308-cda0-4e37-a0ef-63fef09a9be8" (UID: "3584b308-cda0-4e37-a0ef-63fef09a9be8"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.780701 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "3584b308-cda0-4e37-a0ef-63fef09a9be8" (UID: "3584b308-cda0-4e37-a0ef-63fef09a9be8"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.832339 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.832659 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.832741 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqjlw\" (UniqueName: \"kubernetes.io/projected/3584b308-cda0-4e37-a0ef-63fef09a9be8-kube-api-access-qqjlw\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.832812 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.832886 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.833013 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.833089 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.833158 4766 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.833226 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.833299 4766 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:12 crc kubenswrapper[4766]: I1002 13:13:12.833367 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3584b308-cda0-4e37-a0ef-63fef09a9be8-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:13:13 crc kubenswrapper[4766]: I1002 13:13:13.084232 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" event={"ID":"3584b308-cda0-4e37-a0ef-63fef09a9be8","Type":"ContainerDied","Data":"ab4eada98cff30fc1488d67a8ce35dad04063354f866caeef366bc40de6f92b7"} Oct 02 13:13:13 crc kubenswrapper[4766]: I1002 13:13:13.084269 4766 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="ab4eada98cff30fc1488d67a8ce35dad04063354f866caeef366bc40de6f92b7" Oct 02 13:13:13 crc kubenswrapper[4766]: I1002 13:13:13.084328 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-7k7hx" Oct 02 13:13:13 crc kubenswrapper[4766]: I1002 13:13:13.881981 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:13:13 crc kubenswrapper[4766]: E1002 13:13:13.883372 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.040804 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-nbn66"] Oct 02 13:13:20 crc kubenswrapper[4766]: E1002 13:13:20.041722 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3584b308-cda0-4e37-a0ef-63fef09a9be8" containerName="nova-cell1-openstack-openstack-cell1" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.041734 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="3584b308-cda0-4e37-a0ef-63fef09a9be8" containerName="nova-cell1-openstack-openstack-cell1" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.041957 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="3584b308-cda0-4e37-a0ef-63fef09a9be8" containerName="nova-cell1-openstack-openstack-cell1" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.042739 4766 util.go:30] "No sandbox for pod can be found. 
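[annotation] The entries above show the full teardown of the finished dataplane job pod nova-cell1-openstack-openstack-cell1-7k7hx: each of its secret/configmap volumes goes through UnmountVolume.TearDown and is then reported "Volume detached", the sandbox ab4eada9... is reported ContainerDied, and the API immediately delivers a replacement pod (nova-cell1-openstack-openstack-cell1-nbn66) built from the same sources. A minimal sketch, assuming a reachable kubeconfig and the Python kubernetes client (namespace and name prefix taken from the log), of watching this create/fail/replace cycle from outside the node:

    # Sketch: stream pod lifecycle events for the retried dataplane job pods,
    # mirroring the SyncLoop ADD/UPDATE and PLEG entries in this log.
    from kubernetes import client, config, watch

    config.load_kube_config()  # assumes kubeconfig access to this cluster
    v1 = client.CoreV1Api()
    for event in watch.Watch().stream(v1.list_namespaced_pod,
                                      namespace="openstack",
                                      timeout_seconds=120):
        pod = event["object"]
        if pod.metadata.name.startswith("nova-cell1-openstack-openstack-cell1-"):
            print(event["type"], pod.metadata.name, pod.status.phase)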
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.045792 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.046453 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.046718 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.046843 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.047008 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.047316 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.050592 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.074598 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-nbn66"] Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.124778 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-ceph\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.124841 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.124875 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.125065 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w5z5\" (UniqueName: \"kubernetes.io/projected/48c97d74-b920-4e52-b90d-44faa051eba6-kube-api-access-5w5z5\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.125300 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-compute-config-0\") pod 
\"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.125326 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.125386 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.125436 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.125461 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.125606 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.125649 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-inventory\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.227957 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.228322 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-inventory\") pod 
\"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.228396 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-ceph\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.228423 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.228464 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.228554 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w5z5\" (UniqueName: \"kubernetes.io/projected/48c97d74-b920-4e52-b90d-44faa051eba6-kube-api-access-5w5z5\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.228676 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.228696 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.228753 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.228790 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.228817 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.228868 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.238218 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.239306 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.240123 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.240617 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-ceph\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.240974 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-inventory\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.241346 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.252337 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.253152 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.253590 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.258115 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w5z5\" (UniqueName: \"kubernetes.io/projected/48c97d74-b920-4e52-b90d-44faa051eba6-kube-api-access-5w5z5\") pod \"nova-cell1-openstack-openstack-cell1-nbn66\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.380002 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:13:20 crc kubenswrapper[4766]: I1002 13:13:20.964423 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-nbn66"] Oct 02 13:13:20 crc kubenswrapper[4766]: W1002 13:13:20.971998 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48c97d74_b920_4e52_b90d_44faa051eba6.slice/crio-23cae37aa7f053a3ffcfb3fba65c77472ed41725b3994b3de02f83307d437652 WatchSource:0}: Error finding container 23cae37aa7f053a3ffcfb3fba65c77472ed41725b3994b3de02f83307d437652: Status 404 returned error can't find the container with id 23cae37aa7f053a3ffcfb3fba65c77472ed41725b3994b3de02f83307d437652 Oct 02 13:13:21 crc kubenswrapper[4766]: I1002 13:13:21.213600 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" event={"ID":"48c97d74-b920-4e52-b90d-44faa051eba6","Type":"ContainerStarted","Data":"23cae37aa7f053a3ffcfb3fba65c77472ed41725b3994b3de02f83307d437652"} Oct 02 13:13:22 crc kubenswrapper[4766]: I1002 13:13:22.228633 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" event={"ID":"48c97d74-b920-4e52-b90d-44faa051eba6","Type":"ContainerStarted","Data":"ac82397871a078adb5c5c77ef28952d12456fedb6ac993a48550df35357be462"} Oct 02 13:13:22 crc kubenswrapper[4766]: I1002 13:13:22.271152 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" podStartSLOduration=2.124637687 podStartE2EDuration="2.271119301s" podCreationTimestamp="2025-10-02 13:13:20 +0000 UTC" firstStartedPulling="2025-10-02 13:13:20.974485117 +0000 UTC m=+8515.917356061" lastFinishedPulling="2025-10-02 
Oct 02 13:13:25 crc kubenswrapper[4766]: I1002 13:13:25.897761 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8"
Oct 02 13:13:25 crc kubenswrapper[4766]: E1002 13:13:25.898786 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 13:13:38 crc kubenswrapper[4766]: I1002 13:13:38.882825 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8"
Oct 02 13:13:38 crc kubenswrapper[4766]: E1002 13:13:38.883757 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 13:13:50 crc kubenswrapper[4766]: I1002 13:13:50.883063 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8"
Oct 02 13:13:50 crc kubenswrapper[4766]: E1002 13:13:50.884432 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 13:14:02 crc kubenswrapper[4766]: I1002 13:14:02.882122 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8"
Oct 02 13:14:02 crc kubenswrapper[4766]: E1002 13:14:02.882772 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 13:14:15 crc kubenswrapper[4766]: I1002 13:14:15.850127 4766 generic.go:334] "Generic (PLEG): container finished" podID="48c97d74-b920-4e52-b90d-44faa051eba6" containerID="ac82397871a078adb5c5c77ef28952d12456fedb6ac993a48550df35357be462" exitCode=2
Oct 02 13:14:15 crc kubenswrapper[4766]: I1002 13:14:15.850169 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" event={"ID":"48c97d74-b920-4e52-b90d-44faa051eba6","Type":"ContainerDied","Data":"ac82397871a078adb5c5c77ef28952d12456fedb6ac993a48550df35357be462"}
Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.469154 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66"
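[annotation] The exitCode=2 above is the second consecutive failure of the nova-cell1-openstack-openstack-cell1 job container (pod -7k7hx earlier, now -nbn66), after which the kubelet tears this pod down as well. Interleaved throughout, machine-config-daemon-l99lx stays pinned in a 5-minute CrashLoopBackOff; the paired "RemoveContainer"/"Error syncing pod" entries recur on every periodic sync until the back-off expires. A minimal sketch, assuming kubeconfig access and the Python kubernetes client, for inspecting that back-off state:

    # Sketch: read the waiting reason and restart count behind the
    # recurring CrashLoopBackOff entries in this log.
    from kubernetes import client, config

    config.load_kube_config()
    v1 = client.CoreV1Api()
    pod = v1.read_namespaced_pod("machine-config-daemon-l99lx",
                                 "openshift-machine-config-operator")
    for cs in pod.status.container_statuses or []:
        waiting = cs.state.waiting
        print(cs.name, "restarts:", cs.restart_count,
              "waiting:", waiting.reason if waiting else None)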
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.501663 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-inventory\") pod \"48c97d74-b920-4e52-b90d-44faa051eba6\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.501769 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-compute-config-0\") pod \"48c97d74-b920-4e52-b90d-44faa051eba6\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.501831 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cells-global-config-0\") pod \"48c97d74-b920-4e52-b90d-44faa051eba6\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.501902 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-combined-ca-bundle\") pod \"48c97d74-b920-4e52-b90d-44faa051eba6\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.501993 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cells-global-config-1\") pod \"48c97d74-b920-4e52-b90d-44faa051eba6\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.502055 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-compute-config-1\") pod \"48c97d74-b920-4e52-b90d-44faa051eba6\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.502087 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-ceph\") pod \"48c97d74-b920-4e52-b90d-44faa051eba6\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.502155 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-migration-ssh-key-1\") pod \"48c97d74-b920-4e52-b90d-44faa051eba6\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.502186 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-ssh-key\") pod \"48c97d74-b920-4e52-b90d-44faa051eba6\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.502228 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-5w5z5\" (UniqueName: \"kubernetes.io/projected/48c97d74-b920-4e52-b90d-44faa051eba6-kube-api-access-5w5z5\") pod \"48c97d74-b920-4e52-b90d-44faa051eba6\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.502260 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-migration-ssh-key-0\") pod \"48c97d74-b920-4e52-b90d-44faa051eba6\" (UID: \"48c97d74-b920-4e52-b90d-44faa051eba6\") " Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.525100 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-ceph" (OuterVolumeSpecName: "ceph") pod "48c97d74-b920-4e52-b90d-44faa051eba6" (UID: "48c97d74-b920-4e52-b90d-44faa051eba6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.525229 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "48c97d74-b920-4e52-b90d-44faa051eba6" (UID: "48c97d74-b920-4e52-b90d-44faa051eba6"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.530755 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c97d74-b920-4e52-b90d-44faa051eba6-kube-api-access-5w5z5" (OuterVolumeSpecName: "kube-api-access-5w5z5") pod "48c97d74-b920-4e52-b90d-44faa051eba6" (UID: "48c97d74-b920-4e52-b90d-44faa051eba6"). InnerVolumeSpecName "kube-api-access-5w5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.542370 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "48c97d74-b920-4e52-b90d-44faa051eba6" (UID: "48c97d74-b920-4e52-b90d-44faa051eba6"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.562536 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "48c97d74-b920-4e52-b90d-44faa051eba6" (UID: "48c97d74-b920-4e52-b90d-44faa051eba6"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.565352 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "48c97d74-b920-4e52-b90d-44faa051eba6" (UID: "48c97d74-b920-4e52-b90d-44faa051eba6"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.569895 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "48c97d74-b920-4e52-b90d-44faa051eba6" (UID: "48c97d74-b920-4e52-b90d-44faa051eba6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.575370 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "48c97d74-b920-4e52-b90d-44faa051eba6" (UID: "48c97d74-b920-4e52-b90d-44faa051eba6"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.589655 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "48c97d74-b920-4e52-b90d-44faa051eba6" (UID: "48c97d74-b920-4e52-b90d-44faa051eba6"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.589976 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "48c97d74-b920-4e52-b90d-44faa051eba6" (UID: "48c97d74-b920-4e52-b90d-44faa051eba6"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.593492 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-inventory" (OuterVolumeSpecName: "inventory") pod "48c97d74-b920-4e52-b90d-44faa051eba6" (UID: "48c97d74-b920-4e52-b90d-44faa051eba6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.605638 4766 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.605710 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.605725 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w5z5\" (UniqueName: \"kubernetes.io/projected/48c97d74-b920-4e52-b90d-44faa051eba6-kube-api-access-5w5z5\") on node \"crc\" DevicePath \"\"" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.605737 4766 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.605753 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.605795 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.605807 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.605819 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.605831 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.605869 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.605884 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/48c97d74-b920-4e52-b90d-44faa051eba6-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.883215 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:14:17 crc kubenswrapper[4766]: E1002 13:14:17.883900 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.893257 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66"
Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.908453 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-nbn66" event={"ID":"48c97d74-b920-4e52-b90d-44faa051eba6","Type":"ContainerDied","Data":"23cae37aa7f053a3ffcfb3fba65c77472ed41725b3994b3de02f83307d437652"}
Oct 02 13:14:17 crc kubenswrapper[4766]: I1002 13:14:17.908495 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23cae37aa7f053a3ffcfb3fba65c77472ed41725b3994b3de02f83307d437652"
Oct 02 13:14:32 crc kubenswrapper[4766]: I1002 13:14:32.882335 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8"
Oct 02 13:14:32 crc kubenswrapper[4766]: E1002 13:14:32.883665 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.055672 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-p4nqg"]
Oct 02 13:14:35 crc kubenswrapper[4766]: E1002 13:14:35.056716 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c97d74-b920-4e52-b90d-44faa051eba6" containerName="nova-cell1-openstack-openstack-cell1"
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.056736 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c97d74-b920-4e52-b90d-44faa051eba6" containerName="nova-cell1-openstack-openstack-cell1"
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.057039 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c97d74-b920-4e52-b90d-44faa051eba6" containerName="nova-cell1-openstack-openstack-cell1"
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.058218 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg"
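[annotation] nova-cell1-openstack-openstack-cell1-p4nqg is the third pod created for this job (-7k7hx, then -nbn66, now -p4nqg). Before admitting it, the kubelet clears the CPU- and memory-manager state left behind by the previous pod's container, which is what the RemoveStaleState and "Deleted CPUSet assignment" entries above record. A minimal sketch, assuming kubeconfig access, that lists the successive retries by name prefix:

    # Sketch: list the retry pods for the dataplane job in creation order.
    from kubernetes import client, config

    config.load_kube_config()
    v1 = client.CoreV1Api()
    for pod in v1.list_namespaced_pod("openstack").items:
        if pod.metadata.name.startswith("nova-cell1-openstack-openstack-cell1-"):
            print(pod.metadata.name, pod.status.phase, pod.metadata.creation_timestamp)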
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.064100 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config"
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.064225 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret"
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.064400 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.064472 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.064482 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.064800 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb"
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.065147 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.083953 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-p4nqg"]
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.099827 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg"
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.099918 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg"
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.099974 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkv85\" (UniqueName: \"kubernetes.io/projected/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-kube-api-access-lkv85\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg"
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.100013 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg"
Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.100040 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-inventory\") pod
\"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.100075 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-ceph\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.100139 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.100170 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.100194 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.100222 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.100247 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.207410 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-ceph\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.207512 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-compute-config-0\") pod 
\"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.207541 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.207559 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.207584 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.207605 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.207670 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.207712 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.207747 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkv85\" (UniqueName: \"kubernetes.io/projected/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-kube-api-access-lkv85\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.207772 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: 
\"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.207790 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-inventory\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.208694 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.209453 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.214130 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.214925 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-ceph\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.215568 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.215831 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.215843 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.215878 4766 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-inventory\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.219571 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.221132 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.228966 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkv85\" (UniqueName: \"kubernetes.io/projected/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-kube-api-access-lkv85\") pod \"nova-cell1-openstack-openstack-cell1-p4nqg\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:35 crc kubenswrapper[4766]: I1002 13:14:35.407753 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:14:36 crc kubenswrapper[4766]: I1002 13:14:36.065290 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-p4nqg"] Oct 02 13:14:36 crc kubenswrapper[4766]: I1002 13:14:36.098184 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" event={"ID":"87b4132e-2db0-40be-9e2d-7c7c8261f7bc","Type":"ContainerStarted","Data":"d0612013115e5f8dd35ead9e002aaafecdc27e5acede25b0ccb6ec0d68e3a131"} Oct 02 13:14:37 crc kubenswrapper[4766]: I1002 13:14:37.117175 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" event={"ID":"87b4132e-2db0-40be-9e2d-7c7c8261f7bc","Type":"ContainerStarted","Data":"859f3a2b9ae6532a1b81654712e29b9cee92a01a213e7f01b6baaf895dc0690d"} Oct 02 13:14:37 crc kubenswrapper[4766]: I1002 13:14:37.156238 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" podStartSLOduration=1.9903821069999998 podStartE2EDuration="2.156214641s" podCreationTimestamp="2025-10-02 13:14:35 +0000 UTC" firstStartedPulling="2025-10-02 13:14:36.069347558 +0000 UTC m=+8591.012218492" lastFinishedPulling="2025-10-02 13:14:36.235180082 +0000 UTC m=+8591.178051026" observedRunningTime="2025-10-02 13:14:37.139893169 +0000 UTC m=+8592.082764133" watchObservedRunningTime="2025-10-02 13:14:37.156214641 +0000 UTC m=+8592.099085605" Oct 02 13:14:45 crc kubenswrapper[4766]: I1002 13:14:45.891457 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:14:45 crc kubenswrapper[4766]: E1002 13:14:45.892378 4766 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:14:56 crc kubenswrapper[4766]: I1002 13:14:56.881901 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:14:56 crc kubenswrapper[4766]: E1002 13:14:56.882979 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:15:00 crc kubenswrapper[4766]: I1002 13:15:00.158809 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st"] Oct 02 13:15:00 crc kubenswrapper[4766]: I1002 13:15:00.161605 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" Oct 02 13:15:00 crc kubenswrapper[4766]: I1002 13:15:00.163930 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 02 13:15:00 crc kubenswrapper[4766]: I1002 13:15:00.164448 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 02 13:15:00 crc kubenswrapper[4766]: I1002 13:15:00.175700 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st"] Oct 02 13:15:00 crc kubenswrapper[4766]: I1002 13:15:00.306891 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vknb4\" (UniqueName: \"kubernetes.io/projected/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-kube-api-access-vknb4\") pod \"collect-profiles-29323515-br6st\" (UID: \"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" Oct 02 13:15:00 crc kubenswrapper[4766]: I1002 13:15:00.307096 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-secret-volume\") pod \"collect-profiles-29323515-br6st\" (UID: \"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" Oct 02 13:15:00 crc kubenswrapper[4766]: I1002 13:15:00.307141 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-config-volume\") pod \"collect-profiles-29323515-br6st\" (UID: \"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" Oct 02 13:15:00 crc kubenswrapper[4766]: I1002 13:15:00.410039 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" 
(UniqueName: \"kubernetes.io/secret/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-secret-volume\") pod \"collect-profiles-29323515-br6st\" (UID: \"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" Oct 02 13:15:00 crc kubenswrapper[4766]: I1002 13:15:00.410154 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-config-volume\") pod \"collect-profiles-29323515-br6st\" (UID: \"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" Oct 02 13:15:00 crc kubenswrapper[4766]: I1002 13:15:00.410329 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vknb4\" (UniqueName: \"kubernetes.io/projected/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-kube-api-access-vknb4\") pod \"collect-profiles-29323515-br6st\" (UID: \"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" Oct 02 13:15:00 crc kubenswrapper[4766]: I1002 13:15:00.410972 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-config-volume\") pod \"collect-profiles-29323515-br6st\" (UID: \"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" Oct 02 13:15:00 crc kubenswrapper[4766]: I1002 13:15:00.415731 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-secret-volume\") pod \"collect-profiles-29323515-br6st\" (UID: \"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" Oct 02 13:15:00 crc kubenswrapper[4766]: I1002 13:15:00.426226 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vknb4\" (UniqueName: \"kubernetes.io/projected/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-kube-api-access-vknb4\") pod \"collect-profiles-29323515-br6st\" (UID: \"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" Oct 02 13:15:00 crc kubenswrapper[4766]: I1002 13:15:00.497476 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" Oct 02 13:15:01 crc kubenswrapper[4766]: I1002 13:15:01.007160 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st"] Oct 02 13:15:01 crc kubenswrapper[4766]: I1002 13:15:01.405838 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" event={"ID":"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c","Type":"ContainerStarted","Data":"dce9ff83273c13143a976fc62b4006035484ef994687ceda1af10ab17e5ae2ae"} Oct 02 13:15:01 crc kubenswrapper[4766]: I1002 13:15:01.406191 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" event={"ID":"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c","Type":"ContainerStarted","Data":"6786255f4a53903c717043fe426747a731afc2e433e3e1c900aabe009e09e3ca"} Oct 02 13:15:01 crc kubenswrapper[4766]: I1002 13:15:01.443212 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" podStartSLOduration=1.443188095 podStartE2EDuration="1.443188095s" podCreationTimestamp="2025-10-02 13:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:15:01.43118642 +0000 UTC m=+8616.374057384" watchObservedRunningTime="2025-10-02 13:15:01.443188095 +0000 UTC m=+8616.386059049" Oct 02 13:15:02 crc kubenswrapper[4766]: I1002 13:15:02.426144 4766 generic.go:334] "Generic (PLEG): container finished" podID="52ab692d-bd6a-4304-82d7-3c7f77b0ff5c" containerID="dce9ff83273c13143a976fc62b4006035484ef994687ceda1af10ab17e5ae2ae" exitCode=0 Oct 02 13:15:02 crc kubenswrapper[4766]: I1002 13:15:02.426223 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" event={"ID":"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c","Type":"ContainerDied","Data":"dce9ff83273c13143a976fc62b4006035484ef994687ceda1af10ab17e5ae2ae"} Oct 02 13:15:03 crc kubenswrapper[4766]: I1002 13:15:03.837619 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" Oct 02 13:15:03 crc kubenswrapper[4766]: I1002 13:15:03.995062 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-config-volume\") pod \"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c\" (UID: \"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c\") " Oct 02 13:15:03 crc kubenswrapper[4766]: I1002 13:15:03.995376 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vknb4\" (UniqueName: \"kubernetes.io/projected/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-kube-api-access-vknb4\") pod \"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c\" (UID: \"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c\") " Oct 02 13:15:03 crc kubenswrapper[4766]: I1002 13:15:03.995635 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-secret-volume\") pod \"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c\" (UID: \"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c\") " Oct 02 13:15:03 crc kubenswrapper[4766]: I1002 13:15:03.996078 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-config-volume" (OuterVolumeSpecName: "config-volume") pod "52ab692d-bd6a-4304-82d7-3c7f77b0ff5c" (UID: "52ab692d-bd6a-4304-82d7-3c7f77b0ff5c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:15:03 crc kubenswrapper[4766]: I1002 13:15:03.996587 4766 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-config-volume\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:04 crc kubenswrapper[4766]: I1002 13:15:04.003906 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "52ab692d-bd6a-4304-82d7-3c7f77b0ff5c" (UID: "52ab692d-bd6a-4304-82d7-3c7f77b0ff5c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:15:04 crc kubenswrapper[4766]: I1002 13:15:04.004245 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-kube-api-access-vknb4" (OuterVolumeSpecName: "kube-api-access-vknb4") pod "52ab692d-bd6a-4304-82d7-3c7f77b0ff5c" (UID: "52ab692d-bd6a-4304-82d7-3c7f77b0ff5c"). InnerVolumeSpecName "kube-api-access-vknb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:15:04 crc kubenswrapper[4766]: I1002 13:15:04.099551 4766 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:04 crc kubenswrapper[4766]: I1002 13:15:04.099618 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vknb4\" (UniqueName: \"kubernetes.io/projected/52ab692d-bd6a-4304-82d7-3c7f77b0ff5c-kube-api-access-vknb4\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:04 crc kubenswrapper[4766]: I1002 13:15:04.448816 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" event={"ID":"52ab692d-bd6a-4304-82d7-3c7f77b0ff5c","Type":"ContainerDied","Data":"6786255f4a53903c717043fe426747a731afc2e433e3e1c900aabe009e09e3ca"} Oct 02 13:15:04 crc kubenswrapper[4766]: I1002 13:15:04.448861 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6786255f4a53903c717043fe426747a731afc2e433e3e1c900aabe009e09e3ca" Oct 02 13:15:04 crc kubenswrapper[4766]: I1002 13:15:04.448908 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29323515-br6st" Oct 02 13:15:04 crc kubenswrapper[4766]: I1002 13:15:04.516171 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz"] Oct 02 13:15:04 crc kubenswrapper[4766]: I1002 13:15:04.524861 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29323470-7vnzz"] Oct 02 13:15:04 crc kubenswrapper[4766]: E1002 13:15:04.638511 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52ab692d_bd6a_4304_82d7_3c7f77b0ff5c.slice\": RecentStats: unable to find data in memory cache]" Oct 02 13:15:05 crc kubenswrapper[4766]: I1002 13:15:05.902745 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1948a4f1-8655-488c-8519-4e5e05806567" path="/var/lib/kubelet/pods/1948a4f1-8655-488c-8519-4e5e05806567/volumes" Oct 02 13:15:09 crc kubenswrapper[4766]: I1002 13:15:09.882298 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:15:09 crc kubenswrapper[4766]: E1002 13:15:09.883272 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:15:23 crc kubenswrapper[4766]: I1002 13:15:23.882119 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:15:23 crc kubenswrapper[4766]: E1002 13:15:23.883135 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:15:29 crc kubenswrapper[4766]: I1002 13:15:29.766526 4766 generic.go:334] "Generic (PLEG): container finished" podID="87b4132e-2db0-40be-9e2d-7c7c8261f7bc" containerID="859f3a2b9ae6532a1b81654712e29b9cee92a01a213e7f01b6baaf895dc0690d" exitCode=2 Oct 02 13:15:29 crc kubenswrapper[4766]: I1002 13:15:29.767106 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" event={"ID":"87b4132e-2db0-40be-9e2d-7c7c8261f7bc","Type":"ContainerDied","Data":"859f3a2b9ae6532a1b81654712e29b9cee92a01a213e7f01b6baaf895dc0690d"} Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.308079 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.368246 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-ceph\") pod \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.368306 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-inventory\") pod \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.368348 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-migration-ssh-key-0\") pod \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.368549 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-migration-ssh-key-1\") pod \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.368610 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-compute-config-1\") pod \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.368660 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkv85\" (UniqueName: \"kubernetes.io/projected/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-kube-api-access-lkv85\") pod \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.368711 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cells-global-config-1\") pod \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " Oct 02 
13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.368757 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cells-global-config-0\") pod \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.368793 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-combined-ca-bundle\") pod \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.368828 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-ssh-key\") pod \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.368904 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-compute-config-0\") pod \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\" (UID: \"87b4132e-2db0-40be-9e2d-7c7c8261f7bc\") " Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.375398 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-kube-api-access-lkv85" (OuterVolumeSpecName: "kube-api-access-lkv85") pod "87b4132e-2db0-40be-9e2d-7c7c8261f7bc" (UID: "87b4132e-2db0-40be-9e2d-7c7c8261f7bc"). InnerVolumeSpecName "kube-api-access-lkv85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.375636 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-ceph" (OuterVolumeSpecName: "ceph") pod "87b4132e-2db0-40be-9e2d-7c7c8261f7bc" (UID: "87b4132e-2db0-40be-9e2d-7c7c8261f7bc"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.381113 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "87b4132e-2db0-40be-9e2d-7c7c8261f7bc" (UID: "87b4132e-2db0-40be-9e2d-7c7c8261f7bc"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.438390 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "87b4132e-2db0-40be-9e2d-7c7c8261f7bc" (UID: "87b4132e-2db0-40be-9e2d-7c7c8261f7bc"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.442752 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "87b4132e-2db0-40be-9e2d-7c7c8261f7bc" (UID: "87b4132e-2db0-40be-9e2d-7c7c8261f7bc"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.448544 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "87b4132e-2db0-40be-9e2d-7c7c8261f7bc" (UID: "87b4132e-2db0-40be-9e2d-7c7c8261f7bc"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.448568 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "87b4132e-2db0-40be-9e2d-7c7c8261f7bc" (UID: "87b4132e-2db0-40be-9e2d-7c7c8261f7bc"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.462248 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "87b4132e-2db0-40be-9e2d-7c7c8261f7bc" (UID: "87b4132e-2db0-40be-9e2d-7c7c8261f7bc"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.462304 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "87b4132e-2db0-40be-9e2d-7c7c8261f7bc" (UID: "87b4132e-2db0-40be-9e2d-7c7c8261f7bc"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.465188 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-inventory" (OuterVolumeSpecName: "inventory") pod "87b4132e-2db0-40be-9e2d-7c7c8261f7bc" (UID: "87b4132e-2db0-40be-9e2d-7c7c8261f7bc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.468993 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "87b4132e-2db0-40be-9e2d-7c7c8261f7bc" (UID: "87b4132e-2db0-40be-9e2d-7c7c8261f7bc"). InnerVolumeSpecName "nova-cells-global-config-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.471137 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.471166 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-ceph\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.471180 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-inventory\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.471208 4766 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.471218 4766 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.471226 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.471235 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkv85\" (UniqueName: \"kubernetes.io/projected/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-kube-api-access-lkv85\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.471244 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.471253 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.471278 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.471289 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87b4132e-2db0-40be-9e2d-7c7c8261f7bc-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.801375 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" event={"ID":"87b4132e-2db0-40be-9e2d-7c7c8261f7bc","Type":"ContainerDied","Data":"d0612013115e5f8dd35ead9e002aaafecdc27e5acede25b0ccb6ec0d68e3a131"} Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.801419 4766 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="d0612013115e5f8dd35ead9e002aaafecdc27e5acede25b0ccb6ec0d68e3a131" Oct 02 13:15:31 crc kubenswrapper[4766]: I1002 13:15:31.801540 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-p4nqg" Oct 02 13:15:36 crc kubenswrapper[4766]: I1002 13:15:36.881969 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:15:36 crc kubenswrapper[4766]: E1002 13:15:36.882984 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:15:42 crc kubenswrapper[4766]: I1002 13:15:42.566435 4766 scope.go:117] "RemoveContainer" containerID="206d025528e4fbb40e49dd73b157c1bf2fdc662d99e87adc30308eb2fc0b237b" Oct 02 13:15:47 crc kubenswrapper[4766]: I1002 13:15:47.882596 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:15:47 crc kubenswrapper[4766]: E1002 13:15:47.883427 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:15:59 crc kubenswrapper[4766]: I1002 13:15:59.881448 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:15:59 crc kubenswrapper[4766]: E1002 13:15:59.882437 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.041691 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-4npcm"] Oct 02 13:16:09 crc kubenswrapper[4766]: E1002 13:16:09.042711 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87b4132e-2db0-40be-9e2d-7c7c8261f7bc" containerName="nova-cell1-openstack-openstack-cell1" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.042723 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="87b4132e-2db0-40be-9e2d-7c7c8261f7bc" containerName="nova-cell1-openstack-openstack-cell1" Oct 02 13:16:09 crc kubenswrapper[4766]: E1002 13:16:09.042774 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52ab692d-bd6a-4304-82d7-3c7f77b0ff5c" containerName="collect-profiles" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.042781 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="52ab692d-bd6a-4304-82d7-3c7f77b0ff5c" containerName="collect-profiles" Oct 02 13:16:09 crc 
kubenswrapper[4766]: I1002 13:16:09.046060 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="87b4132e-2db0-40be-9e2d-7c7c8261f7bc" containerName="nova-cell1-openstack-openstack-cell1" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.046109 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="52ab692d-bd6a-4304-82d7-3c7f77b0ff5c" containerName="collect-profiles" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.046980 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.049027 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.051474 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-rlmpb" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.051594 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.051629 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.051544 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.052163 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.052380 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.059103 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-4npcm"] Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.209751 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.210306 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.210357 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.210391 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.210416 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.210570 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.210625 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.210687 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-ceph\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.210712 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.210769 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6mfm\" (UniqueName: \"kubernetes.io/projected/f9175f1f-c2e8-4454-86c2-5e4c795834b1-kube-api-access-r6mfm\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.210823 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.316372 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" 
(UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.318077 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.318127 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.318395 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.318490 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.318643 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-ceph\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.318695 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.318827 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6mfm\" (UniqueName: \"kubernetes.io/projected/f9175f1f-c2e8-4454-86c2-5e4c795834b1-kube-api-access-r6mfm\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.319055 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-inventory\") pod 
\"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.319201 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.319299 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.320605 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.320634 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.323415 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.323421 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.324066 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.324062 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 
02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.325863 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.325972 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.327782 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-ceph\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.331926 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.337045 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6mfm\" (UniqueName: \"kubernetes.io/projected/f9175f1f-c2e8-4454-86c2-5e4c795834b1-kube-api-access-r6mfm\") pod \"nova-cell1-openstack-openstack-cell1-4npcm\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:09 crc kubenswrapper[4766]: I1002 13:16:09.398875 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" Oct 02 13:16:10 crc kubenswrapper[4766]: I1002 13:16:10.027269 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-4npcm"] Oct 02 13:16:10 crc kubenswrapper[4766]: I1002 13:16:10.036785 4766 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 02 13:16:10 crc kubenswrapper[4766]: I1002 13:16:10.301173 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" event={"ID":"f9175f1f-c2e8-4454-86c2-5e4c795834b1","Type":"ContainerStarted","Data":"3b149d1e35221647f7021e7212b298e094849f1b598c1875a74af413770c037f"} Oct 02 13:16:11 crc kubenswrapper[4766]: I1002 13:16:11.316802 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" event={"ID":"f9175f1f-c2e8-4454-86c2-5e4c795834b1","Type":"ContainerStarted","Data":"ae33d1dcab929ad54087712a2e1b6c69ffe4f3f9a926ce18a63d3c33aa21271a"} Oct 02 13:16:12 crc kubenswrapper[4766]: I1002 13:16:12.881774 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:16:12 crc kubenswrapper[4766]: E1002 13:16:12.882717 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:16:23 crc kubenswrapper[4766]: I1002 13:16:23.882474 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:16:23 crc kubenswrapper[4766]: E1002 13:16:23.885469 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:16:38 crc kubenswrapper[4766]: I1002 13:16:38.883754 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:16:38 crc kubenswrapper[4766]: E1002 13:16:38.885045 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:16:51 crc kubenswrapper[4766]: I1002 13:16:51.882449 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:16:51 crc kubenswrapper[4766]: E1002 13:16:51.884013 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 13:17:01 crc kubenswrapper[4766]: I1002 13:17:01.947277 4766 generic.go:334] "Generic (PLEG): container finished" podID="f9175f1f-c2e8-4454-86c2-5e4c795834b1" containerID="ae33d1dcab929ad54087712a2e1b6c69ffe4f3f9a926ce18a63d3c33aa21271a" exitCode=2
Oct 02 13:17:01 crc kubenswrapper[4766]: I1002 13:17:01.947381 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" event={"ID":"f9175f1f-c2e8-4454-86c2-5e4c795834b1","Type":"ContainerDied","Data":"ae33d1dcab929ad54087712a2e1b6c69ffe4f3f9a926ce18a63d3c33aa21271a"}
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.505655 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm"
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.609878 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-migration-ssh-key-0\") pod \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") "
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.610004 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-compute-config-1\") pod \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") "
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.610133 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6mfm\" (UniqueName: \"kubernetes.io/projected/f9175f1f-c2e8-4454-86c2-5e4c795834b1-kube-api-access-r6mfm\") pod \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") "
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.610237 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cells-global-config-0\") pod \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") "
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.610348 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-ceph\") pod \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") "
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.610437 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cells-global-config-1\") pod \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") "
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.610478 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-ssh-key\") pod \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") "
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.610527 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-inventory\") pod \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") "
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.610567 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-compute-config-0\") pod \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") "
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.610705 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-migration-ssh-key-1\") pod \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") "
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.610803 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-combined-ca-bundle\") pod \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\" (UID: \"f9175f1f-c2e8-4454-86c2-5e4c795834b1\") "
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.618353 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "f9175f1f-c2e8-4454-86c2-5e4c795834b1" (UID: "f9175f1f-c2e8-4454-86c2-5e4c795834b1"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.622257 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9175f1f-c2e8-4454-86c2-5e4c795834b1-kube-api-access-r6mfm" (OuterVolumeSpecName: "kube-api-access-r6mfm") pod "f9175f1f-c2e8-4454-86c2-5e4c795834b1" (UID: "f9175f1f-c2e8-4454-86c2-5e4c795834b1"). InnerVolumeSpecName "kube-api-access-r6mfm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.622473 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-ceph" (OuterVolumeSpecName: "ceph") pod "f9175f1f-c2e8-4454-86c2-5e4c795834b1" (UID: "f9175f1f-c2e8-4454-86c2-5e4c795834b1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.646206 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-inventory" (OuterVolumeSpecName: "inventory") pod "f9175f1f-c2e8-4454-86c2-5e4c795834b1" (UID: "f9175f1f-c2e8-4454-86c2-5e4c795834b1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.650176 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "f9175f1f-c2e8-4454-86c2-5e4c795834b1" (UID: "f9175f1f-c2e8-4454-86c2-5e4c795834b1"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.651583 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "f9175f1f-c2e8-4454-86c2-5e4c795834b1" (UID: "f9175f1f-c2e8-4454-86c2-5e4c795834b1"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.654711 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "f9175f1f-c2e8-4454-86c2-5e4c795834b1" (UID: "f9175f1f-c2e8-4454-86c2-5e4c795834b1"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.655641 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "f9175f1f-c2e8-4454-86c2-5e4c795834b1" (UID: "f9175f1f-c2e8-4454-86c2-5e4c795834b1"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.665030 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f9175f1f-c2e8-4454-86c2-5e4c795834b1" (UID: "f9175f1f-c2e8-4454-86c2-5e4c795834b1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.666666 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "f9175f1f-c2e8-4454-86c2-5e4c795834b1" (UID: "f9175f1f-c2e8-4454-86c2-5e4c795834b1"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.668496 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "f9175f1f-c2e8-4454-86c2-5e4c795834b1" (UID: "f9175f1f-c2e8-4454-86c2-5e4c795834b1"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.714884 4766 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.714935 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.714949 4766 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.714962 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.714971 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.714979 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6mfm\" (UniqueName: \"kubernetes.io/projected/f9175f1f-c2e8-4454-86c2-5e4c795834b1-kube-api-access-r6mfm\") on node \"crc\" DevicePath \"\""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.714998 4766 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-ceph\") on node \"crc\" DevicePath \"\""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.715008 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.715017 4766 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.715026 4766 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-inventory\") on node \"crc\" DevicePath \"\""
Oct 02 13:17:03 crc kubenswrapper[4766]: I1002 13:17:03.715035 4766 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f9175f1f-c2e8-4454-86c2-5e4c795834b1-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Oct 02 13:17:04 crc kubenswrapper[4766]: I1002 13:17:04.001599 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm" event={"ID":"f9175f1f-c2e8-4454-86c2-5e4c795834b1","Type":"ContainerDied","Data":"3b149d1e35221647f7021e7212b298e094849f1b598c1875a74af413770c037f"}
Oct 02 13:17:04 crc kubenswrapper[4766]: I1002 13:17:04.001707 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-4npcm"
Oct 02 13:17:04 crc kubenswrapper[4766]: I1002 13:17:04.001732 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b149d1e35221647f7021e7212b298e094849f1b598c1875a74af413770c037f"
Oct 02 13:17:06 crc kubenswrapper[4766]: I1002 13:17:06.882062 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8"
Oct 02 13:17:08 crc kubenswrapper[4766]: I1002 13:17:08.064886 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"ebffd89870cf914fa65f0c122b373b894237020d282dbb532b184737bc6c17ba"}
Oct 02 13:17:52 crc kubenswrapper[4766]: I1002 13:17:52.009982 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-thb7m"]
Oct 02 13:17:52 crc kubenswrapper[4766]: E1002 13:17:52.011900 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9175f1f-c2e8-4454-86c2-5e4c795834b1" containerName="nova-cell1-openstack-openstack-cell1"
Oct 02 13:17:52 crc kubenswrapper[4766]: I1002 13:17:52.011937 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9175f1f-c2e8-4454-86c2-5e4c795834b1" containerName="nova-cell1-openstack-openstack-cell1"
Oct 02 13:17:52 crc kubenswrapper[4766]: I1002 13:17:52.012666 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9175f1f-c2e8-4454-86c2-5e4c795834b1" containerName="nova-cell1-openstack-openstack-cell1"
Oct 02 13:17:52 crc kubenswrapper[4766]: I1002 13:17:52.016647 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-thb7m"
Oct 02 13:17:52 crc kubenswrapper[4766]: I1002 13:17:52.025025 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-thb7m"]
Oct 02 13:17:52 crc kubenswrapper[4766]: I1002 13:17:52.156648 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb6ef02-f60f-41da-9afc-588dc18c7882-catalog-content\") pod \"certified-operators-thb7m\" (UID: \"ebb6ef02-f60f-41da-9afc-588dc18c7882\") " pod="openshift-marketplace/certified-operators-thb7m"
Oct 02 13:17:52 crc kubenswrapper[4766]: I1002 13:17:52.157023 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb6ef02-f60f-41da-9afc-588dc18c7882-utilities\") pod \"certified-operators-thb7m\" (UID: \"ebb6ef02-f60f-41da-9afc-588dc18c7882\") " pod="openshift-marketplace/certified-operators-thb7m"
Oct 02 13:17:52 crc kubenswrapper[4766]: I1002 13:17:52.157579 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h297m\" (UniqueName: \"kubernetes.io/projected/ebb6ef02-f60f-41da-9afc-588dc18c7882-kube-api-access-h297m\") pod \"certified-operators-thb7m\" (UID: \"ebb6ef02-f60f-41da-9afc-588dc18c7882\") " pod="openshift-marketplace/certified-operators-thb7m"
Oct 02 13:17:52 crc kubenswrapper[4766]: I1002 13:17:52.259626 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h297m\" (UniqueName: \"kubernetes.io/projected/ebb6ef02-f60f-41da-9afc-588dc18c7882-kube-api-access-h297m\") pod \"certified-operators-thb7m\" (UID: \"ebb6ef02-f60f-41da-9afc-588dc18c7882\") " pod="openshift-marketplace/certified-operators-thb7m"
Oct 02 13:17:52 crc kubenswrapper[4766]: I1002 13:17:52.259706 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb6ef02-f60f-41da-9afc-588dc18c7882-catalog-content\") pod \"certified-operators-thb7m\" (UID: \"ebb6ef02-f60f-41da-9afc-588dc18c7882\") " pod="openshift-marketplace/certified-operators-thb7m"
Oct 02 13:17:52 crc kubenswrapper[4766]: I1002 13:17:52.259847 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb6ef02-f60f-41da-9afc-588dc18c7882-utilities\") pod \"certified-operators-thb7m\" (UID: \"ebb6ef02-f60f-41da-9afc-588dc18c7882\") " pod="openshift-marketplace/certified-operators-thb7m"
Oct 02 13:17:52 crc kubenswrapper[4766]: I1002 13:17:52.260196 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb6ef02-f60f-41da-9afc-588dc18c7882-catalog-content\") pod \"certified-operators-thb7m\" (UID: \"ebb6ef02-f60f-41da-9afc-588dc18c7882\") " pod="openshift-marketplace/certified-operators-thb7m"
Oct 02 13:17:52 crc kubenswrapper[4766]: I1002 13:17:52.260453 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb6ef02-f60f-41da-9afc-588dc18c7882-utilities\") pod \"certified-operators-thb7m\" (UID: \"ebb6ef02-f60f-41da-9afc-588dc18c7882\") " pod="openshift-marketplace/certified-operators-thb7m"
Oct 02 13:17:52 crc kubenswrapper[4766]: I1002 13:17:52.283437 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h297m\" (UniqueName: \"kubernetes.io/projected/ebb6ef02-f60f-41da-9afc-588dc18c7882-kube-api-access-h297m\") pod \"certified-operators-thb7m\" (UID: \"ebb6ef02-f60f-41da-9afc-588dc18c7882\") " pod="openshift-marketplace/certified-operators-thb7m"
Oct 02 13:17:52 crc kubenswrapper[4766]: I1002 13:17:52.342241 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-thb7m"
Oct 02 13:17:52 crc kubenswrapper[4766]: I1002 13:17:52.932243 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-thb7m"]
Oct 02 13:17:53 crc kubenswrapper[4766]: I1002 13:17:53.675018 4766 generic.go:334] "Generic (PLEG): container finished" podID="ebb6ef02-f60f-41da-9afc-588dc18c7882" containerID="838d300bb4daee09ed25344b659b92b90c86b9a5a6be2551d7f6bea4b2334c25" exitCode=0
Oct 02 13:17:53 crc kubenswrapper[4766]: I1002 13:17:53.675535 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thb7m" event={"ID":"ebb6ef02-f60f-41da-9afc-588dc18c7882","Type":"ContainerDied","Data":"838d300bb4daee09ed25344b659b92b90c86b9a5a6be2551d7f6bea4b2334c25"}
Oct 02 13:17:53 crc kubenswrapper[4766]: I1002 13:17:53.676656 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thb7m" event={"ID":"ebb6ef02-f60f-41da-9afc-588dc18c7882","Type":"ContainerStarted","Data":"00b803bcee6dde69a8253195423d0ce9c3d43e454ca7bdb19c66b6e80ba1bbad"}
Oct 02 13:17:54 crc kubenswrapper[4766]: I1002 13:17:54.690070 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thb7m" event={"ID":"ebb6ef02-f60f-41da-9afc-588dc18c7882","Type":"ContainerStarted","Data":"91deaad039d5d0e41648c148c26d3e1b4fd092da1a0e78724885e6002941d816"}
Oct 02 13:17:56 crc kubenswrapper[4766]: I1002 13:17:56.715387 4766 generic.go:334] "Generic (PLEG): container finished" podID="ebb6ef02-f60f-41da-9afc-588dc18c7882" containerID="91deaad039d5d0e41648c148c26d3e1b4fd092da1a0e78724885e6002941d816" exitCode=0
Oct 02 13:17:56 crc kubenswrapper[4766]: I1002 13:17:56.715696 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thb7m" event={"ID":"ebb6ef02-f60f-41da-9afc-588dc18c7882","Type":"ContainerDied","Data":"91deaad039d5d0e41648c148c26d3e1b4fd092da1a0e78724885e6002941d816"}
Oct 02 13:17:57 crc kubenswrapper[4766]: I1002 13:17:57.741454 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thb7m" event={"ID":"ebb6ef02-f60f-41da-9afc-588dc18c7882","Type":"ContainerStarted","Data":"5bc141a856f4f595682faccc2789c7ff05ed268428266f06e4d9ba47f66294ad"}
Oct 02 13:17:57 crc kubenswrapper[4766]: I1002 13:17:57.777304 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-thb7m" podStartSLOduration=3.328604688 podStartE2EDuration="6.777278422s" podCreationTimestamp="2025-10-02 13:17:51 +0000 UTC" firstStartedPulling="2025-10-02 13:17:53.679017046 +0000 UTC m=+8788.621888030" lastFinishedPulling="2025-10-02 13:17:57.12769079 +0000 UTC m=+8792.070561764" observedRunningTime="2025-10-02 13:17:57.757005322 +0000 UTC m=+8792.699876276" watchObservedRunningTime="2025-10-02 13:17:57.777278422 +0000 UTC m=+8792.720149406"
Oct 02 13:17:58 crc kubenswrapper[4766]: I1002 13:17:58.549968 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7gs9r/must-gather-glrbt"]
pods=["openshift-must-gather-7gs9r/must-gather-glrbt"] Oct 02 13:17:58 crc kubenswrapper[4766]: I1002 13:17:58.551920 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gs9r/must-gather-glrbt" Oct 02 13:17:58 crc kubenswrapper[4766]: I1002 13:17:58.559410 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7gs9r/must-gather-glrbt"] Oct 02 13:17:58 crc kubenswrapper[4766]: I1002 13:17:58.561349 4766 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7gs9r"/"default-dockercfg-wpkz8" Oct 02 13:17:58 crc kubenswrapper[4766]: I1002 13:17:58.561384 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7gs9r"/"openshift-service-ca.crt" Oct 02 13:17:58 crc kubenswrapper[4766]: I1002 13:17:58.561400 4766 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7gs9r"/"kube-root-ca.crt" Oct 02 13:17:58 crc kubenswrapper[4766]: I1002 13:17:58.620701 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cb3e12a5-d177-4e17-9cb4-71e3efeb1c36-must-gather-output\") pod \"must-gather-glrbt\" (UID: \"cb3e12a5-d177-4e17-9cb4-71e3efeb1c36\") " pod="openshift-must-gather-7gs9r/must-gather-glrbt" Oct 02 13:17:58 crc kubenswrapper[4766]: I1002 13:17:58.620918 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zxkf\" (UniqueName: \"kubernetes.io/projected/cb3e12a5-d177-4e17-9cb4-71e3efeb1c36-kube-api-access-6zxkf\") pod \"must-gather-glrbt\" (UID: \"cb3e12a5-d177-4e17-9cb4-71e3efeb1c36\") " pod="openshift-must-gather-7gs9r/must-gather-glrbt" Oct 02 13:17:58 crc kubenswrapper[4766]: I1002 13:17:58.722294 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zxkf\" (UniqueName: \"kubernetes.io/projected/cb3e12a5-d177-4e17-9cb4-71e3efeb1c36-kube-api-access-6zxkf\") pod \"must-gather-glrbt\" (UID: \"cb3e12a5-d177-4e17-9cb4-71e3efeb1c36\") " pod="openshift-must-gather-7gs9r/must-gather-glrbt" Oct 02 13:17:58 crc kubenswrapper[4766]: I1002 13:17:58.722399 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cb3e12a5-d177-4e17-9cb4-71e3efeb1c36-must-gather-output\") pod \"must-gather-glrbt\" (UID: \"cb3e12a5-d177-4e17-9cb4-71e3efeb1c36\") " pod="openshift-must-gather-7gs9r/must-gather-glrbt" Oct 02 13:17:58 crc kubenswrapper[4766]: I1002 13:17:58.722862 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cb3e12a5-d177-4e17-9cb4-71e3efeb1c36-must-gather-output\") pod \"must-gather-glrbt\" (UID: \"cb3e12a5-d177-4e17-9cb4-71e3efeb1c36\") " pod="openshift-must-gather-7gs9r/must-gather-glrbt" Oct 02 13:17:58 crc kubenswrapper[4766]: I1002 13:17:58.741706 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zxkf\" (UniqueName: \"kubernetes.io/projected/cb3e12a5-d177-4e17-9cb4-71e3efeb1c36-kube-api-access-6zxkf\") pod \"must-gather-glrbt\" (UID: \"cb3e12a5-d177-4e17-9cb4-71e3efeb1c36\") " pod="openshift-must-gather-7gs9r/must-gather-glrbt" Oct 02 13:17:58 crc kubenswrapper[4766]: I1002 13:17:58.874694 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7gs9r/must-gather-glrbt" Oct 02 13:17:59 crc kubenswrapper[4766]: I1002 13:17:59.370865 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7gs9r/must-gather-glrbt"] Oct 02 13:17:59 crc kubenswrapper[4766]: W1002 13:17:59.377355 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb3e12a5_d177_4e17_9cb4_71e3efeb1c36.slice/crio-d4a5a5211a4534f6c6766b0cb9a1fb55803bb0d5e34b2a0166e9dd8d96398b6f WatchSource:0}: Error finding container d4a5a5211a4534f6c6766b0cb9a1fb55803bb0d5e34b2a0166e9dd8d96398b6f: Status 404 returned error can't find the container with id d4a5a5211a4534f6c6766b0cb9a1fb55803bb0d5e34b2a0166e9dd8d96398b6f Oct 02 13:17:59 crc kubenswrapper[4766]: I1002 13:17:59.772813 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gs9r/must-gather-glrbt" event={"ID":"cb3e12a5-d177-4e17-9cb4-71e3efeb1c36","Type":"ContainerStarted","Data":"d4a5a5211a4534f6c6766b0cb9a1fb55803bb0d5e34b2a0166e9dd8d96398b6f"} Oct 02 13:18:02 crc kubenswrapper[4766]: I1002 13:18:02.342740 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-thb7m" Oct 02 13:18:02 crc kubenswrapper[4766]: I1002 13:18:02.345059 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-thb7m" Oct 02 13:18:02 crc kubenswrapper[4766]: I1002 13:18:02.395847 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-thb7m" Oct 02 13:18:02 crc kubenswrapper[4766]: I1002 13:18:02.915341 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-thb7m" Oct 02 13:18:02 crc kubenswrapper[4766]: I1002 13:18:02.968673 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-thb7m"] Oct 02 13:18:04 crc kubenswrapper[4766]: I1002 13:18:04.834956 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-thb7m" podUID="ebb6ef02-f60f-41da-9afc-588dc18c7882" containerName="registry-server" containerID="cri-o://5bc141a856f4f595682faccc2789c7ff05ed268428266f06e4d9ba47f66294ad" gracePeriod=2 Oct 02 13:18:05 crc kubenswrapper[4766]: I1002 13:18:05.847854 4766 generic.go:334] "Generic (PLEG): container finished" podID="ebb6ef02-f60f-41da-9afc-588dc18c7882" containerID="5bc141a856f4f595682faccc2789c7ff05ed268428266f06e4d9ba47f66294ad" exitCode=0 Oct 02 13:18:05 crc kubenswrapper[4766]: I1002 13:18:05.847911 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thb7m" event={"ID":"ebb6ef02-f60f-41da-9afc-588dc18c7882","Type":"ContainerDied","Data":"5bc141a856f4f595682faccc2789c7ff05ed268428266f06e4d9ba47f66294ad"} Oct 02 13:18:06 crc kubenswrapper[4766]: I1002 13:18:06.862488 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thb7m" event={"ID":"ebb6ef02-f60f-41da-9afc-588dc18c7882","Type":"ContainerDied","Data":"00b803bcee6dde69a8253195423d0ce9c3d43e454ca7bdb19c66b6e80ba1bbad"} Oct 02 13:18:06 crc kubenswrapper[4766]: I1002 13:18:06.862827 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00b803bcee6dde69a8253195423d0ce9c3d43e454ca7bdb19c66b6e80ba1bbad" Oct 02 13:18:06 crc 
kubenswrapper[4766]: I1002 13:18:06.913875 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-thb7m" Oct 02 13:18:07 crc kubenswrapper[4766]: I1002 13:18:07.028974 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb6ef02-f60f-41da-9afc-588dc18c7882-utilities\") pod \"ebb6ef02-f60f-41da-9afc-588dc18c7882\" (UID: \"ebb6ef02-f60f-41da-9afc-588dc18c7882\") " Oct 02 13:18:07 crc kubenswrapper[4766]: I1002 13:18:07.029320 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h297m\" (UniqueName: \"kubernetes.io/projected/ebb6ef02-f60f-41da-9afc-588dc18c7882-kube-api-access-h297m\") pod \"ebb6ef02-f60f-41da-9afc-588dc18c7882\" (UID: \"ebb6ef02-f60f-41da-9afc-588dc18c7882\") " Oct 02 13:18:07 crc kubenswrapper[4766]: I1002 13:18:07.029486 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb6ef02-f60f-41da-9afc-588dc18c7882-catalog-content\") pod \"ebb6ef02-f60f-41da-9afc-588dc18c7882\" (UID: \"ebb6ef02-f60f-41da-9afc-588dc18c7882\") " Oct 02 13:18:07 crc kubenswrapper[4766]: I1002 13:18:07.029905 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebb6ef02-f60f-41da-9afc-588dc18c7882-utilities" (OuterVolumeSpecName: "utilities") pod "ebb6ef02-f60f-41da-9afc-588dc18c7882" (UID: "ebb6ef02-f60f-41da-9afc-588dc18c7882"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:18:07 crc kubenswrapper[4766]: I1002 13:18:07.030742 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb6ef02-f60f-41da-9afc-588dc18c7882-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:07 crc kubenswrapper[4766]: I1002 13:18:07.037136 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebb6ef02-f60f-41da-9afc-588dc18c7882-kube-api-access-h297m" (OuterVolumeSpecName: "kube-api-access-h297m") pod "ebb6ef02-f60f-41da-9afc-588dc18c7882" (UID: "ebb6ef02-f60f-41da-9afc-588dc18c7882"). InnerVolumeSpecName "kube-api-access-h297m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:18:07 crc kubenswrapper[4766]: I1002 13:18:07.095480 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebb6ef02-f60f-41da-9afc-588dc18c7882-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebb6ef02-f60f-41da-9afc-588dc18c7882" (UID: "ebb6ef02-f60f-41da-9afc-588dc18c7882"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:18:07 crc kubenswrapper[4766]: I1002 13:18:07.132605 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h297m\" (UniqueName: \"kubernetes.io/projected/ebb6ef02-f60f-41da-9afc-588dc18c7882-kube-api-access-h297m\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:07 crc kubenswrapper[4766]: I1002 13:18:07.132641 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb6ef02-f60f-41da-9afc-588dc18c7882-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:18:07 crc kubenswrapper[4766]: I1002 13:18:07.877309 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gs9r/must-gather-glrbt" event={"ID":"cb3e12a5-d177-4e17-9cb4-71e3efeb1c36","Type":"ContainerStarted","Data":"4690fa4dd4d6f0b89a1f426dfe000902a254a54805de0be734be6c9db6d7584e"} Oct 02 13:18:07 crc kubenswrapper[4766]: I1002 13:18:07.877685 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gs9r/must-gather-glrbt" event={"ID":"cb3e12a5-d177-4e17-9cb4-71e3efeb1c36","Type":"ContainerStarted","Data":"fdc7b82e259c85e4ab6f57117e8ba0086f719250fd8c7f488df7cc4e3a946f8e"} Oct 02 13:18:07 crc kubenswrapper[4766]: I1002 13:18:07.877344 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-thb7m" Oct 02 13:18:07 crc kubenswrapper[4766]: I1002 13:18:07.899662 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7gs9r/must-gather-glrbt" podStartSLOduration=3.006538656 podStartE2EDuration="9.899640407s" podCreationTimestamp="2025-10-02 13:17:58 +0000 UTC" firstStartedPulling="2025-10-02 13:17:59.379867819 +0000 UTC m=+8794.322738763" lastFinishedPulling="2025-10-02 13:18:06.272969559 +0000 UTC m=+8801.215840514" observedRunningTime="2025-10-02 13:18:07.896839728 +0000 UTC m=+8802.839710702" watchObservedRunningTime="2025-10-02 13:18:07.899640407 +0000 UTC m=+8802.842511351" Oct 02 13:18:07 crc kubenswrapper[4766]: I1002 13:18:07.939089 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-thb7m"] Oct 02 13:18:07 crc kubenswrapper[4766]: I1002 13:18:07.957430 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-thb7m"] Oct 02 13:18:09 crc kubenswrapper[4766]: I1002 13:18:09.896202 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebb6ef02-f60f-41da-9afc-588dc18c7882" path="/var/lib/kubelet/pods/ebb6ef02-f60f-41da-9afc-588dc18c7882/volumes" Oct 02 13:18:13 crc kubenswrapper[4766]: I1002 13:18:13.204409 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7gs9r/crc-debug-98t8k"] Oct 02 13:18:13 crc kubenswrapper[4766]: E1002 13:18:13.205268 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb6ef02-f60f-41da-9afc-588dc18c7882" containerName="registry-server" Oct 02 13:18:13 crc kubenswrapper[4766]: I1002 13:18:13.205310 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb6ef02-f60f-41da-9afc-588dc18c7882" containerName="registry-server" Oct 02 13:18:13 crc kubenswrapper[4766]: E1002 13:18:13.205344 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb6ef02-f60f-41da-9afc-588dc18c7882" containerName="extract-utilities" Oct 02 13:18:13 crc kubenswrapper[4766]: I1002 13:18:13.205352 4766 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ebb6ef02-f60f-41da-9afc-588dc18c7882" containerName="extract-utilities" Oct 02 13:18:13 crc kubenswrapper[4766]: E1002 13:18:13.205365 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebb6ef02-f60f-41da-9afc-588dc18c7882" containerName="extract-content" Oct 02 13:18:13 crc kubenswrapper[4766]: I1002 13:18:13.205371 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebb6ef02-f60f-41da-9afc-588dc18c7882" containerName="extract-content" Oct 02 13:18:13 crc kubenswrapper[4766]: I1002 13:18:13.205621 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebb6ef02-f60f-41da-9afc-588dc18c7882" containerName="registry-server" Oct 02 13:18:13 crc kubenswrapper[4766]: I1002 13:18:13.206351 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gs9r/crc-debug-98t8k" Oct 02 13:18:13 crc kubenswrapper[4766]: I1002 13:18:13.362377 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttssr\" (UniqueName: \"kubernetes.io/projected/eba16009-2412-4fac-95fe-6a5464c43608-kube-api-access-ttssr\") pod \"crc-debug-98t8k\" (UID: \"eba16009-2412-4fac-95fe-6a5464c43608\") " pod="openshift-must-gather-7gs9r/crc-debug-98t8k" Oct 02 13:18:13 crc kubenswrapper[4766]: I1002 13:18:13.362792 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eba16009-2412-4fac-95fe-6a5464c43608-host\") pod \"crc-debug-98t8k\" (UID: \"eba16009-2412-4fac-95fe-6a5464c43608\") " pod="openshift-must-gather-7gs9r/crc-debug-98t8k" Oct 02 13:18:13 crc kubenswrapper[4766]: I1002 13:18:13.464920 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttssr\" (UniqueName: \"kubernetes.io/projected/eba16009-2412-4fac-95fe-6a5464c43608-kube-api-access-ttssr\") pod \"crc-debug-98t8k\" (UID: \"eba16009-2412-4fac-95fe-6a5464c43608\") " pod="openshift-must-gather-7gs9r/crc-debug-98t8k" Oct 02 13:18:13 crc kubenswrapper[4766]: I1002 13:18:13.465068 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eba16009-2412-4fac-95fe-6a5464c43608-host\") pod \"crc-debug-98t8k\" (UID: \"eba16009-2412-4fac-95fe-6a5464c43608\") " pod="openshift-must-gather-7gs9r/crc-debug-98t8k" Oct 02 13:18:13 crc kubenswrapper[4766]: I1002 13:18:13.465215 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eba16009-2412-4fac-95fe-6a5464c43608-host\") pod \"crc-debug-98t8k\" (UID: \"eba16009-2412-4fac-95fe-6a5464c43608\") " pod="openshift-must-gather-7gs9r/crc-debug-98t8k" Oct 02 13:18:13 crc kubenswrapper[4766]: I1002 13:18:13.486488 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttssr\" (UniqueName: \"kubernetes.io/projected/eba16009-2412-4fac-95fe-6a5464c43608-kube-api-access-ttssr\") pod \"crc-debug-98t8k\" (UID: \"eba16009-2412-4fac-95fe-6a5464c43608\") " pod="openshift-must-gather-7gs9r/crc-debug-98t8k" Oct 02 13:18:13 crc kubenswrapper[4766]: I1002 13:18:13.525577 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7gs9r/crc-debug-98t8k" Oct 02 13:18:13 crc kubenswrapper[4766]: I1002 13:18:13.944438 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gs9r/crc-debug-98t8k" event={"ID":"eba16009-2412-4fac-95fe-6a5464c43608","Type":"ContainerStarted","Data":"4996b5d45e08cb7fa30691ed0ab45e9285c711656f54a0dd1b1b5359581f319a"} Oct 02 13:18:26 crc kubenswrapper[4766]: I1002 13:18:26.109738 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gs9r/crc-debug-98t8k" event={"ID":"eba16009-2412-4fac-95fe-6a5464c43608","Type":"ContainerStarted","Data":"cc8362feff2618262831769a37ba173901c1f43ac901092733ed8d99b6848493"} Oct 02 13:18:26 crc kubenswrapper[4766]: I1002 13:18:26.128347 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7gs9r/crc-debug-98t8k" podStartSLOduration=1.888540929 podStartE2EDuration="13.128326246s" podCreationTimestamp="2025-10-02 13:18:13 +0000 UTC" firstStartedPulling="2025-10-02 13:18:13.572445702 +0000 UTC m=+8808.515316646" lastFinishedPulling="2025-10-02 13:18:24.812231029 +0000 UTC m=+8819.755101963" observedRunningTime="2025-10-02 13:18:26.123671757 +0000 UTC m=+8821.066542711" watchObservedRunningTime="2025-10-02 13:18:26.128326246 +0000 UTC m=+8821.071197200" Oct 02 13:18:47 crc kubenswrapper[4766]: I1002 13:18:47.472109 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wrkpd"] Oct 02 13:18:47 crc kubenswrapper[4766]: I1002 13:18:47.491316 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wrkpd"] Oct 02 13:18:47 crc kubenswrapper[4766]: I1002 13:18:47.491489 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:18:47 crc kubenswrapper[4766]: I1002 13:18:47.600324 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dbdf152-6a29-4c12-9986-25d3112a9618-utilities\") pod \"community-operators-wrkpd\" (UID: \"7dbdf152-6a29-4c12-9986-25d3112a9618\") " pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:18:47 crc kubenswrapper[4766]: I1002 13:18:47.600401 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dbdf152-6a29-4c12-9986-25d3112a9618-catalog-content\") pod \"community-operators-wrkpd\" (UID: \"7dbdf152-6a29-4c12-9986-25d3112a9618\") " pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:18:47 crc kubenswrapper[4766]: I1002 13:18:47.600886 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsqrl\" (UniqueName: \"kubernetes.io/projected/7dbdf152-6a29-4c12-9986-25d3112a9618-kube-api-access-hsqrl\") pod \"community-operators-wrkpd\" (UID: \"7dbdf152-6a29-4c12-9986-25d3112a9618\") " pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:18:47 crc kubenswrapper[4766]: I1002 13:18:47.702928 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dbdf152-6a29-4c12-9986-25d3112a9618-catalog-content\") pod \"community-operators-wrkpd\" (UID: \"7dbdf152-6a29-4c12-9986-25d3112a9618\") " pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:18:47 crc kubenswrapper[4766]: I1002 13:18:47.703083 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsqrl\" (UniqueName: \"kubernetes.io/projected/7dbdf152-6a29-4c12-9986-25d3112a9618-kube-api-access-hsqrl\") pod \"community-operators-wrkpd\" (UID: \"7dbdf152-6a29-4c12-9986-25d3112a9618\") " pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:18:47 crc kubenswrapper[4766]: I1002 13:18:47.703139 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dbdf152-6a29-4c12-9986-25d3112a9618-utilities\") pod \"community-operators-wrkpd\" (UID: \"7dbdf152-6a29-4c12-9986-25d3112a9618\") " pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:18:47 crc kubenswrapper[4766]: I1002 13:18:47.703685 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dbdf152-6a29-4c12-9986-25d3112a9618-utilities\") pod \"community-operators-wrkpd\" (UID: \"7dbdf152-6a29-4c12-9986-25d3112a9618\") " pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:18:47 crc kubenswrapper[4766]: I1002 13:18:47.703759 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dbdf152-6a29-4c12-9986-25d3112a9618-catalog-content\") pod \"community-operators-wrkpd\" (UID: \"7dbdf152-6a29-4c12-9986-25d3112a9618\") " pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:18:47 crc kubenswrapper[4766]: I1002 13:18:47.723100 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsqrl\" (UniqueName: \"kubernetes.io/projected/7dbdf152-6a29-4c12-9986-25d3112a9618-kube-api-access-hsqrl\") pod 
\"community-operators-wrkpd\" (UID: \"7dbdf152-6a29-4c12-9986-25d3112a9618\") " pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:18:47 crc kubenswrapper[4766]: I1002 13:18:47.831857 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:18:48 crc kubenswrapper[4766]: I1002 13:18:48.448382 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wrkpd"] Oct 02 13:18:49 crc kubenswrapper[4766]: I1002 13:18:49.385483 4766 generic.go:334] "Generic (PLEG): container finished" podID="7dbdf152-6a29-4c12-9986-25d3112a9618" containerID="21391aa88db6bb2340e8d3df0a84305846d1e983f598cbf45c5a1860fa79f710" exitCode=0 Oct 02 13:18:49 crc kubenswrapper[4766]: I1002 13:18:49.385711 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrkpd" event={"ID":"7dbdf152-6a29-4c12-9986-25d3112a9618","Type":"ContainerDied","Data":"21391aa88db6bb2340e8d3df0a84305846d1e983f598cbf45c5a1860fa79f710"} Oct 02 13:18:49 crc kubenswrapper[4766]: I1002 13:18:49.385850 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrkpd" event={"ID":"7dbdf152-6a29-4c12-9986-25d3112a9618","Type":"ContainerStarted","Data":"19419877b10f30c86dcd3c24689491583796e560f806e5fed15fc5b0150d896f"} Oct 02 13:18:51 crc kubenswrapper[4766]: I1002 13:18:51.415286 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrkpd" event={"ID":"7dbdf152-6a29-4c12-9986-25d3112a9618","Type":"ContainerStarted","Data":"cf9b27992965ad8f8ab60368b18dd029faf0818efccb2dbdba87c8c637db0cda"} Oct 02 13:18:53 crc kubenswrapper[4766]: I1002 13:18:53.460369 4766 generic.go:334] "Generic (PLEG): container finished" podID="7dbdf152-6a29-4c12-9986-25d3112a9618" containerID="cf9b27992965ad8f8ab60368b18dd029faf0818efccb2dbdba87c8c637db0cda" exitCode=0 Oct 02 13:18:53 crc kubenswrapper[4766]: I1002 13:18:53.460454 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrkpd" event={"ID":"7dbdf152-6a29-4c12-9986-25d3112a9618","Type":"ContainerDied","Data":"cf9b27992965ad8f8ab60368b18dd029faf0818efccb2dbdba87c8c637db0cda"} Oct 02 13:18:54 crc kubenswrapper[4766]: I1002 13:18:54.472244 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrkpd" event={"ID":"7dbdf152-6a29-4c12-9986-25d3112a9618","Type":"ContainerStarted","Data":"621e9e07d3f74109d11150f513147f2e81c57e27af168660d218b0c7507b16b2"} Oct 02 13:18:54 crc kubenswrapper[4766]: I1002 13:18:54.489966 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wrkpd" podStartSLOduration=2.971902442 podStartE2EDuration="7.489944388s" podCreationTimestamp="2025-10-02 13:18:47 +0000 UTC" firstStartedPulling="2025-10-02 13:18:49.387780868 +0000 UTC m=+8844.330651812" lastFinishedPulling="2025-10-02 13:18:53.905822814 +0000 UTC m=+8848.848693758" observedRunningTime="2025-10-02 13:18:54.488619847 +0000 UTC m=+8849.431490801" watchObservedRunningTime="2025-10-02 13:18:54.489944388 +0000 UTC m=+8849.432815342" Oct 02 13:18:57 crc kubenswrapper[4766]: I1002 13:18:57.832624 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:18:57 crc kubenswrapper[4766]: I1002 13:18:57.833200 4766 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:18:57 crc kubenswrapper[4766]: I1002 13:18:57.916154 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:18:59 crc kubenswrapper[4766]: I1002 13:18:59.602391 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:18:59 crc kubenswrapper[4766]: I1002 13:18:59.675393 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wrkpd"] Oct 02 13:19:01 crc kubenswrapper[4766]: I1002 13:19:01.561161 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wrkpd" podUID="7dbdf152-6a29-4c12-9986-25d3112a9618" containerName="registry-server" containerID="cri-o://621e9e07d3f74109d11150f513147f2e81c57e27af168660d218b0c7507b16b2" gracePeriod=2 Oct 02 13:19:01 crc kubenswrapper[4766]: E1002 13:19:01.744224 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dbdf152_6a29_4c12_9986_25d3112a9618.slice/crio-621e9e07d3f74109d11150f513147f2e81c57e27af168660d218b0c7507b16b2.scope\": RecentStats: unable to find data in memory cache]" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.186386 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.235921 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsqrl\" (UniqueName: \"kubernetes.io/projected/7dbdf152-6a29-4c12-9986-25d3112a9618-kube-api-access-hsqrl\") pod \"7dbdf152-6a29-4c12-9986-25d3112a9618\" (UID: \"7dbdf152-6a29-4c12-9986-25d3112a9618\") " Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.236013 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dbdf152-6a29-4c12-9986-25d3112a9618-catalog-content\") pod \"7dbdf152-6a29-4c12-9986-25d3112a9618\" (UID: \"7dbdf152-6a29-4c12-9986-25d3112a9618\") " Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.236054 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dbdf152-6a29-4c12-9986-25d3112a9618-utilities\") pod \"7dbdf152-6a29-4c12-9986-25d3112a9618\" (UID: \"7dbdf152-6a29-4c12-9986-25d3112a9618\") " Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.237621 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dbdf152-6a29-4c12-9986-25d3112a9618-utilities" (OuterVolumeSpecName: "utilities") pod "7dbdf152-6a29-4c12-9986-25d3112a9618" (UID: "7dbdf152-6a29-4c12-9986-25d3112a9618"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.297251 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dbdf152-6a29-4c12-9986-25d3112a9618-kube-api-access-hsqrl" (OuterVolumeSpecName: "kube-api-access-hsqrl") pod "7dbdf152-6a29-4c12-9986-25d3112a9618" (UID: "7dbdf152-6a29-4c12-9986-25d3112a9618"). InnerVolumeSpecName "kube-api-access-hsqrl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.307461 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7dbdf152-6a29-4c12-9986-25d3112a9618-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7dbdf152-6a29-4c12-9986-25d3112a9618" (UID: "7dbdf152-6a29-4c12-9986-25d3112a9618"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.339396 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsqrl\" (UniqueName: \"kubernetes.io/projected/7dbdf152-6a29-4c12-9986-25d3112a9618-kube-api-access-hsqrl\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.339440 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7dbdf152-6a29-4c12-9986-25d3112a9618-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.339450 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7dbdf152-6a29-4c12-9986-25d3112a9618-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.573124 4766 generic.go:334] "Generic (PLEG): container finished" podID="7dbdf152-6a29-4c12-9986-25d3112a9618" containerID="621e9e07d3f74109d11150f513147f2e81c57e27af168660d218b0c7507b16b2" exitCode=0 Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.573424 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrkpd" event={"ID":"7dbdf152-6a29-4c12-9986-25d3112a9618","Type":"ContainerDied","Data":"621e9e07d3f74109d11150f513147f2e81c57e27af168660d218b0c7507b16b2"} Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.573457 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wrkpd" event={"ID":"7dbdf152-6a29-4c12-9986-25d3112a9618","Type":"ContainerDied","Data":"19419877b10f30c86dcd3c24689491583796e560f806e5fed15fc5b0150d896f"} Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.573480 4766 scope.go:117] "RemoveContainer" containerID="621e9e07d3f74109d11150f513147f2e81c57e27af168660d218b0c7507b16b2" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.573684 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wrkpd" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.601802 4766 scope.go:117] "RemoveContainer" containerID="cf9b27992965ad8f8ab60368b18dd029faf0818efccb2dbdba87c8c637db0cda" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.628429 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wrkpd"] Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.646127 4766 scope.go:117] "RemoveContainer" containerID="21391aa88db6bb2340e8d3df0a84305846d1e983f598cbf45c5a1860fa79f710" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.647649 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wrkpd"] Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.684788 4766 scope.go:117] "RemoveContainer" containerID="621e9e07d3f74109d11150f513147f2e81c57e27af168660d218b0c7507b16b2" Oct 02 13:19:02 crc kubenswrapper[4766]: E1002 13:19:02.685414 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"621e9e07d3f74109d11150f513147f2e81c57e27af168660d218b0c7507b16b2\": container with ID starting with 621e9e07d3f74109d11150f513147f2e81c57e27af168660d218b0c7507b16b2 not found: ID does not exist" containerID="621e9e07d3f74109d11150f513147f2e81c57e27af168660d218b0c7507b16b2" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.685444 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"621e9e07d3f74109d11150f513147f2e81c57e27af168660d218b0c7507b16b2"} err="failed to get container status \"621e9e07d3f74109d11150f513147f2e81c57e27af168660d218b0c7507b16b2\": rpc error: code = NotFound desc = could not find container \"621e9e07d3f74109d11150f513147f2e81c57e27af168660d218b0c7507b16b2\": container with ID starting with 621e9e07d3f74109d11150f513147f2e81c57e27af168660d218b0c7507b16b2 not found: ID does not exist" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.685470 4766 scope.go:117] "RemoveContainer" containerID="cf9b27992965ad8f8ab60368b18dd029faf0818efccb2dbdba87c8c637db0cda" Oct 02 13:19:02 crc kubenswrapper[4766]: E1002 13:19:02.685909 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf9b27992965ad8f8ab60368b18dd029faf0818efccb2dbdba87c8c637db0cda\": container with ID starting with cf9b27992965ad8f8ab60368b18dd029faf0818efccb2dbdba87c8c637db0cda not found: ID does not exist" containerID="cf9b27992965ad8f8ab60368b18dd029faf0818efccb2dbdba87c8c637db0cda" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.685931 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9b27992965ad8f8ab60368b18dd029faf0818efccb2dbdba87c8c637db0cda"} err="failed to get container status \"cf9b27992965ad8f8ab60368b18dd029faf0818efccb2dbdba87c8c637db0cda\": rpc error: code = NotFound desc = could not find container \"cf9b27992965ad8f8ab60368b18dd029faf0818efccb2dbdba87c8c637db0cda\": container with ID starting with cf9b27992965ad8f8ab60368b18dd029faf0818efccb2dbdba87c8c637db0cda not found: ID does not exist" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.685946 4766 scope.go:117] "RemoveContainer" containerID="21391aa88db6bb2340e8d3df0a84305846d1e983f598cbf45c5a1860fa79f710" Oct 02 13:19:02 crc kubenswrapper[4766]: E1002 13:19:02.686300 4766 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"21391aa88db6bb2340e8d3df0a84305846d1e983f598cbf45c5a1860fa79f710\": container with ID starting with 21391aa88db6bb2340e8d3df0a84305846d1e983f598cbf45c5a1860fa79f710 not found: ID does not exist" containerID="21391aa88db6bb2340e8d3df0a84305846d1e983f598cbf45c5a1860fa79f710" Oct 02 13:19:02 crc kubenswrapper[4766]: I1002 13:19:02.686321 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21391aa88db6bb2340e8d3df0a84305846d1e983f598cbf45c5a1860fa79f710"} err="failed to get container status \"21391aa88db6bb2340e8d3df0a84305846d1e983f598cbf45c5a1860fa79f710\": rpc error: code = NotFound desc = could not find container \"21391aa88db6bb2340e8d3df0a84305846d1e983f598cbf45c5a1860fa79f710\": container with ID starting with 21391aa88db6bb2340e8d3df0a84305846d1e983f598cbf45c5a1860fa79f710 not found: ID does not exist" Oct 02 13:19:03 crc kubenswrapper[4766]: I1002 13:19:03.893098 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dbdf152-6a29-4c12-9986-25d3112a9618" path="/var/lib/kubelet/pods/7dbdf152-6a29-4c12-9986-25d3112a9618/volumes" Oct 02 13:19:24 crc kubenswrapper[4766]: I1002 13:19:24.431754 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:19:24 crc kubenswrapper[4766]: I1002 13:19:24.432159 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:19:43 crc kubenswrapper[4766]: I1002 13:19:43.850259 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_dc85d4fb-980e-4303-8850-ec3da21b43b2/init-config-reloader/0.log" Oct 02 13:19:44 crc kubenswrapper[4766]: I1002 13:19:44.023647 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_dc85d4fb-980e-4303-8850-ec3da21b43b2/init-config-reloader/0.log" Oct 02 13:19:44 crc kubenswrapper[4766]: I1002 13:19:44.100117 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_dc85d4fb-980e-4303-8850-ec3da21b43b2/alertmanager/0.log" Oct 02 13:19:44 crc kubenswrapper[4766]: I1002 13:19:44.201096 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_dc85d4fb-980e-4303-8850-ec3da21b43b2/config-reloader/0.log" Oct 02 13:19:44 crc kubenswrapper[4766]: I1002 13:19:44.395915 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_823c5010-35e5-4ab1-8b6b-d8c41b014442/aodh-api/0.log" Oct 02 13:19:44 crc kubenswrapper[4766]: I1002 13:19:44.564994 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_823c5010-35e5-4ab1-8b6b-d8c41b014442/aodh-evaluator/0.log" Oct 02 13:19:44 crc kubenswrapper[4766]: I1002 13:19:44.605388 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_823c5010-35e5-4ab1-8b6b-d8c41b014442/aodh-listener/0.log" Oct 02 13:19:44 crc kubenswrapper[4766]: I1002 13:19:44.769698 4766 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_aodh-0_823c5010-35e5-4ab1-8b6b-d8c41b014442/aodh-notifier/0.log" Oct 02 13:19:44 crc kubenswrapper[4766]: I1002 13:19:44.972286 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5c58d8d958-sws7w_fcb2376e-df6b-448f-8b2a-3a8bfc8e7638/barbican-api/0.log" Oct 02 13:19:45 crc kubenswrapper[4766]: I1002 13:19:45.100288 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5c58d8d958-sws7w_fcb2376e-df6b-448f-8b2a-3a8bfc8e7638/barbican-api-log/0.log" Oct 02 13:19:45 crc kubenswrapper[4766]: I1002 13:19:45.315126 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b6ffc6db8-wgm5d_e65be6ba-47fe-4928-b461-53031fd0e5eb/barbican-keystone-listener/0.log" Oct 02 13:19:45 crc kubenswrapper[4766]: I1002 13:19:45.412031 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b6ffc6db8-wgm5d_e65be6ba-47fe-4928-b461-53031fd0e5eb/barbican-keystone-listener-log/0.log" Oct 02 13:19:45 crc kubenswrapper[4766]: I1002 13:19:45.497556 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-f95f7bf9c-29dkx_385a22c3-88e6-49f7-8e51-147925a9baef/barbican-worker/0.log" Oct 02 13:19:45 crc kubenswrapper[4766]: I1002 13:19:45.639003 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-f95f7bf9c-29dkx_385a22c3-88e6-49f7-8e51-147925a9baef/barbican-worker-log/0.log" Oct 02 13:19:45 crc kubenswrapper[4766]: I1002 13:19:45.823774 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-6x2jq_829dc872-61b8-4549-a976-404ea823ea25/bootstrap-openstack-openstack-cell1/0.log" Oct 02 13:19:45 crc kubenswrapper[4766]: I1002 13:19:45.997284 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3e07a5a4-b364-4459-84cf-badcf5cccab9/ceilometer-central-agent/0.log" Oct 02 13:19:46 crc kubenswrapper[4766]: I1002 13:19:46.070076 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3e07a5a4-b364-4459-84cf-badcf5cccab9/ceilometer-notification-agent/0.log" Oct 02 13:19:46 crc kubenswrapper[4766]: I1002 13:19:46.108965 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3e07a5a4-b364-4459-84cf-badcf5cccab9/proxy-httpd/0.log" Oct 02 13:19:46 crc kubenswrapper[4766]: I1002 13:19:46.223760 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3e07a5a4-b364-4459-84cf-badcf5cccab9/sg-core/0.log" Oct 02 13:19:46 crc kubenswrapper[4766]: I1002 13:19:46.302998 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-d6hht_79198143-fe64-4121-bb10-e86f9425fc1d/ceph-client-openstack-openstack-cell1/0.log" Oct 02 13:19:46 crc kubenswrapper[4766]: I1002 13:19:46.530487 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_40e44d35-376b-45b5-a4e1-8efd82067224/cinder-api-log/0.log" Oct 02 13:19:46 crc kubenswrapper[4766]: I1002 13:19:46.546212 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_40e44d35-376b-45b5-a4e1-8efd82067224/cinder-api/0.log" Oct 02 13:19:46 crc kubenswrapper[4766]: I1002 13:19:46.976952 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_3d28bc87-18c8-41c8-9747-cd0c23d2c98e/probe/0.log" Oct 02 13:19:47 crc 
kubenswrapper[4766]: I1002 13:19:47.025964 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_3d28bc87-18c8-41c8-9747-cd0c23d2c98e/cinder-backup/0.log" Oct 02 13:19:47 crc kubenswrapper[4766]: I1002 13:19:47.216578 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fd39d44b-ae8b-45fe-b571-6825a7febb30/cinder-scheduler/0.log" Oct 02 13:19:47 crc kubenswrapper[4766]: I1002 13:19:47.270252 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fd39d44b-ae8b-45fe-b571-6825a7febb30/probe/0.log" Oct 02 13:19:47 crc kubenswrapper[4766]: I1002 13:19:47.507545 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_49e0dfda-90fe-4595-9d8a-f0ebf15566dd/cinder-volume/0.log" Oct 02 13:19:47 crc kubenswrapper[4766]: I1002 13:19:47.511058 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_49e0dfda-90fe-4595-9d8a-f0ebf15566dd/probe/0.log" Oct 02 13:19:47 crc kubenswrapper[4766]: I1002 13:19:47.681177 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-pzt5t_286eccce-3f88-4796-b896-cd03ccfc3eba/configure-network-openstack-openstack-cell1/0.log" Oct 02 13:19:47 crc kubenswrapper[4766]: I1002 13:19:47.837744 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-v5xkb_72b4bb1d-0502-443a-814d-26667e9885f8/configure-os-openstack-openstack-cell1/0.log" Oct 02 13:19:47 crc kubenswrapper[4766]: I1002 13:19:47.941674 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bf98f57-9h5jh_675d2c19-820c-4e5e-b461-b44b4afe9d41/init/0.log" Oct 02 13:19:48 crc kubenswrapper[4766]: I1002 13:19:48.101352 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bf98f57-9h5jh_675d2c19-820c-4e5e-b461-b44b4afe9d41/init/0.log" Oct 02 13:19:48 crc kubenswrapper[4766]: I1002 13:19:48.120424 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bf98f57-9h5jh_675d2c19-820c-4e5e-b461-b44b4afe9d41/dnsmasq-dns/0.log" Oct 02 13:19:48 crc kubenswrapper[4766]: I1002 13:19:48.312777 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-whm5p_c5b90806-4f3e-49ce-a40a-3a51ee20b419/download-cache-openstack-openstack-cell1/0.log" Oct 02 13:19:48 crc kubenswrapper[4766]: I1002 13:19:48.455317 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e248d4b1-ecec-4d44-96cb-25f552b28709/glance-httpd/0.log" Oct 02 13:19:48 crc kubenswrapper[4766]: I1002 13:19:48.522118 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e248d4b1-ecec-4d44-96cb-25f552b28709/glance-log/0.log" Oct 02 13:19:48 crc kubenswrapper[4766]: I1002 13:19:48.660387 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c8e00770-3051-4ec4-a44c-364d503cb96c/glance-httpd/0.log" Oct 02 13:19:48 crc kubenswrapper[4766]: I1002 13:19:48.705184 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c8e00770-3051-4ec4-a44c-364d503cb96c/glance-log/0.log" Oct 02 13:19:48 crc kubenswrapper[4766]: I1002 13:19:48.888633 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_heat-api-78997d45f6-fcx4l_8defab53-42c6-4cff-b024-5014eae2d6f8/heat-api/0.log" Oct 02 13:19:49 crc kubenswrapper[4766]: I1002 13:19:49.144453 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-58bc7d788f-cz9w5_1d049d96-d99d-4e97-84ed-0310c5d0b772/heat-cfnapi/0.log" Oct 02 13:19:49 crc kubenswrapper[4766]: I1002 13:19:49.240633 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-67cdcf9c8-cwdb9_d2e256f4-b176-44f6-9cb0-d8019ae9cc2c/heat-engine/0.log" Oct 02 13:19:49 crc kubenswrapper[4766]: I1002 13:19:49.443342 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5d8dc9db9c-bxckd_adfb66d8-7e20-477f-adce-87cacf4382d5/horizon/0.log" Oct 02 13:19:49 crc kubenswrapper[4766]: I1002 13:19:49.523202 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5d8dc9db9c-bxckd_adfb66d8-7e20-477f-adce-87cacf4382d5/horizon-log/0.log" Oct 02 13:19:49 crc kubenswrapper[4766]: I1002 13:19:49.659603 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-cq4mv_c4528d6f-0f1c-4624-abbd-03e6ee59ebba/install-certs-openstack-openstack-cell1/0.log" Oct 02 13:19:50 crc kubenswrapper[4766]: I1002 13:19:50.580859 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-56g6g_78818c83-db7c-4b55-bef0-a04d906450e7/install-os-openstack-openstack-cell1/0.log" Oct 02 13:19:51 crc kubenswrapper[4766]: I1002 13:19:51.077242 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5f56bd4789-wt4gd_d6827aee-b0ec-4d7a-a38c-31cb39c3679d/keystone-api/0.log" Oct 02 13:19:51 crc kubenswrapper[4766]: I1002 13:19:51.182165 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29323501-8bcqs_3f7942f7-8765-4c2f-8fbb-2f66d8170a56/keystone-cron/0.log" Oct 02 13:19:51 crc kubenswrapper[4766]: I1002 13:19:51.242701 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_9d25d74d-1b30-4bb2-8cc2-401004b37624/kube-state-metrics/0.log" Oct 02 13:19:51 crc kubenswrapper[4766]: I1002 13:19:51.458739 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-qhxsq_5bd9d737-6ebe-4a7c-b9f7-2d4ba486ba22/libvirt-openstack-openstack-cell1/0.log" Oct 02 13:19:51 crc kubenswrapper[4766]: I1002 13:19:51.636150 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_a0573958-ab76-414b-9eb8-b0ae73580f5a/manila-api-log/0.log" Oct 02 13:19:51 crc kubenswrapper[4766]: I1002 13:19:51.743862 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_a0573958-ab76-414b-9eb8-b0ae73580f5a/manila-api/0.log" Oct 02 13:19:51 crc kubenswrapper[4766]: I1002 13:19:51.902769 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_fb671f48-8819-4b2f-b52e-5bb8d5161e4c/manila-scheduler/0.log" Oct 02 13:19:51 crc kubenswrapper[4766]: I1002 13:19:51.944060 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_fb671f48-8819-4b2f-b52e-5bb8d5161e4c/probe/0.log" Oct 02 13:19:52 crc kubenswrapper[4766]: I1002 13:19:52.897898 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_48612c1a-5be8-48a6-bed8-26f26d78ef8e/probe/0.log" Oct 02 13:19:52 crc kubenswrapper[4766]: I1002 13:19:52.921784 4766 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_48612c1a-5be8-48a6-bed8-26f26d78ef8e/manila-share/0.log" Oct 02 13:19:53 crc kubenswrapper[4766]: I1002 13:19:53.085819 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-copy-data_c82724de-a001-4f24-83ca-aa7d76bb293f/adoption/0.log" Oct 02 13:19:53 crc kubenswrapper[4766]: I1002 13:19:53.494439 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-cddb5dc7-2k4p9_4e5e1566-6708-4f6e-857e-ca1d6fe153ec/neutron-api/0.log" Oct 02 13:19:53 crc kubenswrapper[4766]: I1002 13:19:53.699305 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-cddb5dc7-2k4p9_4e5e1566-6708-4f6e-857e-ca1d6fe153ec/neutron-httpd/0.log" Oct 02 13:19:53 crc kubenswrapper[4766]: I1002 13:19:53.911234 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-xvfww_4e487036-bfdb-4c21-9e4a-7abec8f180d7/neutron-metadata-openstack-openstack-cell1/0.log" Oct 02 13:19:54 crc kubenswrapper[4766]: I1002 13:19:54.240426 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_cec8d0e7-a5b3-4f81-8690-1060c5802f29/nova-api-api/0.log" Oct 02 13:19:54 crc kubenswrapper[4766]: I1002 13:19:54.401699 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_cec8d0e7-a5b3-4f81-8690-1060c5802f29/nova-api-log/0.log" Oct 02 13:19:54 crc kubenswrapper[4766]: I1002 13:19:54.432019 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:19:54 crc kubenswrapper[4766]: I1002 13:19:54.432069 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:19:54 crc kubenswrapper[4766]: I1002 13:19:54.647202 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_df90d518-ac68-4f3b-97be-d914dbab2a48/nova-cell0-conductor-conductor/0.log" Oct 02 13:19:54 crc kubenswrapper[4766]: I1002 13:19:54.956389 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_d388c6de-eb31-4738-a537-8679908b7240/memcached/0.log" Oct 02 13:19:54 crc kubenswrapper[4766]: I1002 13:19:54.972455 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_14d108de-2d45-473e-bdcc-cc37740131d0/nova-cell1-conductor-conductor/0.log" Oct 02 13:19:55 crc kubenswrapper[4766]: I1002 13:19:55.128269 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_972be125-9e9e-4bc0-b89b-70813ccd3f53/nova-cell1-novncproxy-novncproxy/0.log" Oct 02 13:19:55 crc kubenswrapper[4766]: I1002 13:19:55.229284 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-4npcm_f9175f1f-c2e8-4454-86c2-5e4c795834b1/nova-cell1-openstack-openstack-cell1/0.log" Oct 02 13:19:55 crc kubenswrapper[4766]: I1002 13:19:55.441540 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-7k7hx_3584b308-cda0-4e37-a0ef-63fef09a9be8/nova-cell1-openstack-openstack-cell1/0.log" Oct 02 13:19:55 crc kubenswrapper[4766]: I1002 13:19:55.734120 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-nbn66_48c97d74-b920-4e52-b90d-44faa051eba6/nova-cell1-openstack-openstack-cell1/0.log" Oct 02 13:19:55 crc kubenswrapper[4766]: I1002 13:19:55.837468 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-p4nqg_87b4132e-2db0-40be-9e2d-7c7c8261f7bc/nova-cell1-openstack-openstack-cell1/0.log" Oct 02 13:19:56 crc kubenswrapper[4766]: I1002 13:19:56.063186 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_58717615-2658-46d5-9945-b726dc965af3/nova-metadata-log/0.log" Oct 02 13:19:56 crc kubenswrapper[4766]: I1002 13:19:56.212660 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_58717615-2658-46d5-9945-b726dc965af3/nova-metadata-metadata/0.log" Oct 02 13:19:56 crc kubenswrapper[4766]: I1002 13:19:56.405823 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_7bf3ca71-58f7-4730-8ab4-24ff2dbf95c4/nova-scheduler-scheduler/0.log" Oct 02 13:19:56 crc kubenswrapper[4766]: I1002 13:19:56.544768 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-846f489cc6-ggdkp_cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3/init/0.log" Oct 02 13:19:56 crc kubenswrapper[4766]: I1002 13:19:56.725869 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-846f489cc6-ggdkp_cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3/init/0.log" Oct 02 13:19:56 crc kubenswrapper[4766]: I1002 13:19:56.736658 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-846f489cc6-ggdkp_cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3/octavia-api-provider-agent/0.log" Oct 02 13:19:56 crc kubenswrapper[4766]: I1002 13:19:56.886575 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-846f489cc6-ggdkp_cfb5d33e-3fbd-4bef-9d65-1e0ca95f61f3/octavia-api/0.log" Oct 02 13:19:56 crc kubenswrapper[4766]: I1002 13:19:56.946358 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-cqtgm_888c9f82-8929-4dda-b89a-cbd917f2026d/init/0.log" Oct 02 13:19:57 crc kubenswrapper[4766]: I1002 13:19:57.158406 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-cqtgm_888c9f82-8929-4dda-b89a-cbd917f2026d/init/0.log" Oct 02 13:19:57 crc kubenswrapper[4766]: I1002 13:19:57.203379 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-cqtgm_888c9f82-8929-4dda-b89a-cbd917f2026d/octavia-healthmanager/0.log" Oct 02 13:19:57 crc kubenswrapper[4766]: I1002 13:19:57.342606 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-t6x4h_0c16aa8e-cc08-497e-8ad3-db18bbc82afd/init/0.log" Oct 02 13:19:57 crc kubenswrapper[4766]: I1002 13:19:57.641449 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-t6x4h_0c16aa8e-cc08-497e-8ad3-db18bbc82afd/init/0.log" Oct 02 13:19:57 crc kubenswrapper[4766]: I1002 13:19:57.679962 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-t6x4h_0c16aa8e-cc08-497e-8ad3-db18bbc82afd/octavia-housekeeping/0.log" Oct 02 13:19:58 
crc kubenswrapper[4766]: I1002 13:19:58.079887 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-2l26f_16a82893-bce9-4426-ba47-7da418e9ba66/init/0.log" Oct 02 13:19:58 crc kubenswrapper[4766]: I1002 13:19:58.386953 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-2l26f_16a82893-bce9-4426-ba47-7da418e9ba66/octavia-amphora-httpd/0.log" Oct 02 13:19:58 crc kubenswrapper[4766]: I1002 13:19:58.413373 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-59f8cff499-2l26f_16a82893-bce9-4426-ba47-7da418e9ba66/init/0.log" Oct 02 13:19:58 crc kubenswrapper[4766]: I1002 13:19:58.587002 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-r2wtj_e436f55e-dc19-4d4b-be98-d024fb589618/init/0.log" Oct 02 13:19:58 crc kubenswrapper[4766]: I1002 13:19:58.806673 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-r2wtj_e436f55e-dc19-4d4b-be98-d024fb589618/init/0.log" Oct 02 13:19:58 crc kubenswrapper[4766]: I1002 13:19:58.848756 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-r2wtj_e436f55e-dc19-4d4b-be98-d024fb589618/octavia-rsyslog/0.log" Oct 02 13:19:59 crc kubenswrapper[4766]: I1002 13:19:59.019222 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-7j2sx_958c4797-4e76-4c64-9e99-8059508526c6/init/0.log" Oct 02 13:19:59 crc kubenswrapper[4766]: I1002 13:19:59.192896 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-7j2sx_958c4797-4e76-4c64-9e99-8059508526c6/init/0.log" Oct 02 13:19:59 crc kubenswrapper[4766]: I1002 13:19:59.330943 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-7j2sx_958c4797-4e76-4c64-9e99-8059508526c6/octavia-worker/0.log" Oct 02 13:19:59 crc kubenswrapper[4766]: I1002 13:19:59.434715 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c7a337c5-3d90-4978-b2e2-1bd756a4a967/mysql-bootstrap/0.log" Oct 02 13:19:59 crc kubenswrapper[4766]: I1002 13:19:59.646916 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c7a337c5-3d90-4978-b2e2-1bd756a4a967/mysql-bootstrap/0.log" Oct 02 13:19:59 crc kubenswrapper[4766]: I1002 13:19:59.658308 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c7a337c5-3d90-4978-b2e2-1bd756a4a967/galera/0.log" Oct 02 13:19:59 crc kubenswrapper[4766]: I1002 13:19:59.861896 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_48638127-8158-456c-ae7e-77d9ba95fd0b/mysql-bootstrap/0.log" Oct 02 13:20:00 crc kubenswrapper[4766]: I1002 13:20:00.067522 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_48638127-8158-456c-ae7e-77d9ba95fd0b/mysql-bootstrap/0.log" Oct 02 13:20:00 crc kubenswrapper[4766]: I1002 13:20:00.096113 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_48638127-8158-456c-ae7e-77d9ba95fd0b/galera/0.log" Oct 02 13:20:00 crc kubenswrapper[4766]: I1002 13:20:00.289778 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_fa24afd2-9499-490a-bc1a-8261b74d0dae/openstackclient/0.log" Oct 02 13:20:00 crc kubenswrapper[4766]: I1002 13:20:00.371051 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-z5dll_25979d18-2dce-4710-9661-e1272a2935ea/openstack-network-exporter/0.log" Oct 02 13:20:00 crc kubenswrapper[4766]: I1002 13:20:00.577671 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8px54_b8e19d03-175e-4bdf-8675-cab23ab974ea/ovsdb-server-init/0.log" Oct 02 13:20:00 crc kubenswrapper[4766]: I1002 13:20:00.815978 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8px54_b8e19d03-175e-4bdf-8675-cab23ab974ea/ovs-vswitchd/0.log" Oct 02 13:20:00 crc kubenswrapper[4766]: I1002 13:20:00.865510 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8px54_b8e19d03-175e-4bdf-8675-cab23ab974ea/ovsdb-server/0.log" Oct 02 13:20:00 crc kubenswrapper[4766]: I1002 13:20:00.883011 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-8px54_b8e19d03-175e-4bdf-8675-cab23ab974ea/ovsdb-server-init/0.log" Oct 02 13:20:01 crc kubenswrapper[4766]: I1002 13:20:01.074919 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-q6w2f_059b4130-8ca1-4df7-87ca-762fbcf3048e/ovn-controller/0.log" Oct 02 13:20:01 crc kubenswrapper[4766]: I1002 13:20:01.282490 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-copy-data_59964a1b-9dfe-49fc-b2e7-6d6f30959b26/adoption/0.log" Oct 02 13:20:01 crc kubenswrapper[4766]: I1002 13:20:01.324181 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0d182bc5-db60-4980-8df6-469f2efb5188/openstack-network-exporter/0.log" Oct 02 13:20:01 crc kubenswrapper[4766]: I1002 13:20:01.498949 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_0d182bc5-db60-4980-8df6-469f2efb5188/ovn-northd/0.log" Oct 02 13:20:02 crc kubenswrapper[4766]: I1002 13:20:02.526048 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_653c6c64-ca0a-46f2-8548-2e9b94dd9f34/openstack-network-exporter/0.log" Oct 02 13:20:02 crc kubenswrapper[4766]: I1002 13:20:02.540795 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-2wfs4_a5b18380-ebf6-4969-b53a-463b4734baa9/ovn-openstack-openstack-cell1/0.log" Oct 02 13:20:02 crc kubenswrapper[4766]: I1002 13:20:02.704356 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_d5dc71b5-2203-47e7-9006-85d5c360d2a7/openstack-network-exporter/0.log" Oct 02 13:20:02 crc kubenswrapper[4766]: I1002 13:20:02.735396 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_653c6c64-ca0a-46f2-8548-2e9b94dd9f34/ovsdbserver-nb/0.log" Oct 02 13:20:02 crc kubenswrapper[4766]: I1002 13:20:02.952252 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_d5dc71b5-2203-47e7-9006-85d5c360d2a7/ovsdbserver-nb/0.log" Oct 02 13:20:02 crc kubenswrapper[4766]: I1002 13:20:02.977549 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_24e149ea-94ee-4a26-9e9a-900be46fb609/openstack-network-exporter/0.log" Oct 02 13:20:03 crc kubenswrapper[4766]: I1002 13:20:03.126170 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_24e149ea-94ee-4a26-9e9a-900be46fb609/ovsdbserver-nb/0.log" Oct 02 13:20:03 crc kubenswrapper[4766]: I1002 13:20:03.210247 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a/openstack-network-exporter/0.log" Oct 02 13:20:03 crc kubenswrapper[4766]: I1002 13:20:03.613308 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_02d613a1-8be4-43a0-a8b0-fa8c25fa9d4a/ovsdbserver-sb/0.log" Oct 02 13:20:03 crc kubenswrapper[4766]: I1002 13:20:03.627981 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_6153a2cd-5c95-43ec-8238-f2a2e63598cb/openstack-network-exporter/0.log" Oct 02 13:20:04 crc kubenswrapper[4766]: I1002 13:20:04.400340 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_6153a2cd-5c95-43ec-8238-f2a2e63598cb/ovsdbserver-sb/0.log" Oct 02 13:20:04 crc kubenswrapper[4766]: I1002 13:20:04.450716 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_5be0a42a-c47f-4b40-ae00-72f013eaf3cb/openstack-network-exporter/0.log" Oct 02 13:20:04 crc kubenswrapper[4766]: I1002 13:20:04.586545 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_5be0a42a-c47f-4b40-ae00-72f013eaf3cb/ovsdbserver-sb/0.log" Oct 02 13:20:04 crc kubenswrapper[4766]: I1002 13:20:04.704740 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-f4c5978dd-trc79_7d2ad094-910d-40a1-b172-b1ad77166e18/placement-api/0.log" Oct 02 13:20:04 crc kubenswrapper[4766]: I1002 13:20:04.819256 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-f4c5978dd-trc79_7d2ad094-910d-40a1-b172-b1ad77166e18/placement-log/0.log" Oct 02 13:20:04 crc kubenswrapper[4766]: I1002 13:20:04.901576 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-c8p8cm_2d25c6d1-9077-4c8d-b356-44d4a0abb5fa/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Oct 02 13:20:05 crc kubenswrapper[4766]: I1002 13:20:05.134593 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e/init-config-reloader/0.log" Oct 02 13:20:05 crc kubenswrapper[4766]: I1002 13:20:05.330851 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e/config-reloader/0.log" Oct 02 13:20:05 crc kubenswrapper[4766]: I1002 13:20:05.331353 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e/prometheus/0.log" Oct 02 13:20:05 crc kubenswrapper[4766]: I1002 13:20:05.384556 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e/init-config-reloader/0.log" Oct 02 13:20:05 crc kubenswrapper[4766]: I1002 13:20:05.561382 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_6faef06b-4d1a-4d60-b4ee-f97ee57d2c0e/thanos-sidecar/0.log" Oct 02 13:20:05 crc kubenswrapper[4766]: I1002 13:20:05.733169 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f39320fe-abbc-4c64-8b86-1b32a7924017/setup-container/0.log" Oct 02 13:20:05 crc kubenswrapper[4766]: I1002 13:20:05.985721 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f39320fe-abbc-4c64-8b86-1b32a7924017/setup-container/0.log" Oct 02 13:20:06 crc 
kubenswrapper[4766]: I1002 13:20:06.068635 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f39320fe-abbc-4c64-8b86-1b32a7924017/rabbitmq/0.log" Oct 02 13:20:06 crc kubenswrapper[4766]: I1002 13:20:06.213056 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_35a7d34a-27b2-496f-aa63-b04439becb52/setup-container/0.log" Oct 02 13:20:06 crc kubenswrapper[4766]: I1002 13:20:06.394089 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_35a7d34a-27b2-496f-aa63-b04439becb52/setup-container/0.log" Oct 02 13:20:06 crc kubenswrapper[4766]: I1002 13:20:06.675319 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-4zfvg_96d36e75-8d95-4fb5-9601-bbc75eb150d4/reboot-os-openstack-openstack-cell1/0.log" Oct 02 13:20:06 crc kubenswrapper[4766]: I1002 13:20:06.892779 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_35a7d34a-27b2-496f-aa63-b04439becb52/rabbitmq/0.log" Oct 02 13:20:06 crc kubenswrapper[4766]: I1002 13:20:06.913128 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-wxt5h_598a570b-1120-474e-a6a1-e46a82ff8272/run-os-openstack-openstack-cell1/0.log" Oct 02 13:20:07 crc kubenswrapper[4766]: I1002 13:20:07.074322 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-5hwdj_8fbae716-c2f8-43f5-9129-632f37db1f4e/ssh-known-hosts-openstack/0.log" Oct 02 13:20:07 crc kubenswrapper[4766]: I1002 13:20:07.334111 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-mlx58_88d78077-1bd0-416c-979a-b52075152089/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Oct 02 13:20:07 crc kubenswrapper[4766]: I1002 13:20:07.370096 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-46x8b_3344e415-6ae1-4d8e-b27e-73aeb7bba387/validate-network-openstack-openstack-cell1/0.log" Oct 02 13:20:24 crc kubenswrapper[4766]: I1002 13:20:24.431681 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:20:24 crc kubenswrapper[4766]: I1002 13:20:24.432265 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:20:24 crc kubenswrapper[4766]: I1002 13:20:24.432313 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 13:20:24 crc kubenswrapper[4766]: I1002 13:20:24.433167 4766 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ebffd89870cf914fa65f0c122b373b894237020d282dbb532b184737bc6c17ba"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:20:24 crc 
kubenswrapper[4766]: I1002 13:20:24.433225 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://ebffd89870cf914fa65f0c122b373b894237020d282dbb532b184737bc6c17ba" gracePeriod=600 Oct 02 13:20:24 crc kubenswrapper[4766]: I1002 13:20:24.575630 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="ebffd89870cf914fa65f0c122b373b894237020d282dbb532b184737bc6c17ba" exitCode=0 Oct 02 13:20:24 crc kubenswrapper[4766]: I1002 13:20:24.575697 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"ebffd89870cf914fa65f0c122b373b894237020d282dbb532b184737bc6c17ba"} Oct 02 13:20:24 crc kubenswrapper[4766]: I1002 13:20:24.575744 4766 scope.go:117] "RemoveContainer" containerID="13b9a7d3abea387266acab011094c407bc354be3c33a31fb3d625f22b13c1be8" Oct 02 13:20:25 crc kubenswrapper[4766]: I1002 13:20:25.591037 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerStarted","Data":"ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6"} Oct 02 13:20:41 crc kubenswrapper[4766]: I1002 13:20:41.775300 4766 generic.go:334] "Generic (PLEG): container finished" podID="eba16009-2412-4fac-95fe-6a5464c43608" containerID="cc8362feff2618262831769a37ba173901c1f43ac901092733ed8d99b6848493" exitCode=0 Oct 02 13:20:41 crc kubenswrapper[4766]: I1002 13:20:41.775603 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gs9r/crc-debug-98t8k" event={"ID":"eba16009-2412-4fac-95fe-6a5464c43608","Type":"ContainerDied","Data":"cc8362feff2618262831769a37ba173901c1f43ac901092733ed8d99b6848493"} Oct 02 13:20:41 crc kubenswrapper[4766]: I1002 13:20:41.931390 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-67gx7"] Oct 02 13:20:41 crc kubenswrapper[4766]: E1002 13:20:41.932226 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dbdf152-6a29-4c12-9986-25d3112a9618" containerName="registry-server" Oct 02 13:20:41 crc kubenswrapper[4766]: I1002 13:20:41.932246 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dbdf152-6a29-4c12-9986-25d3112a9618" containerName="registry-server" Oct 02 13:20:41 crc kubenswrapper[4766]: E1002 13:20:41.932278 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dbdf152-6a29-4c12-9986-25d3112a9618" containerName="extract-utilities" Oct 02 13:20:41 crc kubenswrapper[4766]: I1002 13:20:41.932287 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dbdf152-6a29-4c12-9986-25d3112a9618" containerName="extract-utilities" Oct 02 13:20:41 crc kubenswrapper[4766]: E1002 13:20:41.932332 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dbdf152-6a29-4c12-9986-25d3112a9618" containerName="extract-content" Oct 02 13:20:41 crc kubenswrapper[4766]: I1002 13:20:41.932341 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dbdf152-6a29-4c12-9986-25d3112a9618" containerName="extract-content" Oct 02 13:20:41 crc kubenswrapper[4766]: I1002 13:20:41.932612 4766 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7dbdf152-6a29-4c12-9986-25d3112a9618" containerName="registry-server" Oct 02 13:20:41 crc kubenswrapper[4766]: I1002 13:20:41.934802 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:41 crc kubenswrapper[4766]: I1002 13:20:41.948802 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67gx7"] Oct 02 13:20:41 crc kubenswrapper[4766]: I1002 13:20:41.965225 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/965a5af0-f685-4338-aeb7-98ae47151c0c-catalog-content\") pod \"redhat-marketplace-67gx7\" (UID: \"965a5af0-f685-4338-aeb7-98ae47151c0c\") " pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:41 crc kubenswrapper[4766]: I1002 13:20:41.965391 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkt4b\" (UniqueName: \"kubernetes.io/projected/965a5af0-f685-4338-aeb7-98ae47151c0c-kube-api-access-dkt4b\") pod \"redhat-marketplace-67gx7\" (UID: \"965a5af0-f685-4338-aeb7-98ae47151c0c\") " pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:41 crc kubenswrapper[4766]: I1002 13:20:41.965444 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/965a5af0-f685-4338-aeb7-98ae47151c0c-utilities\") pod \"redhat-marketplace-67gx7\" (UID: \"965a5af0-f685-4338-aeb7-98ae47151c0c\") " pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:42 crc kubenswrapper[4766]: I1002 13:20:42.069372 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkt4b\" (UniqueName: \"kubernetes.io/projected/965a5af0-f685-4338-aeb7-98ae47151c0c-kube-api-access-dkt4b\") pod \"redhat-marketplace-67gx7\" (UID: \"965a5af0-f685-4338-aeb7-98ae47151c0c\") " pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:42 crc kubenswrapper[4766]: I1002 13:20:42.069459 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/965a5af0-f685-4338-aeb7-98ae47151c0c-utilities\") pod \"redhat-marketplace-67gx7\" (UID: \"965a5af0-f685-4338-aeb7-98ae47151c0c\") " pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:42 crc kubenswrapper[4766]: I1002 13:20:42.069628 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/965a5af0-f685-4338-aeb7-98ae47151c0c-catalog-content\") pod \"redhat-marketplace-67gx7\" (UID: \"965a5af0-f685-4338-aeb7-98ae47151c0c\") " pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:42 crc kubenswrapper[4766]: I1002 13:20:42.070167 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/965a5af0-f685-4338-aeb7-98ae47151c0c-utilities\") pod \"redhat-marketplace-67gx7\" (UID: \"965a5af0-f685-4338-aeb7-98ae47151c0c\") " pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:42 crc kubenswrapper[4766]: I1002 13:20:42.070225 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/965a5af0-f685-4338-aeb7-98ae47151c0c-catalog-content\") pod \"redhat-marketplace-67gx7\" (UID: 
\"965a5af0-f685-4338-aeb7-98ae47151c0c\") " pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:42 crc kubenswrapper[4766]: I1002 13:20:42.088981 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkt4b\" (UniqueName: \"kubernetes.io/projected/965a5af0-f685-4338-aeb7-98ae47151c0c-kube-api-access-dkt4b\") pod \"redhat-marketplace-67gx7\" (UID: \"965a5af0-f685-4338-aeb7-98ae47151c0c\") " pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:42 crc kubenswrapper[4766]: I1002 13:20:42.265144 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:42 crc kubenswrapper[4766]: I1002 13:20:42.877575 4766 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67gx7"] Oct 02 13:20:42 crc kubenswrapper[4766]: W1002 13:20:42.890404 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod965a5af0_f685_4338_aeb7_98ae47151c0c.slice/crio-133b674978d78ba8a01763e212fff13961ff5f00e42157a445d7e9a92e854add WatchSource:0}: Error finding container 133b674978d78ba8a01763e212fff13961ff5f00e42157a445d7e9a92e854add: Status 404 returned error can't find the container with id 133b674978d78ba8a01763e212fff13961ff5f00e42157a445d7e9a92e854add Oct 02 13:20:42 crc kubenswrapper[4766]: I1002 13:20:42.929121 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gs9r/crc-debug-98t8k" Oct 02 13:20:42 crc kubenswrapper[4766]: I1002 13:20:42.984577 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7gs9r/crc-debug-98t8k"] Oct 02 13:20:42 crc kubenswrapper[4766]: I1002 13:20:42.996579 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7gs9r/crc-debug-98t8k"] Oct 02 13:20:42 crc kubenswrapper[4766]: I1002 13:20:42.996662 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eba16009-2412-4fac-95fe-6a5464c43608-host\") pod \"eba16009-2412-4fac-95fe-6a5464c43608\" (UID: \"eba16009-2412-4fac-95fe-6a5464c43608\") " Oct 02 13:20:42 crc kubenswrapper[4766]: I1002 13:20:42.996770 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttssr\" (UniqueName: \"kubernetes.io/projected/eba16009-2412-4fac-95fe-6a5464c43608-kube-api-access-ttssr\") pod \"eba16009-2412-4fac-95fe-6a5464c43608\" (UID: \"eba16009-2412-4fac-95fe-6a5464c43608\") " Oct 02 13:20:42 crc kubenswrapper[4766]: I1002 13:20:42.997008 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eba16009-2412-4fac-95fe-6a5464c43608-host" (OuterVolumeSpecName: "host") pod "eba16009-2412-4fac-95fe-6a5464c43608" (UID: "eba16009-2412-4fac-95fe-6a5464c43608"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:20:42 crc kubenswrapper[4766]: I1002 13:20:42.997678 4766 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/eba16009-2412-4fac-95fe-6a5464c43608-host\") on node \"crc\" DevicePath \"\"" Oct 02 13:20:43 crc kubenswrapper[4766]: I1002 13:20:43.010750 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eba16009-2412-4fac-95fe-6a5464c43608-kube-api-access-ttssr" (OuterVolumeSpecName: "kube-api-access-ttssr") pod "eba16009-2412-4fac-95fe-6a5464c43608" (UID: "eba16009-2412-4fac-95fe-6a5464c43608"). InnerVolumeSpecName "kube-api-access-ttssr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:20:43 crc kubenswrapper[4766]: I1002 13:20:43.107121 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttssr\" (UniqueName: \"kubernetes.io/projected/eba16009-2412-4fac-95fe-6a5464c43608-kube-api-access-ttssr\") on node \"crc\" DevicePath \"\"" Oct 02 13:20:43 crc kubenswrapper[4766]: I1002 13:20:43.803944 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67gx7" event={"ID":"965a5af0-f685-4338-aeb7-98ae47151c0c","Type":"ContainerStarted","Data":"d18a96b8126acdc8b0dc4057cf0ac8edc4b645ad2e9979d6e72c83389c52a2cf"} Oct 02 13:20:43 crc kubenswrapper[4766]: I1002 13:20:43.804323 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67gx7" event={"ID":"965a5af0-f685-4338-aeb7-98ae47151c0c","Type":"ContainerStarted","Data":"133b674978d78ba8a01763e212fff13961ff5f00e42157a445d7e9a92e854add"} Oct 02 13:20:43 crc kubenswrapper[4766]: I1002 13:20:43.809523 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4996b5d45e08cb7fa30691ed0ab45e9285c711656f54a0dd1b1b5359581f319a" Oct 02 13:20:43 crc kubenswrapper[4766]: I1002 13:20:43.809636 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7gs9r/crc-debug-98t8k" Oct 02 13:20:43 crc kubenswrapper[4766]: I1002 13:20:43.893238 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eba16009-2412-4fac-95fe-6a5464c43608" path="/var/lib/kubelet/pods/eba16009-2412-4fac-95fe-6a5464c43608/volumes" Oct 02 13:20:44 crc kubenswrapper[4766]: I1002 13:20:44.822209 4766 generic.go:334] "Generic (PLEG): container finished" podID="965a5af0-f685-4338-aeb7-98ae47151c0c" containerID="d18a96b8126acdc8b0dc4057cf0ac8edc4b645ad2e9979d6e72c83389c52a2cf" exitCode=0 Oct 02 13:20:44 crc kubenswrapper[4766]: I1002 13:20:44.822252 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67gx7" event={"ID":"965a5af0-f685-4338-aeb7-98ae47151c0c","Type":"ContainerDied","Data":"d18a96b8126acdc8b0dc4057cf0ac8edc4b645ad2e9979d6e72c83389c52a2cf"} Oct 02 13:20:44 crc kubenswrapper[4766]: I1002 13:20:44.985118 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7gs9r/crc-debug-jwms6"] Oct 02 13:20:44 crc kubenswrapper[4766]: E1002 13:20:44.985563 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eba16009-2412-4fac-95fe-6a5464c43608" containerName="container-00" Oct 02 13:20:44 crc kubenswrapper[4766]: I1002 13:20:44.985574 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba16009-2412-4fac-95fe-6a5464c43608" containerName="container-00" Oct 02 13:20:44 crc kubenswrapper[4766]: I1002 13:20:44.985785 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="eba16009-2412-4fac-95fe-6a5464c43608" containerName="container-00" Oct 02 13:20:44 crc kubenswrapper[4766]: I1002 13:20:44.986528 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gs9r/crc-debug-jwms6" Oct 02 13:20:45 crc kubenswrapper[4766]: I1002 13:20:45.045599 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e-host\") pod \"crc-debug-jwms6\" (UID: \"64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e\") " pod="openshift-must-gather-7gs9r/crc-debug-jwms6" Oct 02 13:20:45 crc kubenswrapper[4766]: I1002 13:20:45.045663 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cprht\" (UniqueName: \"kubernetes.io/projected/64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e-kube-api-access-cprht\") pod \"crc-debug-jwms6\" (UID: \"64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e\") " pod="openshift-must-gather-7gs9r/crc-debug-jwms6" Oct 02 13:20:45 crc kubenswrapper[4766]: I1002 13:20:45.147539 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e-host\") pod \"crc-debug-jwms6\" (UID: \"64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e\") " pod="openshift-must-gather-7gs9r/crc-debug-jwms6" Oct 02 13:20:45 crc kubenswrapper[4766]: I1002 13:20:45.147624 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cprht\" (UniqueName: \"kubernetes.io/projected/64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e-kube-api-access-cprht\") pod \"crc-debug-jwms6\" (UID: \"64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e\") " pod="openshift-must-gather-7gs9r/crc-debug-jwms6" Oct 02 13:20:45 crc kubenswrapper[4766]: I1002 13:20:45.148248 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e-host\") pod \"crc-debug-jwms6\" (UID: \"64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e\") " pod="openshift-must-gather-7gs9r/crc-debug-jwms6" Oct 02 13:20:45 crc kubenswrapper[4766]: I1002 13:20:45.612085 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cprht\" (UniqueName: \"kubernetes.io/projected/64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e-kube-api-access-cprht\") pod \"crc-debug-jwms6\" (UID: \"64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e\") " pod="openshift-must-gather-7gs9r/crc-debug-jwms6" Oct 02 13:20:45 crc kubenswrapper[4766]: I1002 13:20:45.834550 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67gx7" event={"ID":"965a5af0-f685-4338-aeb7-98ae47151c0c","Type":"ContainerStarted","Data":"c0968acc214df1515d35840df212b10720e581fbc5123b3138af54e79056deb5"} Oct 02 13:20:45 crc kubenswrapper[4766]: I1002 13:20:45.903255 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gs9r/crc-debug-jwms6" Oct 02 13:20:45 crc kubenswrapper[4766]: W1002 13:20:45.936917 4766 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64d7e6d7_5a94_48e6_bc28_ae5c3f1db83e.slice/crio-ca56d564b12333c252a547720a88e754cc294dd660101fc7aaa37c086cadc273 WatchSource:0}: Error finding container ca56d564b12333c252a547720a88e754cc294dd660101fc7aaa37c086cadc273: Status 404 returned error can't find the container with id ca56d564b12333c252a547720a88e754cc294dd660101fc7aaa37c086cadc273 Oct 02 13:20:46 crc kubenswrapper[4766]: I1002 13:20:46.844774 4766 generic.go:334] "Generic (PLEG): container finished" podID="965a5af0-f685-4338-aeb7-98ae47151c0c" containerID="c0968acc214df1515d35840df212b10720e581fbc5123b3138af54e79056deb5" exitCode=0 Oct 02 13:20:46 crc kubenswrapper[4766]: I1002 13:20:46.844865 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67gx7" event={"ID":"965a5af0-f685-4338-aeb7-98ae47151c0c","Type":"ContainerDied","Data":"c0968acc214df1515d35840df212b10720e581fbc5123b3138af54e79056deb5"} Oct 02 13:20:46 crc kubenswrapper[4766]: I1002 13:20:46.848606 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gs9r/crc-debug-jwms6" event={"ID":"64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e","Type":"ContainerStarted","Data":"23257bc4d761e7f87acc3381a5be41f8676fba1e63044fffa54703d02f3f38ec"} Oct 02 13:20:46 crc kubenswrapper[4766]: I1002 13:20:46.848680 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gs9r/crc-debug-jwms6" event={"ID":"64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e","Type":"ContainerStarted","Data":"ca56d564b12333c252a547720a88e754cc294dd660101fc7aaa37c086cadc273"} Oct 02 13:20:46 crc kubenswrapper[4766]: I1002 13:20:46.892667 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7gs9r/crc-debug-jwms6" podStartSLOduration=2.892630918 podStartE2EDuration="2.892630918s" podCreationTimestamp="2025-10-02 13:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-02 13:20:46.88109081 +0000 UTC m=+8961.823961764" watchObservedRunningTime="2025-10-02 13:20:46.892630918 +0000 UTC m=+8961.835501882" Oct 02 13:20:47 crc kubenswrapper[4766]: I1002 13:20:47.859882 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-67gx7" event={"ID":"965a5af0-f685-4338-aeb7-98ae47151c0c","Type":"ContainerStarted","Data":"2b9bc3740beb2121725d47954cae9e5124f0078547d366667e56134d888ad94e"} Oct 02 13:20:47 crc kubenswrapper[4766]: I1002 13:20:47.864685 4766 generic.go:334] "Generic (PLEG): container finished" podID="64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e" containerID="23257bc4d761e7f87acc3381a5be41f8676fba1e63044fffa54703d02f3f38ec" exitCode=0 Oct 02 13:20:47 crc kubenswrapper[4766]: I1002 13:20:47.864734 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gs9r/crc-debug-jwms6" event={"ID":"64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e","Type":"ContainerDied","Data":"23257bc4d761e7f87acc3381a5be41f8676fba1e63044fffa54703d02f3f38ec"} Oct 02 13:20:47 crc kubenswrapper[4766]: I1002 13:20:47.918100 4766 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-67gx7" podStartSLOduration=4.49593754 podStartE2EDuration="6.918075354s" podCreationTimestamp="2025-10-02 13:20:41 +0000 UTC" firstStartedPulling="2025-10-02 13:20:44.825063225 +0000 UTC m=+8959.767934169" lastFinishedPulling="2025-10-02 13:20:47.247201039 +0000 UTC m=+8962.190071983" observedRunningTime="2025-10-02 13:20:47.887876296 +0000 UTC m=+8962.830747240" watchObservedRunningTime="2025-10-02 13:20:47.918075354 +0000 UTC m=+8962.860946298" Oct 02 13:20:49 crc kubenswrapper[4766]: I1002 13:20:49.033570 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gs9r/crc-debug-jwms6" Oct 02 13:20:49 crc kubenswrapper[4766]: I1002 13:20:49.133702 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cprht\" (UniqueName: \"kubernetes.io/projected/64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e-kube-api-access-cprht\") pod \"64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e\" (UID: \"64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e\") " Oct 02 13:20:49 crc kubenswrapper[4766]: I1002 13:20:49.133926 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e-host\") pod \"64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e\" (UID: \"64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e\") " Oct 02 13:20:49 crc kubenswrapper[4766]: I1002 13:20:49.134118 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e-host" (OuterVolumeSpecName: "host") pod "64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e" (UID: "64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:20:49 crc kubenswrapper[4766]: I1002 13:20:49.134560 4766 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e-host\") on node \"crc\" DevicePath \"\"" Oct 02 13:20:49 crc kubenswrapper[4766]: I1002 13:20:49.139141 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e-kube-api-access-cprht" (OuterVolumeSpecName: "kube-api-access-cprht") pod "64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e" (UID: "64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e"). InnerVolumeSpecName "kube-api-access-cprht". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:20:49 crc kubenswrapper[4766]: I1002 13:20:49.236467 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cprht\" (UniqueName: \"kubernetes.io/projected/64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e-kube-api-access-cprht\") on node \"crc\" DevicePath \"\"" Oct 02 13:20:49 crc kubenswrapper[4766]: I1002 13:20:49.882467 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gs9r/crc-debug-jwms6" Oct 02 13:20:49 crc kubenswrapper[4766]: I1002 13:20:49.901911 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gs9r/crc-debug-jwms6" event={"ID":"64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e","Type":"ContainerDied","Data":"ca56d564b12333c252a547720a88e754cc294dd660101fc7aaa37c086cadc273"} Oct 02 13:20:49 crc kubenswrapper[4766]: I1002 13:20:49.901950 4766 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca56d564b12333c252a547720a88e754cc294dd660101fc7aaa37c086cadc273" Oct 02 13:20:52 crc kubenswrapper[4766]: I1002 13:20:52.265330 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:52 crc kubenswrapper[4766]: I1002 13:20:52.265977 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:52 crc kubenswrapper[4766]: I1002 13:20:52.319239 4766 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:52 crc kubenswrapper[4766]: I1002 13:20:52.963001 4766 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:53 crc kubenswrapper[4766]: I1002 13:20:53.017572 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67gx7"] Oct 02 13:20:54 crc kubenswrapper[4766]: I1002 13:20:54.943286 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-67gx7" podUID="965a5af0-f685-4338-aeb7-98ae47151c0c" containerName="registry-server" containerID="cri-o://2b9bc3740beb2121725d47954cae9e5124f0078547d366667e56134d888ad94e" gracePeriod=2 Oct 02 13:20:55 crc kubenswrapper[4766]: E1002 13:20:55.152666 4766 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod965a5af0_f685_4338_aeb7_98ae47151c0c.slice/crio-2b9bc3740beb2121725d47954cae9e5124f0078547d366667e56134d888ad94e.scope\": RecentStats: unable to find data in memory cache]" Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.594296 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.756803 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/965a5af0-f685-4338-aeb7-98ae47151c0c-utilities\") pod \"965a5af0-f685-4338-aeb7-98ae47151c0c\" (UID: \"965a5af0-f685-4338-aeb7-98ae47151c0c\") " Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.757458 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/965a5af0-f685-4338-aeb7-98ae47151c0c-catalog-content\") pod \"965a5af0-f685-4338-aeb7-98ae47151c0c\" (UID: \"965a5af0-f685-4338-aeb7-98ae47151c0c\") " Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.757753 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkt4b\" (UniqueName: \"kubernetes.io/projected/965a5af0-f685-4338-aeb7-98ae47151c0c-kube-api-access-dkt4b\") pod \"965a5af0-f685-4338-aeb7-98ae47151c0c\" (UID: \"965a5af0-f685-4338-aeb7-98ae47151c0c\") " Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.757747 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/965a5af0-f685-4338-aeb7-98ae47151c0c-utilities" (OuterVolumeSpecName: "utilities") pod "965a5af0-f685-4338-aeb7-98ae47151c0c" (UID: "965a5af0-f685-4338-aeb7-98ae47151c0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.759338 4766 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/965a5af0-f685-4338-aeb7-98ae47151c0c-utilities\") on node \"crc\" DevicePath \"\"" Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.764910 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/965a5af0-f685-4338-aeb7-98ae47151c0c-kube-api-access-dkt4b" (OuterVolumeSpecName: "kube-api-access-dkt4b") pod "965a5af0-f685-4338-aeb7-98ae47151c0c" (UID: "965a5af0-f685-4338-aeb7-98ae47151c0c"). InnerVolumeSpecName "kube-api-access-dkt4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.775226 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/965a5af0-f685-4338-aeb7-98ae47151c0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "965a5af0-f685-4338-aeb7-98ae47151c0c" (UID: "965a5af0-f685-4338-aeb7-98ae47151c0c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.861250 4766 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/965a5af0-f685-4338-aeb7-98ae47151c0c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.861284 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkt4b\" (UniqueName: \"kubernetes.io/projected/965a5af0-f685-4338-aeb7-98ae47151c0c-kube-api-access-dkt4b\") on node \"crc\" DevicePath \"\"" Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.955114 4766 generic.go:334] "Generic (PLEG): container finished" podID="965a5af0-f685-4338-aeb7-98ae47151c0c" containerID="2b9bc3740beb2121725d47954cae9e5124f0078547d366667e56134d888ad94e" exitCode=0 Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.955212 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67gx7" event={"ID":"965a5af0-f685-4338-aeb7-98ae47151c0c","Type":"ContainerDied","Data":"2b9bc3740beb2121725d47954cae9e5124f0078547d366667e56134d888ad94e"} Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.955251 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67gx7" event={"ID":"965a5af0-f685-4338-aeb7-98ae47151c0c","Type":"ContainerDied","Data":"133b674978d78ba8a01763e212fff13961ff5f00e42157a445d7e9a92e854add"} Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.955274 4766 scope.go:117] "RemoveContainer" containerID="2b9bc3740beb2121725d47954cae9e5124f0078547d366667e56134d888ad94e" Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.955598 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67gx7" Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.990448 4766 scope.go:117] "RemoveContainer" containerID="c0968acc214df1515d35840df212b10720e581fbc5123b3138af54e79056deb5" Oct 02 13:20:55 crc kubenswrapper[4766]: I1002 13:20:55.996719 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67gx7"] Oct 02 13:20:56 crc kubenswrapper[4766]: I1002 13:20:56.012929 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-67gx7"] Oct 02 13:20:56 crc kubenswrapper[4766]: I1002 13:20:56.506649 4766 scope.go:117] "RemoveContainer" containerID="d18a96b8126acdc8b0dc4057cf0ac8edc4b645ad2e9979d6e72c83389c52a2cf" Oct 02 13:20:56 crc kubenswrapper[4766]: I1002 13:20:56.560366 4766 scope.go:117] "RemoveContainer" containerID="2b9bc3740beb2121725d47954cae9e5124f0078547d366667e56134d888ad94e" Oct 02 13:20:56 crc kubenswrapper[4766]: E1002 13:20:56.560946 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b9bc3740beb2121725d47954cae9e5124f0078547d366667e56134d888ad94e\": container with ID starting with 2b9bc3740beb2121725d47954cae9e5124f0078547d366667e56134d888ad94e not found: ID does not exist" containerID="2b9bc3740beb2121725d47954cae9e5124f0078547d366667e56134d888ad94e" Oct 02 13:20:56 crc kubenswrapper[4766]: I1002 13:20:56.561009 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b9bc3740beb2121725d47954cae9e5124f0078547d366667e56134d888ad94e"} err="failed to get container status \"2b9bc3740beb2121725d47954cae9e5124f0078547d366667e56134d888ad94e\": rpc error: code = NotFound desc = could not find container \"2b9bc3740beb2121725d47954cae9e5124f0078547d366667e56134d888ad94e\": container with ID starting with 2b9bc3740beb2121725d47954cae9e5124f0078547d366667e56134d888ad94e not found: ID does not exist" Oct 02 13:20:56 crc kubenswrapper[4766]: I1002 13:20:56.561049 4766 scope.go:117] "RemoveContainer" containerID="c0968acc214df1515d35840df212b10720e581fbc5123b3138af54e79056deb5" Oct 02 13:20:56 crc kubenswrapper[4766]: E1002 13:20:56.561856 4766 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0968acc214df1515d35840df212b10720e581fbc5123b3138af54e79056deb5\": container with ID starting with c0968acc214df1515d35840df212b10720e581fbc5123b3138af54e79056deb5 not found: ID does not exist" containerID="c0968acc214df1515d35840df212b10720e581fbc5123b3138af54e79056deb5" Oct 02 13:20:56 crc kubenswrapper[4766]: I1002 13:20:56.561955 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0968acc214df1515d35840df212b10720e581fbc5123b3138af54e79056deb5"} err="failed to get container status \"c0968acc214df1515d35840df212b10720e581fbc5123b3138af54e79056deb5\": rpc error: code = NotFound desc = could not find container \"c0968acc214df1515d35840df212b10720e581fbc5123b3138af54e79056deb5\": container with ID starting with c0968acc214df1515d35840df212b10720e581fbc5123b3138af54e79056deb5 not found: ID does not exist" Oct 02 13:20:56 crc kubenswrapper[4766]: I1002 13:20:56.562031 4766 scope.go:117] "RemoveContainer" containerID="d18a96b8126acdc8b0dc4057cf0ac8edc4b645ad2e9979d6e72c83389c52a2cf" Oct 02 13:20:56 crc kubenswrapper[4766]: E1002 13:20:56.562307 4766 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d18a96b8126acdc8b0dc4057cf0ac8edc4b645ad2e9979d6e72c83389c52a2cf\": container with ID starting with d18a96b8126acdc8b0dc4057cf0ac8edc4b645ad2e9979d6e72c83389c52a2cf not found: ID does not exist" containerID="d18a96b8126acdc8b0dc4057cf0ac8edc4b645ad2e9979d6e72c83389c52a2cf" Oct 02 13:20:56 crc kubenswrapper[4766]: I1002 13:20:56.562333 4766 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18a96b8126acdc8b0dc4057cf0ac8edc4b645ad2e9979d6e72c83389c52a2cf"} err="failed to get container status \"d18a96b8126acdc8b0dc4057cf0ac8edc4b645ad2e9979d6e72c83389c52a2cf\": rpc error: code = NotFound desc = could not find container \"d18a96b8126acdc8b0dc4057cf0ac8edc4b645ad2e9979d6e72c83389c52a2cf\": container with ID starting with d18a96b8126acdc8b0dc4057cf0ac8edc4b645ad2e9979d6e72c83389c52a2cf not found: ID does not exist" Oct 02 13:20:56 crc kubenswrapper[4766]: I1002 13:20:56.818130 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7gs9r/crc-debug-jwms6"] Oct 02 13:20:56 crc kubenswrapper[4766]: I1002 13:20:56.833870 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7gs9r/crc-debug-jwms6"] Oct 02 13:20:57 crc kubenswrapper[4766]: I1002 13:20:57.894063 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e" path="/var/lib/kubelet/pods/64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e/volumes" Oct 02 13:20:57 crc kubenswrapper[4766]: I1002 13:20:57.895164 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="965a5af0-f685-4338-aeb7-98ae47151c0c" path="/var/lib/kubelet/pods/965a5af0-f685-4338-aeb7-98ae47151c0c/volumes" Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.034652 4766 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7gs9r/crc-debug-89l24"] Oct 02 13:20:58 crc kubenswrapper[4766]: E1002 13:20:58.035154 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="965a5af0-f685-4338-aeb7-98ae47151c0c" containerName="extract-utilities" Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.035177 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="965a5af0-f685-4338-aeb7-98ae47151c0c" containerName="extract-utilities" Oct 02 13:20:58 crc kubenswrapper[4766]: E1002 13:20:58.035197 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="965a5af0-f685-4338-aeb7-98ae47151c0c" containerName="registry-server" Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.035243 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="965a5af0-f685-4338-aeb7-98ae47151c0c" containerName="registry-server" Oct 02 13:20:58 crc kubenswrapper[4766]: E1002 13:20:58.035260 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e" containerName="container-00" Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.035268 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e" containerName="container-00" Oct 02 13:20:58 crc kubenswrapper[4766]: E1002 13:20:58.035339 4766 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="965a5af0-f685-4338-aeb7-98ae47151c0c" containerName="extract-content" Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.035352 4766 state_mem.go:107] "Deleted CPUSet assignment" podUID="965a5af0-f685-4338-aeb7-98ae47151c0c" containerName="extract-content" Oct 02 13:20:58 crc 
kubenswrapper[4766]: I1002 13:20:58.035848 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="965a5af0-f685-4338-aeb7-98ae47151c0c" containerName="registry-server" Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.035881 4766 memory_manager.go:354] "RemoveStaleState removing state" podUID="64d7e6d7-5a94-48e6-bc28-ae5c3f1db83e" containerName="container-00" Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.036858 4766 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gs9r/crc-debug-89l24" Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.113938 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69572f5b-e84c-42dd-a837-aa5def2e5347-host\") pod \"crc-debug-89l24\" (UID: \"69572f5b-e84c-42dd-a837-aa5def2e5347\") " pod="openshift-must-gather-7gs9r/crc-debug-89l24" Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.114660 4766 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxdln\" (UniqueName: \"kubernetes.io/projected/69572f5b-e84c-42dd-a837-aa5def2e5347-kube-api-access-jxdln\") pod \"crc-debug-89l24\" (UID: \"69572f5b-e84c-42dd-a837-aa5def2e5347\") " pod="openshift-must-gather-7gs9r/crc-debug-89l24" Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.216457 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69572f5b-e84c-42dd-a837-aa5def2e5347-host\") pod \"crc-debug-89l24\" (UID: \"69572f5b-e84c-42dd-a837-aa5def2e5347\") " pod="openshift-must-gather-7gs9r/crc-debug-89l24" Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.216668 4766 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxdln\" (UniqueName: \"kubernetes.io/projected/69572f5b-e84c-42dd-a837-aa5def2e5347-kube-api-access-jxdln\") pod \"crc-debug-89l24\" (UID: \"69572f5b-e84c-42dd-a837-aa5def2e5347\") " pod="openshift-must-gather-7gs9r/crc-debug-89l24" Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.216662 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69572f5b-e84c-42dd-a837-aa5def2e5347-host\") pod \"crc-debug-89l24\" (UID: \"69572f5b-e84c-42dd-a837-aa5def2e5347\") " pod="openshift-must-gather-7gs9r/crc-debug-89l24" Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.239312 4766 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxdln\" (UniqueName: \"kubernetes.io/projected/69572f5b-e84c-42dd-a837-aa5def2e5347-kube-api-access-jxdln\") pod \"crc-debug-89l24\" (UID: \"69572f5b-e84c-42dd-a837-aa5def2e5347\") " pod="openshift-must-gather-7gs9r/crc-debug-89l24" Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.374823 4766 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7gs9r/crc-debug-89l24" Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.993655 4766 generic.go:334] "Generic (PLEG): container finished" podID="69572f5b-e84c-42dd-a837-aa5def2e5347" containerID="ea49998458c19fe78668f181f67b78060423efa6314040bedd4e651ae4fe9a9f" exitCode=0 Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.993712 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gs9r/crc-debug-89l24" event={"ID":"69572f5b-e84c-42dd-a837-aa5def2e5347","Type":"ContainerDied","Data":"ea49998458c19fe78668f181f67b78060423efa6314040bedd4e651ae4fe9a9f"} Oct 02 13:20:58 crc kubenswrapper[4766]: I1002 13:20:58.993977 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gs9r/crc-debug-89l24" event={"ID":"69572f5b-e84c-42dd-a837-aa5def2e5347","Type":"ContainerStarted","Data":"4e2034cfa832ad810aa6c9becfc64ce45398fd5b7e580263bf579e425eac84c1"} Oct 02 13:20:59 crc kubenswrapper[4766]: I1002 13:20:59.043976 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7gs9r/crc-debug-89l24"] Oct 02 13:20:59 crc kubenswrapper[4766]: I1002 13:20:59.055723 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7gs9r/crc-debug-89l24"] Oct 02 13:21:00 crc kubenswrapper[4766]: I1002 13:21:00.141351 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gs9r/crc-debug-89l24" Oct 02 13:21:00 crc kubenswrapper[4766]: I1002 13:21:00.264877 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxdln\" (UniqueName: \"kubernetes.io/projected/69572f5b-e84c-42dd-a837-aa5def2e5347-kube-api-access-jxdln\") pod \"69572f5b-e84c-42dd-a837-aa5def2e5347\" (UID: \"69572f5b-e84c-42dd-a837-aa5def2e5347\") " Oct 02 13:21:00 crc kubenswrapper[4766]: I1002 13:21:00.265015 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69572f5b-e84c-42dd-a837-aa5def2e5347-host\") pod \"69572f5b-e84c-42dd-a837-aa5def2e5347\" (UID: \"69572f5b-e84c-42dd-a837-aa5def2e5347\") " Oct 02 13:21:00 crc kubenswrapper[4766]: I1002 13:21:00.265521 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/69572f5b-e84c-42dd-a837-aa5def2e5347-host" (OuterVolumeSpecName: "host") pod "69572f5b-e84c-42dd-a837-aa5def2e5347" (UID: "69572f5b-e84c-42dd-a837-aa5def2e5347"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 02 13:21:00 crc kubenswrapper[4766]: I1002 13:21:00.277607 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69572f5b-e84c-42dd-a837-aa5def2e5347-kube-api-access-jxdln" (OuterVolumeSpecName: "kube-api-access-jxdln") pod "69572f5b-e84c-42dd-a837-aa5def2e5347" (UID: "69572f5b-e84c-42dd-a837-aa5def2e5347"). InnerVolumeSpecName "kube-api-access-jxdln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:21:00 crc kubenswrapper[4766]: I1002 13:21:00.367860 4766 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69572f5b-e84c-42dd-a837-aa5def2e5347-host\") on node \"crc\" DevicePath \"\"" Oct 02 13:21:00 crc kubenswrapper[4766]: I1002 13:21:00.367893 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxdln\" (UniqueName: \"kubernetes.io/projected/69572f5b-e84c-42dd-a837-aa5def2e5347-kube-api-access-jxdln\") on node \"crc\" DevicePath \"\"" Oct 02 13:21:01 crc kubenswrapper[4766]: I1002 13:21:01.017940 4766 scope.go:117] "RemoveContainer" containerID="ea49998458c19fe78668f181f67b78060423efa6314040bedd4e651ae4fe9a9f" Oct 02 13:21:01 crc kubenswrapper[4766]: I1002 13:21:01.018570 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gs9r/crc-debug-89l24" Oct 02 13:21:01 crc kubenswrapper[4766]: I1002 13:21:01.900024 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69572f5b-e84c-42dd-a837-aa5def2e5347" path="/var/lib/kubelet/pods/69572f5b-e84c-42dd-a837-aa5def2e5347/volumes" Oct 02 13:21:15 crc kubenswrapper[4766]: I1002 13:21:15.529191 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb_88981c56-312c-4225-b3c2-7fb698637653/util/0.log" Oct 02 13:21:15 crc kubenswrapper[4766]: I1002 13:21:15.714403 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb_88981c56-312c-4225-b3c2-7fb698637653/util/0.log" Oct 02 13:21:15 crc kubenswrapper[4766]: I1002 13:21:15.759054 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb_88981c56-312c-4225-b3c2-7fb698637653/pull/0.log" Oct 02 13:21:15 crc kubenswrapper[4766]: I1002 13:21:15.779569 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb_88981c56-312c-4225-b3c2-7fb698637653/pull/0.log" Oct 02 13:21:15 crc kubenswrapper[4766]: I1002 13:21:15.920692 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb_88981c56-312c-4225-b3c2-7fb698637653/util/0.log" Oct 02 13:21:15 crc kubenswrapper[4766]: I1002 13:21:15.982878 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb_88981c56-312c-4225-b3c2-7fb698637653/extract/0.log" Oct 02 13:21:16 crc kubenswrapper[4766]: I1002 13:21:16.000053 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_157dbb73256b1b6d15c92cb3b9832917051f27d0aa325f8cd46370e26dlqbqb_88981c56-312c-4225-b3c2-7fb698637653/pull/0.log" Oct 02 13:21:16 crc kubenswrapper[4766]: I1002 13:21:16.081216 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-n84bq_7ef42077-e956-405d-8e5e-ee28586502dd/kube-rbac-proxy/0.log" Oct 02 13:21:16 crc kubenswrapper[4766]: I1002 13:21:16.231423 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-7wbpc_eaa7722d-af7b-44aa-992b-9304ab1a56c3/kube-rbac-proxy/0.log" 
Oct 02 13:21:16 crc kubenswrapper[4766]: I1002 13:21:16.285759 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-n84bq_7ef42077-e956-405d-8e5e-ee28586502dd/manager/0.log" Oct 02 13:21:16 crc kubenswrapper[4766]: I1002 13:21:16.347446 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-7wbpc_eaa7722d-af7b-44aa-992b-9304ab1a56c3/manager/0.log" Oct 02 13:21:16 crc kubenswrapper[4766]: I1002 13:21:16.442899 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-8jlf2_fd0148cc-8cbc-4204-9c03-b6d446ec4b13/kube-rbac-proxy/0.log" Oct 02 13:21:16 crc kubenswrapper[4766]: I1002 13:21:16.515611 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-8jlf2_fd0148cc-8cbc-4204-9c03-b6d446ec4b13/manager/0.log" Oct 02 13:21:16 crc kubenswrapper[4766]: I1002 13:21:16.611978 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-4jnlb_712078f7-0205-4259-843b-10ca0a292fcb/kube-rbac-proxy/0.log" Oct 02 13:21:16 crc kubenswrapper[4766]: I1002 13:21:16.723271 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-4jnlb_712078f7-0205-4259-843b-10ca0a292fcb/manager/0.log" Oct 02 13:21:16 crc kubenswrapper[4766]: I1002 13:21:16.998769 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-jc4jh_9df7b61f-82c3-4c2f-af77-b152b69666d7/kube-rbac-proxy/0.log" Oct 02 13:21:17 crc kubenswrapper[4766]: I1002 13:21:17.081513 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-jc4jh_9df7b61f-82c3-4c2f-af77-b152b69666d7/manager/0.log" Oct 02 13:21:17 crc kubenswrapper[4766]: I1002 13:21:17.233299 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-xjhnl_53ef1cad-2b60-4a0d-896c-958c59652c91/kube-rbac-proxy/0.log" Oct 02 13:21:17 crc kubenswrapper[4766]: I1002 13:21:17.293004 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-xjhnl_53ef1cad-2b60-4a0d-896c-958c59652c91/manager/0.log" Oct 02 13:21:17 crc kubenswrapper[4766]: I1002 13:21:17.380223 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-x4rnr_15ef5082-eda7-4994-8631-8f896fd8a456/kube-rbac-proxy/0.log" Oct 02 13:21:17 crc kubenswrapper[4766]: I1002 13:21:17.585402 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-2sg8s_1e727ede-3058-4edc-8631-a3c12bfa0b32/kube-rbac-proxy/0.log" Oct 02 13:21:17 crc kubenswrapper[4766]: I1002 13:21:17.670621 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-2sg8s_1e727ede-3058-4edc-8631-a3c12bfa0b32/manager/0.log" Oct 02 13:21:17 crc kubenswrapper[4766]: I1002 13:21:17.699984 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-x4rnr_15ef5082-eda7-4994-8631-8f896fd8a456/manager/0.log" Oct 02 13:21:17 crc kubenswrapper[4766]: I1002 13:21:17.824466 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-85bcm_25d1f804-fe78-4cc5-85a4-584ba18bf566/kube-rbac-proxy/0.log" Oct 02 13:21:17 crc kubenswrapper[4766]: I1002 13:21:17.955403 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-85bcm_25d1f804-fe78-4cc5-85a4-584ba18bf566/manager/0.log" Oct 02 13:21:18 crc kubenswrapper[4766]: I1002 13:21:18.040685 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-g4js2_0b227a91-0adf-4131-bb61-e11c995527ca/kube-rbac-proxy/0.log" Oct 02 13:21:18 crc kubenswrapper[4766]: I1002 13:21:18.137054 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-g4js2_0b227a91-0adf-4131-bb61-e11c995527ca/manager/0.log" Oct 02 13:21:18 crc kubenswrapper[4766]: I1002 13:21:18.156697 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-wtdmk_2ba5311d-1e3c-4bf2-890e-836a7dda4335/kube-rbac-proxy/0.log" Oct 02 13:21:18 crc kubenswrapper[4766]: I1002 13:21:18.842262 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-b28pm_2dbff594-01b2-495a-af08-2c23c0d986de/kube-rbac-proxy/0.log" Oct 02 13:21:18 crc kubenswrapper[4766]: I1002 13:21:18.874459 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-wtdmk_2ba5311d-1e3c-4bf2-890e-836a7dda4335/manager/0.log" Oct 02 13:21:18 crc kubenswrapper[4766]: I1002 13:21:18.908941 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-b28pm_2dbff594-01b2-495a-af08-2c23c0d986de/manager/0.log" Oct 02 13:21:19 crc kubenswrapper[4766]: I1002 13:21:19.068336 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-d546t_133e16f7-a3d0-4920-827f-8da5e5d81d98/kube-rbac-proxy/0.log" Oct 02 13:21:19 crc kubenswrapper[4766]: I1002 13:21:19.089708 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-khvnh_7c68c9c4-8848-48df-a28a-830a547f469a/kube-rbac-proxy/0.log" Oct 02 13:21:19 crc kubenswrapper[4766]: I1002 13:21:19.165145 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-khvnh_7c68c9c4-8848-48df-a28a-830a547f469a/manager/0.log" Oct 02 13:21:19 crc kubenswrapper[4766]: I1002 13:21:19.244835 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-d546t_133e16f7-a3d0-4920-827f-8da5e5d81d98/manager/0.log" Oct 02 13:21:19 crc kubenswrapper[4766]: I1002 13:21:19.296569 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5869cb545-gbnjj_c2986de4-8b91-42e2-b0a4-4032b1ce7ae5/kube-rbac-proxy/0.log" Oct 02 13:21:19 crc kubenswrapper[4766]: I1002 13:21:19.321782 4766 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5869cb545-gbnjj_c2986de4-8b91-42e2-b0a4-4032b1ce7ae5/manager/0.log" Oct 02 13:21:19 crc kubenswrapper[4766]: I1002 13:21:19.478798 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5f7d749dc7-n4nqz_7d01c25f-6e83-4e83-8193-203e990ffd70/kube-rbac-proxy/0.log" Oct 02 13:21:19 crc kubenswrapper[4766]: I1002 13:21:19.521814 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-cc764bd77-pvh4g_d0a78ee4-d3b9-48b0-941a-cf1c73d8c3b1/kube-rbac-proxy/0.log" Oct 02 13:21:19 crc kubenswrapper[4766]: I1002 13:21:19.769998 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-llggj_1727e62e-9173-45f1-b7dc-f4721872708a/registry-server/0.log" Oct 02 13:21:19 crc kubenswrapper[4766]: I1002 13:21:19.780991 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-cc764bd77-pvh4g_d0a78ee4-d3b9-48b0-941a-cf1c73d8c3b1/operator/0.log" Oct 02 13:21:20 crc kubenswrapper[4766]: I1002 13:21:20.012741 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-km85r_2615c22d-ad24-47ec-bfb1-f0227eb91300/kube-rbac-proxy/0.log" Oct 02 13:21:20 crc kubenswrapper[4766]: I1002 13:21:20.114782 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-km85r_2615c22d-ad24-47ec-bfb1-f0227eb91300/manager/0.log" Oct 02 13:21:20 crc kubenswrapper[4766]: I1002 13:21:20.124105 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-k5ms6_43a5f26a-7b51-4514-afc3-15048f9acec9/kube-rbac-proxy/0.log" Oct 02 13:21:20 crc kubenswrapper[4766]: I1002 13:21:20.973900 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-lrnd8_9b6bf2a3-2784-4940-8a10-a42a0f876577/operator/0.log" Oct 02 13:21:21 crc kubenswrapper[4766]: I1002 13:21:21.031821 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-k5ms6_43a5f26a-7b51-4514-afc3-15048f9acec9/manager/0.log" Oct 02 13:21:21 crc kubenswrapper[4766]: I1002 13:21:21.162201 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-62s67_1ffa0b32-f1ea-4273-baf6-67b9217803b3/kube-rbac-proxy/0.log" Oct 02 13:21:21 crc kubenswrapper[4766]: I1002 13:21:21.276930 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-62s67_1ffa0b32-f1ea-4273-baf6-67b9217803b3/manager/0.log" Oct 02 13:21:21 crc kubenswrapper[4766]: I1002 13:21:21.298731 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-kchvm_cc498066-7f28-4345-8cd9-b3168f10fe32/kube-rbac-proxy/0.log" Oct 02 13:21:21 crc kubenswrapper[4766]: I1002 13:21:21.499919 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-t4r98_02c6432b-aae3-4392-9d39-edbbf8b5e48a/manager/0.log" Oct 02 13:21:21 crc kubenswrapper[4766]: I1002 13:21:21.506359 4766 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-t4r98_02c6432b-aae3-4392-9d39-edbbf8b5e48a/kube-rbac-proxy/0.log" Oct 02 13:21:21 crc kubenswrapper[4766]: I1002 13:21:21.668829 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-kchvm_cc498066-7f28-4345-8cd9-b3168f10fe32/manager/0.log" Oct 02 13:21:21 crc kubenswrapper[4766]: I1002 13:21:21.882899 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-mn5ks_a2658564-9624-44ee-b9ce-1579493d044f/kube-rbac-proxy/0.log" Oct 02 13:21:21 crc kubenswrapper[4766]: I1002 13:21:21.894687 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-mn5ks_a2658564-9624-44ee-b9ce-1579493d044f/manager/0.log" Oct 02 13:21:21 crc kubenswrapper[4766]: I1002 13:21:21.978650 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5f7d749dc7-n4nqz_7d01c25f-6e83-4e83-8193-203e990ffd70/manager/0.log" Oct 02 13:21:40 crc kubenswrapper[4766]: I1002 13:21:40.080246 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-rlrhw_fdd489b9-775f-4a9a-b3de-8ac4d8fcf8fe/control-plane-machine-set-operator/0.log" Oct 02 13:21:40 crc kubenswrapper[4766]: I1002 13:21:40.189458 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-zkc9n_c30c3a5c-a29e-48a7-b446-b68f9cce2742/kube-rbac-proxy/0.log" Oct 02 13:21:40 crc kubenswrapper[4766]: I1002 13:21:40.256217 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-zkc9n_c30c3a5c-a29e-48a7-b446-b68f9cce2742/machine-api-operator/0.log" Oct 02 13:21:53 crc kubenswrapper[4766]: I1002 13:21:53.622444 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-7d4cc89fcb-krw9f_e6b26aea-d5dc-4599-baee-0d3046b6f822/cert-manager-controller/0.log" Oct 02 13:21:54 crc kubenswrapper[4766]: I1002 13:21:54.294221 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-d969966f-ggcgs_64245672-eb6a-4b99-9550-ac59d359dddf/cert-manager-webhook/0.log" Oct 02 13:21:54 crc kubenswrapper[4766]: I1002 13:21:54.338851 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7d9f95dbf-rfgbm_c13af787-6251-4bbe-88b2-e47927aabd14/cert-manager-cainjector/0.log" Oct 02 13:22:06 crc kubenswrapper[4766]: I1002 13:22:06.870148 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-q5bx2_87c86148-6c6d-48a2-bd6c-4004f6d782e8/nmstate-console-plugin/0.log" Oct 02 13:22:07 crc kubenswrapper[4766]: I1002 13:22:07.072379 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-dbwjf_5504968e-95f4-4664-bbd0-958eb8efb21e/kube-rbac-proxy/0.log" Oct 02 13:22:07 crc kubenswrapper[4766]: I1002 13:22:07.111936 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mg42f_5834ff77-38e7-40e3-a6a0-ce908f1343f0/nmstate-handler/0.log" Oct 02 13:22:07 crc kubenswrapper[4766]: I1002 13:22:07.137913 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-dbwjf_5504968e-95f4-4664-bbd0-958eb8efb21e/nmstate-metrics/0.log" Oct 02 13:22:07 crc kubenswrapper[4766]: I1002 13:22:07.293786 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-5hsll_dd8848bc-a5ea-40ae-9e27-eadbeef93edb/nmstate-operator/0.log" Oct 02 13:22:07 crc kubenswrapper[4766]: I1002 13:22:07.342072 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-v9vf7_841c7884-b0c7-45fc-9032-9d5c27cd862a/nmstate-webhook/0.log" Oct 02 13:22:21 crc kubenswrapper[4766]: I1002 13:22:21.491935 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-l68rk_b03add67-e52f-47b3-9936-b029f88e9f1b/kube-rbac-proxy/0.log" Oct 02 13:22:21 crc kubenswrapper[4766]: I1002 13:22:21.736695 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/cp-frr-files/0.log" Oct 02 13:22:21 crc kubenswrapper[4766]: I1002 13:22:21.888837 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/cp-frr-files/0.log" Oct 02 13:22:21 crc kubenswrapper[4766]: I1002 13:22:21.953634 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/cp-reloader/0.log" Oct 02 13:22:21 crc kubenswrapper[4766]: I1002 13:22:21.975934 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/cp-metrics/0.log" Oct 02 13:22:22 crc kubenswrapper[4766]: I1002 13:22:22.069531 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-l68rk_b03add67-e52f-47b3-9936-b029f88e9f1b/controller/0.log" Oct 02 13:22:22 crc kubenswrapper[4766]: I1002 13:22:22.101233 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/cp-reloader/0.log" Oct 02 13:22:22 crc kubenswrapper[4766]: I1002 13:22:22.276761 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/cp-reloader/0.log" Oct 02 13:22:22 crc kubenswrapper[4766]: I1002 13:22:22.284176 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/cp-metrics/0.log" Oct 02 13:22:22 crc kubenswrapper[4766]: I1002 13:22:22.311458 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/cp-frr-files/0.log" Oct 02 13:22:22 crc kubenswrapper[4766]: I1002 13:22:22.349840 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/cp-metrics/0.log" Oct 02 13:22:22 crc kubenswrapper[4766]: I1002 13:22:22.490288 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/cp-frr-files/0.log" Oct 02 13:22:22 crc kubenswrapper[4766]: I1002 13:22:22.526162 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/cp-metrics/0.log" Oct 02 13:22:22 crc kubenswrapper[4766]: I1002 13:22:22.531849 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/controller/0.log" Oct 02 13:22:22 crc kubenswrapper[4766]: I1002 13:22:22.535135 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/cp-reloader/0.log" Oct 02 13:22:22 crc kubenswrapper[4766]: I1002 13:22:22.687343 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/frr-metrics/0.log" Oct 02 13:22:22 crc kubenswrapper[4766]: I1002 13:22:22.688889 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/kube-rbac-proxy/0.log" Oct 02 13:22:22 crc kubenswrapper[4766]: I1002 13:22:22.740317 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/kube-rbac-proxy-frr/0.log" Oct 02 13:22:22 crc kubenswrapper[4766]: I1002 13:22:22.911480 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/reloader/0.log" Oct 02 13:22:22 crc kubenswrapper[4766]: I1002 13:22:22.972143 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-52rmm_9402a3a2-7e5c-4d01-bd76-27ac148ca1cb/frr-k8s-webhook-server/0.log" Oct 02 13:22:23 crc kubenswrapper[4766]: I1002 13:22:23.234671 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-79dc498c69-l856r_a212e302-c57c-4d73-a1b3-94e720468352/manager/0.log" Oct 02 13:22:23 crc kubenswrapper[4766]: I1002 13:22:23.349603 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-596877795c-zts7d_636126b8-3906-49f5-8434-c324ab667177/webhook-server/0.log" Oct 02 13:22:23 crc kubenswrapper[4766]: I1002 13:22:23.463420 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qzdkz_18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd/kube-rbac-proxy/0.log" Oct 02 13:22:24 crc kubenswrapper[4766]: I1002 13:22:24.433749 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:22:24 crc kubenswrapper[4766]: I1002 13:22:24.434181 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:22:24 crc kubenswrapper[4766]: I1002 13:22:24.730425 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qzdkz_18cbb20b-0ac7-4b62-86f8-7dbcf7ae2afd/speaker/0.log" Oct 02 13:22:26 crc kubenswrapper[4766]: I1002 13:22:26.251671 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-j49dz_4eadfcf0-faf8-455c-a0f7-f49298dffdee/frr/0.log" Oct 02 13:22:37 crc kubenswrapper[4766]: I1002 13:22:37.643394 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx_1afef370-5e0c-402e-972b-6375f5c7a86e/util/0.log" 
Oct 02 13:22:38 crc kubenswrapper[4766]: I1002 13:22:38.365396 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx_1afef370-5e0c-402e-972b-6375f5c7a86e/util/0.log" Oct 02 13:22:38 crc kubenswrapper[4766]: I1002 13:22:38.399608 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx_1afef370-5e0c-402e-972b-6375f5c7a86e/pull/0.log" Oct 02 13:22:38 crc kubenswrapper[4766]: I1002 13:22:38.430764 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx_1afef370-5e0c-402e-972b-6375f5c7a86e/pull/0.log" Oct 02 13:22:38 crc kubenswrapper[4766]: I1002 13:22:38.576689 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx_1afef370-5e0c-402e-972b-6375f5c7a86e/pull/0.log" Oct 02 13:22:38 crc kubenswrapper[4766]: I1002 13:22:38.599604 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx_1afef370-5e0c-402e-972b-6375f5c7a86e/util/0.log" Oct 02 13:22:38 crc kubenswrapper[4766]: I1002 13:22:38.605095 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_695e9552c02c72940c72621f824780f00ca58086c3badc308bf0a2eb69rxkwx_1afef370-5e0c-402e-972b-6375f5c7a86e/extract/0.log" Oct 02 13:22:38 crc kubenswrapper[4766]: I1002 13:22:38.755052 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45_148906ba-bbc3-498d-91e3-b542ebf88b0e/util/0.log" Oct 02 13:22:38 crc kubenswrapper[4766]: I1002 13:22:38.934482 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45_148906ba-bbc3-498d-91e3-b542ebf88b0e/pull/0.log" Oct 02 13:22:38 crc kubenswrapper[4766]: I1002 13:22:38.956130 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45_148906ba-bbc3-498d-91e3-b542ebf88b0e/util/0.log" Oct 02 13:22:38 crc kubenswrapper[4766]: I1002 13:22:38.992023 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45_148906ba-bbc3-498d-91e3-b542ebf88b0e/pull/0.log" Oct 02 13:22:39 crc kubenswrapper[4766]: I1002 13:22:39.128732 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45_148906ba-bbc3-498d-91e3-b542ebf88b0e/pull/0.log" Oct 02 13:22:39 crc kubenswrapper[4766]: I1002 13:22:39.134729 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45_148906ba-bbc3-498d-91e3-b542ebf88b0e/util/0.log" Oct 02 13:22:39 crc kubenswrapper[4766]: I1002 13:22:39.173460 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d22jb45_148906ba-bbc3-498d-91e3-b542ebf88b0e/extract/0.log" Oct 02 13:22:39 crc kubenswrapper[4766]: I1002 13:22:39.316728 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4_403ad43d-bdf9-4c87-ad12-313410089de3/util/0.log" Oct 02 13:22:39 crc kubenswrapper[4766]: I1002 13:22:39.549713 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4_403ad43d-bdf9-4c87-ad12-313410089de3/util/0.log" Oct 02 13:22:39 crc kubenswrapper[4766]: I1002 13:22:39.550568 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4_403ad43d-bdf9-4c87-ad12-313410089de3/pull/0.log" Oct 02 13:22:39 crc kubenswrapper[4766]: I1002 13:22:39.566314 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4_403ad43d-bdf9-4c87-ad12-313410089de3/pull/0.log" Oct 02 13:22:39 crc kubenswrapper[4766]: I1002 13:22:39.768314 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4_403ad43d-bdf9-4c87-ad12-313410089de3/util/0.log" Oct 02 13:22:39 crc kubenswrapper[4766]: I1002 13:22:39.776659 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4_403ad43d-bdf9-4c87-ad12-313410089de3/extract/0.log" Oct 02 13:22:39 crc kubenswrapper[4766]: I1002 13:22:39.784118 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dvgqk4_403ad43d-bdf9-4c87-ad12-313410089de3/pull/0.log" Oct 02 13:22:39 crc kubenswrapper[4766]: I1002 13:22:39.931348 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wc2r7_699911dd-95da-451d-8ea1-731fe880bbfb/extract-utilities/0.log" Oct 02 13:22:40 crc kubenswrapper[4766]: I1002 13:22:40.138873 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wc2r7_699911dd-95da-451d-8ea1-731fe880bbfb/extract-content/0.log" Oct 02 13:22:40 crc kubenswrapper[4766]: I1002 13:22:40.144974 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wc2r7_699911dd-95da-451d-8ea1-731fe880bbfb/extract-utilities/0.log" Oct 02 13:22:40 crc kubenswrapper[4766]: I1002 13:22:40.145043 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wc2r7_699911dd-95da-451d-8ea1-731fe880bbfb/extract-content/0.log" Oct 02 13:22:40 crc kubenswrapper[4766]: I1002 13:22:40.304214 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wc2r7_699911dd-95da-451d-8ea1-731fe880bbfb/extract-content/0.log" Oct 02 13:22:40 crc kubenswrapper[4766]: I1002 13:22:40.352462 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wc2r7_699911dd-95da-451d-8ea1-731fe880bbfb/extract-utilities/0.log" Oct 02 13:22:40 crc kubenswrapper[4766]: I1002 13:22:40.589111 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4zwjb_9233c36a-a15b-4668-9da2-d7e2a778fa2e/extract-utilities/0.log" Oct 02 13:22:40 crc kubenswrapper[4766]: I1002 13:22:40.813789 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-4zwjb_9233c36a-a15b-4668-9da2-d7e2a778fa2e/extract-content/0.log" Oct 02 13:22:40 crc kubenswrapper[4766]: I1002 13:22:40.814043 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4zwjb_9233c36a-a15b-4668-9da2-d7e2a778fa2e/extract-content/0.log" Oct 02 13:22:40 crc kubenswrapper[4766]: I1002 13:22:40.833069 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4zwjb_9233c36a-a15b-4668-9da2-d7e2a778fa2e/extract-utilities/0.log" Oct 02 13:22:40 crc kubenswrapper[4766]: I1002 13:22:40.885781 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wc2r7_699911dd-95da-451d-8ea1-731fe880bbfb/registry-server/0.log" Oct 02 13:22:40 crc kubenswrapper[4766]: I1002 13:22:40.980979 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4zwjb_9233c36a-a15b-4668-9da2-d7e2a778fa2e/extract-utilities/0.log" Oct 02 13:22:41 crc kubenswrapper[4766]: I1002 13:22:41.088898 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4zwjb_9233c36a-a15b-4668-9da2-d7e2a778fa2e/extract-content/0.log" Oct 02 13:22:41 crc kubenswrapper[4766]: I1002 13:22:41.143008 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch_7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da/util/0.log" Oct 02 13:22:41 crc kubenswrapper[4766]: I1002 13:22:41.368461 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch_7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da/util/0.log" Oct 02 13:22:41 crc kubenswrapper[4766]: I1002 13:22:41.477536 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch_7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da/pull/0.log" Oct 02 13:22:41 crc kubenswrapper[4766]: I1002 13:22:41.485915 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch_7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da/pull/0.log" Oct 02 13:22:41 crc kubenswrapper[4766]: I1002 13:22:41.729987 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch_7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da/pull/0.log" Oct 02 13:22:41 crc kubenswrapper[4766]: I1002 13:22:41.734937 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch_7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da/util/0.log" Oct 02 13:22:41 crc kubenswrapper[4766]: I1002 13:22:41.801166 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835ccmzch_7ac93a28-9dac-4739-9c7a-f6f0e5f0b8da/extract/0.log" Oct 02 13:22:41 crc kubenswrapper[4766]: I1002 13:22:41.970378 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p827n_697f9f5a-2f67-4b88-8fab-f29d029c1643/extract-utilities/0.log" Oct 02 13:22:41 crc kubenswrapper[4766]: I1002 13:22:41.981049 4766 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-7wx9g_9677731c-12a8-4fa5-b5c1-ba1238a7f315/marketplace-operator/0.log" Oct 02 13:22:42 crc kubenswrapper[4766]: I1002 13:22:42.139086 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-4zwjb_9233c36a-a15b-4668-9da2-d7e2a778fa2e/registry-server/0.log" Oct 02 13:22:42 crc kubenswrapper[4766]: I1002 13:22:42.240364 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p827n_697f9f5a-2f67-4b88-8fab-f29d029c1643/extract-content/0.log" Oct 02 13:22:42 crc kubenswrapper[4766]: I1002 13:22:42.268610 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p827n_697f9f5a-2f67-4b88-8fab-f29d029c1643/extract-utilities/0.log" Oct 02 13:22:42 crc kubenswrapper[4766]: I1002 13:22:42.268613 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p827n_697f9f5a-2f67-4b88-8fab-f29d029c1643/extract-content/0.log" Oct 02 13:22:42 crc kubenswrapper[4766]: I1002 13:22:42.458060 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p827n_697f9f5a-2f67-4b88-8fab-f29d029c1643/extract-utilities/0.log" Oct 02 13:22:42 crc kubenswrapper[4766]: I1002 13:22:42.544814 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p827n_697f9f5a-2f67-4b88-8fab-f29d029c1643/extract-content/0.log" Oct 02 13:22:42 crc kubenswrapper[4766]: I1002 13:22:42.620221 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q8cpr_c98005ea-50d3-4b26-9049-2beb07771f21/extract-utilities/0.log" Oct 02 13:22:42 crc kubenswrapper[4766]: I1002 13:22:42.729303 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-p827n_697f9f5a-2f67-4b88-8fab-f29d029c1643/registry-server/0.log" Oct 02 13:22:42 crc kubenswrapper[4766]: I1002 13:22:42.757029 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q8cpr_c98005ea-50d3-4b26-9049-2beb07771f21/extract-content/0.log" Oct 02 13:22:42 crc kubenswrapper[4766]: I1002 13:22:42.785486 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q8cpr_c98005ea-50d3-4b26-9049-2beb07771f21/extract-utilities/0.log" Oct 02 13:22:42 crc kubenswrapper[4766]: I1002 13:22:42.805968 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q8cpr_c98005ea-50d3-4b26-9049-2beb07771f21/extract-content/0.log" Oct 02 13:22:42 crc kubenswrapper[4766]: I1002 13:22:42.988081 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q8cpr_c98005ea-50d3-4b26-9049-2beb07771f21/extract-content/0.log" Oct 02 13:22:43 crc kubenswrapper[4766]: I1002 13:22:43.000265 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q8cpr_c98005ea-50d3-4b26-9049-2beb07771f21/extract-utilities/0.log" Oct 02 13:22:43 crc kubenswrapper[4766]: I1002 13:22:43.939179 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-q8cpr_c98005ea-50d3-4b26-9049-2beb07771f21/registry-server/0.log" Oct 02 13:22:54 crc kubenswrapper[4766]: I1002 13:22:54.432337 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:22:54 crc kubenswrapper[4766]: I1002 13:22:54.432859 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:22:56 crc kubenswrapper[4766]: I1002 13:22:56.338716 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-vwl9k_453f7915-e705-47b3-9078-a7704846c9e0/prometheus-operator/0.log" Oct 02 13:22:56 crc kubenswrapper[4766]: I1002 13:22:56.519564 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-d9d9b9448-8ghqt_3c30e4de-fe7f-4f68-a633-bdf33112ef8e/prometheus-operator-admission-webhook/0.log" Oct 02 13:22:56 crc kubenswrapper[4766]: I1002 13:22:56.579819 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-d9d9b9448-n88t4_0ae4fe5d-375d-407c-9386-a99585c786ad/prometheus-operator-admission-webhook/0.log" Oct 02 13:22:56 crc kubenswrapper[4766]: I1002 13:22:56.688537 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-h8hd7_a50a36bc-6db8-4a5f-91c7-b01539ceaad9/operator/0.log" Oct 02 13:22:56 crc kubenswrapper[4766]: I1002 13:22:56.768222 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-2nb7v_8fd4ad3e-17b0-498d-8710-949d10cb68fd/perses-operator/0.log" Oct 02 13:23:17 crc kubenswrapper[4766]: E1002 13:23:17.272520 4766 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.200:51688->38.129.56.200:32845: write tcp 38.129.56.200:51688->38.129.56.200:32845: write: broken pipe Oct 02 13:23:22 crc kubenswrapper[4766]: E1002 13:23:22.648931 4766 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.200:57754->38.129.56.200:32845: write tcp 38.129.56.200:57754->38.129.56.200:32845: write: broken pipe Oct 02 13:23:24 crc kubenswrapper[4766]: I1002 13:23:24.431925 4766 patch_prober.go:28] interesting pod/machine-config-daemon-l99lx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 02 13:23:24 crc kubenswrapper[4766]: I1002 13:23:24.432230 4766 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 02 13:23:24 crc kubenswrapper[4766]: I1002 13:23:24.432276 4766 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" Oct 02 13:23:24 crc kubenswrapper[4766]: I1002 13:23:24.432792 4766 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6"} pod="openshift-machine-config-operator/machine-config-daemon-l99lx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 02 13:23:24 crc kubenswrapper[4766]: I1002 13:23:24.432844 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerName="machine-config-daemon" containerID="cri-o://ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" gracePeriod=600 Oct 02 13:23:24 crc kubenswrapper[4766]: E1002 13:23:24.566789 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:23:24 crc kubenswrapper[4766]: I1002 13:23:24.647036 4766 generic.go:334] "Generic (PLEG): container finished" podID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" exitCode=0 Oct 02 13:23:24 crc kubenswrapper[4766]: I1002 13:23:24.647099 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" event={"ID":"cd484f43-26b6-4e55-b872-7502e8d6e8c7","Type":"ContainerDied","Data":"ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6"} Oct 02 13:23:24 crc kubenswrapper[4766]: I1002 13:23:24.647156 4766 scope.go:117] "RemoveContainer" containerID="ebffd89870cf914fa65f0c122b373b894237020d282dbb532b184737bc6c17ba" Oct 02 13:23:24 crc kubenswrapper[4766]: I1002 13:23:24.648218 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:23:24 crc kubenswrapper[4766]: E1002 13:23:24.648775 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:23:35 crc kubenswrapper[4766]: I1002 13:23:35.881983 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:23:35 crc kubenswrapper[4766]: E1002 13:23:35.882678 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:23:46 crc kubenswrapper[4766]: I1002 13:23:46.881741 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:23:46 crc kubenswrapper[4766]: E1002 13:23:46.882523 4766 
Oct 02 13:23:46 crc kubenswrapper[4766]: I1002 13:23:46.881741 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6"
Oct 02 13:23:46 crc kubenswrapper[4766]: E1002 13:23:46.882523 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 13:24:00 crc kubenswrapper[4766]: I1002 13:24:00.882172 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6"
Oct 02 13:24:00 crc kubenswrapper[4766]: E1002 13:24:00.882977 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 13:24:14 crc kubenswrapper[4766]: I1002 13:24:14.881364 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6"
Oct 02 13:24:14 crc kubenswrapper[4766]: E1002 13:24:14.882266 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 13:24:25 crc kubenswrapper[4766]: I1002 13:24:25.888121 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6"
Oct 02 13:24:25 crc kubenswrapper[4766]: E1002 13:24:25.890779 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 13:24:38 crc kubenswrapper[4766]: I1002 13:24:38.886608 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6"
Oct 02 13:24:38 crc kubenswrapper[4766]: E1002 13:24:38.888908 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 13:24:42 crc kubenswrapper[4766]: I1002 13:24:42.902267 4766 scope.go:117] "RemoveContainer" containerID="5bc141a856f4f595682faccc2789c7ff05ed268428266f06e4d9ba47f66294ad"
Oct 02 13:24:42 crc kubenswrapper[4766]: I1002 13:24:42.929424 4766 scope.go:117] "RemoveContainer" containerID="cc8362feff2618262831769a37ba173901c1f43ac901092733ed8d99b6848493"
"RemoveContainer" containerID="91deaad039d5d0e41648c148c26d3e1b4fd092da1a0e78724885e6002941d816" Oct 02 13:24:43 crc kubenswrapper[4766]: I1002 13:24:43.019240 4766 scope.go:117] "RemoveContainer" containerID="838d300bb4daee09ed25344b659b92b90c86b9a5a6be2551d7f6bea4b2334c25" Oct 02 13:24:51 crc kubenswrapper[4766]: I1002 13:24:51.886765 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:24:51 crc kubenswrapper[4766]: E1002 13:24:51.887562 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:25:06 crc kubenswrapper[4766]: I1002 13:25:06.881283 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:25:06 crc kubenswrapper[4766]: E1002 13:25:06.882450 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:25:20 crc kubenswrapper[4766]: I1002 13:25:20.884426 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:25:20 crc kubenswrapper[4766]: E1002 13:25:20.885285 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:25:32 crc kubenswrapper[4766]: I1002 13:25:32.119281 4766 generic.go:334] "Generic (PLEG): container finished" podID="cb3e12a5-d177-4e17-9cb4-71e3efeb1c36" containerID="fdc7b82e259c85e4ab6f57117e8ba0086f719250fd8c7f488df7cc4e3a946f8e" exitCode=0 Oct 02 13:25:32 crc kubenswrapper[4766]: I1002 13:25:32.119488 4766 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7gs9r/must-gather-glrbt" event={"ID":"cb3e12a5-d177-4e17-9cb4-71e3efeb1c36","Type":"ContainerDied","Data":"fdc7b82e259c85e4ab6f57117e8ba0086f719250fd8c7f488df7cc4e3a946f8e"} Oct 02 13:25:32 crc kubenswrapper[4766]: I1002 13:25:32.120884 4766 scope.go:117] "RemoveContainer" containerID="fdc7b82e259c85e4ab6f57117e8ba0086f719250fd8c7f488df7cc4e3a946f8e" Oct 02 13:25:32 crc kubenswrapper[4766]: I1002 13:25:32.688711 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7gs9r_must-gather-glrbt_cb3e12a5-d177-4e17-9cb4-71e3efeb1c36/gather/0.log" Oct 02 13:25:33 crc kubenswrapper[4766]: I1002 13:25:33.882248 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:25:33 crc kubenswrapper[4766]: E1002 13:25:33.882558 4766 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:25:41 crc kubenswrapper[4766]: I1002 13:25:41.020417 4766 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7gs9r/must-gather-glrbt"] Oct 02 13:25:41 crc kubenswrapper[4766]: I1002 13:25:41.021702 4766 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7gs9r/must-gather-glrbt" podUID="cb3e12a5-d177-4e17-9cb4-71e3efeb1c36" containerName="copy" containerID="cri-o://4690fa4dd4d6f0b89a1f426dfe000902a254a54805de0be734be6c9db6d7584e" gracePeriod=2 Oct 02 13:25:41 crc kubenswrapper[4766]: I1002 13:25:41.056949 4766 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7gs9r/must-gather-glrbt"] Oct 02 13:25:41 crc kubenswrapper[4766]: I1002 13:25:41.247254 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7gs9r_must-gather-glrbt_cb3e12a5-d177-4e17-9cb4-71e3efeb1c36/copy/0.log" Oct 02 13:25:41 crc kubenswrapper[4766]: I1002 13:25:41.247760 4766 generic.go:334] "Generic (PLEG): container finished" podID="cb3e12a5-d177-4e17-9cb4-71e3efeb1c36" containerID="4690fa4dd4d6f0b89a1f426dfe000902a254a54805de0be734be6c9db6d7584e" exitCode=143 Oct 02 13:25:42 crc kubenswrapper[4766]: I1002 13:25:42.116055 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7gs9r_must-gather-glrbt_cb3e12a5-d177-4e17-9cb4-71e3efeb1c36/copy/0.log" Oct 02 13:25:42 crc kubenswrapper[4766]: I1002 13:25:42.118123 4766 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7gs9r/must-gather-glrbt" Oct 02 13:25:42 crc kubenswrapper[4766]: I1002 13:25:42.264440 4766 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7gs9r_must-gather-glrbt_cb3e12a5-d177-4e17-9cb4-71e3efeb1c36/copy/0.log" Oct 02 13:25:42 crc kubenswrapper[4766]: I1002 13:25:42.265280 4766 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7gs9r/must-gather-glrbt" Oct 02 13:25:42 crc kubenswrapper[4766]: I1002 13:25:42.265462 4766 scope.go:117] "RemoveContainer" containerID="4690fa4dd4d6f0b89a1f426dfe000902a254a54805de0be734be6c9db6d7584e" Oct 02 13:25:42 crc kubenswrapper[4766]: I1002 13:25:42.266868 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zxkf\" (UniqueName: \"kubernetes.io/projected/cb3e12a5-d177-4e17-9cb4-71e3efeb1c36-kube-api-access-6zxkf\") pod \"cb3e12a5-d177-4e17-9cb4-71e3efeb1c36\" (UID: \"cb3e12a5-d177-4e17-9cb4-71e3efeb1c36\") " Oct 02 13:25:42 crc kubenswrapper[4766]: I1002 13:25:42.267132 4766 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cb3e12a5-d177-4e17-9cb4-71e3efeb1c36-must-gather-output\") pod \"cb3e12a5-d177-4e17-9cb4-71e3efeb1c36\" (UID: \"cb3e12a5-d177-4e17-9cb4-71e3efeb1c36\") " Oct 02 13:25:42 crc kubenswrapper[4766]: I1002 13:25:42.273123 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb3e12a5-d177-4e17-9cb4-71e3efeb1c36-kube-api-access-6zxkf" (OuterVolumeSpecName: "kube-api-access-6zxkf") pod "cb3e12a5-d177-4e17-9cb4-71e3efeb1c36" (UID: "cb3e12a5-d177-4e17-9cb4-71e3efeb1c36"). InnerVolumeSpecName "kube-api-access-6zxkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 02 13:25:42 crc kubenswrapper[4766]: I1002 13:25:42.305716 4766 scope.go:117] "RemoveContainer" containerID="fdc7b82e259c85e4ab6f57117e8ba0086f719250fd8c7f488df7cc4e3a946f8e" Oct 02 13:25:42 crc kubenswrapper[4766]: I1002 13:25:42.370140 4766 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zxkf\" (UniqueName: \"kubernetes.io/projected/cb3e12a5-d177-4e17-9cb4-71e3efeb1c36-kube-api-access-6zxkf\") on node \"crc\" DevicePath \"\"" Oct 02 13:25:42 crc kubenswrapper[4766]: I1002 13:25:42.506227 4766 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb3e12a5-d177-4e17-9cb4-71e3efeb1c36-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "cb3e12a5-d177-4e17-9cb4-71e3efeb1c36" (UID: "cb3e12a5-d177-4e17-9cb4-71e3efeb1c36"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 02 13:25:42 crc kubenswrapper[4766]: I1002 13:25:42.575912 4766 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cb3e12a5-d177-4e17-9cb4-71e3efeb1c36-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 02 13:25:43 crc kubenswrapper[4766]: I1002 13:25:43.905709 4766 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb3e12a5-d177-4e17-9cb4-71e3efeb1c36" path="/var/lib/kubelet/pods/cb3e12a5-d177-4e17-9cb4-71e3efeb1c36/volumes" Oct 02 13:25:45 crc kubenswrapper[4766]: I1002 13:25:45.889917 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:25:45 crc kubenswrapper[4766]: E1002 13:25:45.891687 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:25:57 crc kubenswrapper[4766]: I1002 13:25:57.881948 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:25:57 crc kubenswrapper[4766]: E1002 13:25:57.882895 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:26:11 crc kubenswrapper[4766]: I1002 13:26:11.881893 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:26:11 crc kubenswrapper[4766]: E1002 13:26:11.882922 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:26:22 crc kubenswrapper[4766]: I1002 13:26:22.882523 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:26:22 crc kubenswrapper[4766]: E1002 13:26:22.883160 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:26:34 crc kubenswrapper[4766]: I1002 13:26:34.881770 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:26:34 crc kubenswrapper[4766]: E1002 13:26:34.883464 4766 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:26:45 crc kubenswrapper[4766]: I1002 13:26:45.894938 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:26:45 crc kubenswrapper[4766]: E1002 13:26:45.895802 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:26:58 crc kubenswrapper[4766]: I1002 13:26:58.881869 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:26:58 crc kubenswrapper[4766]: E1002 13:26:58.882546 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:27:09 crc kubenswrapper[4766]: I1002 13:27:09.881970 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:27:09 crc kubenswrapper[4766]: E1002 13:27:09.883265 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:27:21 crc kubenswrapper[4766]: I1002 13:27:21.886405 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:27:21 crc kubenswrapper[4766]: E1002 13:27:21.887136 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7" Oct 02 13:27:32 crc kubenswrapper[4766]: I1002 13:27:32.882098 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6" Oct 02 13:27:32 crc kubenswrapper[4766]: E1002 13:27:32.883196 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Oct 02 13:27:32 crc kubenswrapper[4766]: E1002 13:27:32.883196 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 13:27:43 crc kubenswrapper[4766]: I1002 13:27:43.216636 4766 scope.go:117] "RemoveContainer" containerID="23257bc4d761e7f87acc3381a5be41f8676fba1e63044fffa54703d02f3f38ec"
Oct 02 13:27:43 crc kubenswrapper[4766]: I1002 13:27:43.881136 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6"
Oct 02 13:27:43 crc kubenswrapper[4766]: E1002 13:27:43.881452 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
Oct 02 13:27:57 crc kubenswrapper[4766]: I1002 13:27:57.881492 4766 scope.go:117] "RemoveContainer" containerID="ea194802f0a48eec55feb8689c3c9b47099e60f9da65d86474b550ba7b95dfc6"
Oct 02 13:27:57 crc kubenswrapper[4766]: E1002 13:27:57.882449 4766 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-l99lx_openshift-machine-config-operator(cd484f43-26b6-4e55-b872-7502e8d6e8c7)\"" pod="openshift-machine-config-operator/machine-config-daemon-l99lx" podUID="cd484f43-26b6-4e55-b872-7502e8d6e8c7"
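From here to the end of the capture, the log is the same CrashLoopBackOff pair (RemoveContainer attempt, then "Error syncing pod, skipping") repeating every 10-15 seconds for machine-config-daemon-l99lx. A hypothetical triage helper, not part of any OpenShift tooling, that counts such entries per pod in a saved journal file ("kubelet.log" is an example filename):

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	// Match CrashLoopBackOff errors and capture the trailing pod="ns/name"
	// attribute, as seen in the kubenswrapper entries above.
	re := regexp.MustCompile(`CrashLoopBackOff.*pod="([^"]+)"`)
	f, err := os.Open("kubelet.log")
	if err != nil {
		fmt.Println(err)
		return
	}
	defer f.Close()
	counts := map[string]int{}
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		if m := re.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}
	for pod, n := range counts {
		fmt.Printf("%6d %s\n", n, pod)
	}
}

On this capture it would report a single dominant entry for openshift-machine-config-operator/machine-config-daemon-l99lx, which is the quickest confirmation that one pod accounts for the repeated errors.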